[Binary artifact — tar archive `var/home/core/zuul-output/` containing `logs/kubelet.log.gz` (a gzip-compressed kubelet log); the compressed payload is not recoverable as text.]
̳`yq7ZCRzWgȏW7ﮞfмW#_Ƀ~O{on H%ė IRD#( g⣙cܵfS) ϏήvZ߫n:~}˨(oUnLKҜZô_Wǫ?ɰ\1a4`bύtJRz7D{4~>,\xf.^m/0 ;<͈嵩UJhRB*%-V)UJhZV)UJ85ũݛMwoMܽ7q& pM_LQhHO7'ѵRlG]u!JâF+h_a.[T~ r 4 !@H+ !+N%m @5~SoMqa``?p>Ig9As*!kM1L#!9S)aN#VD=Oe# }29kAiCs ,@KI 50Ryo@b<L\ gWhS͜=S⻴ HNǟf[σ0%: fYm=X~Þoi2Mwo{Ya[zt?qH6riQx:r uF3Wmؚ`dz{8ɎNl[}4gbK-IwmxxwA=/Eqso>>>>?N];:m%O֞zfxERٌmIV f14;o,3O5t) 09et1]#@U8QAm2YeyTr; woWqɁ8-Xld:~G.kN/v/:3SFcrc|^ņ?h*"|n͠?pvx;kFFia!k$QӜ-앢&r6p?L@*E!lY,$7 )^k4EZt3*%wōIQd%]>,=Y!&n8?z;_ګ>d0`)JiSP;@UTkqe6A<uHU>cp01 qI#ump)S1&ejό=3VD)UOZl?:LߗA;{ԓO]~~  F n' 媤F:"{.xAW5Qgd/'FrHmj ^6\aY*7R\: "i hP]66Y'+ߟ m7w"g$%GKϳw )H$$Of͠%p*ƣWd0),r?}4 'ꪴ3A$⣿|{27ȧJ(i0d3>&P][\eƭxrF$f"U2tW/.gp뷟Ϋ-moo\wNK6#ڳ٦}< 0KsY7g7KmY<[0AQ ftS۰RB [쮨٢Rq9-2-0Ok̺E膸>q`8rPD|)k$S4#ܒ.ee 4:xvVo?~4 qiL`0o2j XE1x9GWٻ洎lWver/u*yHp'Ǩgӝc"Í&5fJk /*$Z`7J/O>ׁ`X6"GҒ0UF˃њ e"xiL`yXV(C0!B*Caj3jX$"﵌FMFScfΎЭm~ GK9t!fJd}{l6\g_z{HIZ&cAi3!4yݛM6ptrkZRԻ$ 5n mՙѵd=l41s}9f-/<]ƣіftȑ_Գkaohr`ƮQO4gß=Yn.lEa|BVNNygRH9L_|T9bGbq뚏-ǹ^]q]O_z(Ϳh5LsSW^ [jE} Dy=3XVn5zL,y"D1IĎˈ@+9$(1DcBRpn5ܑ t/FC*:-{e`8B?wҷn`API!2g!@I-Ez[-')M8oӘG"h@Qb ;A (Utdl8qm6;[*]6渢li#KEI {g@Pbg- uقGߒZA\')SޞzH<'xN:/Yˎkܡ鋫϶lX\ڿ9Ye£~la Ir .'R=g6Zt_XkE|mBN`E!:T"izOy@oGg֫R2;y)a :YW&|.r^+k)L@yM>jb w;JĨ6x{Q>$g.'.AvjӶ)QL}%uu3nƙZ߿!vQi flq*+Ȓy Q2QuV'i\3Ѕ SJrZD+b) +n[!po0-a:,Sٺjk Hx/1/=ѧ+2㦻3 @Е$gHZ0-|]1bȏ qiQ^}4f!BƽVHb)1`/ T8n "ZH&1)iFNM^z[(F2Fi,eGj`EPKdύ loM_RQ9u\U*S AK0j /ʛEy;Buip65->|YMߴ1':]BKkmoźFaS+q{A 22/W3ʄF ZVO_/o>| ?Z\$wIW~~L߻S&4DW݃ 3HӏLTciuBӾW=d]P؛jdnZ=답H;^DcHL:C9C'\>`HZ9j%m K__V?&@[.4 }<,lH4oWI!w%0n0 O^]VZ=%Lς-ڊҴN.g jA)({)U.m˦N9KջK5zkea$ 3H%RZx}2L ܷ"4L;VpyNz0_o%Ձk] |Do7ӷO܋#OHjtY/QSB uH:ci7eJ IM P]QQ87PNbXXܕsu_EոS\x} RYKe/{!ZT7P`ypxrYG*03DmʥMV yIt$ хT9☕ ZxI8"E.qVq ׉ r J/#A194_ѱc/HǴ (QHLd$&E",iPx%LR#1s:C6lM+V4퐚vvX,ϬٶJwT:(*ogA´k2P!@5hQkcM0&KJ>ظLȴ$HO==ߝ#h*߸! 
`k D;R qGj+Y@SJx->c}4tXt\w7O4Y]_Cu_OΥP FS4\#]Va+]i۵y?c8;0u빷hn- BC\(wp)B@ƌƁZ1b") NV!yBÇ͕u6g 6t3_/*h7mDGLي36k$rx35C:D{Z7x0 5nPFWW HhCt>5 m5IQkM2ɥp!zuL|7v;cčo՜lYr^)*̂-+zS$J&_dUA3nuZOa_6$Ղ'7ӱn[w_pne^nݐw5Y(Qk/FEC7#+VQgVdy1m]W\i_RsAx0#:qI6oԓTjZ)Y-bmZ2ګwJ_=tY] 3)>p:xĽG;Yxuxl8isÍt&wO݃Rw~vwuOwycMySVʤՒ#$80_|w_4h uDtRSb6ivs_l5)'SL'FATm,sM}Lрx ^uҫ"xkB-07Rk Cmӵ~|㯪G,ǻXH TS~9{L/M^GÙ J=ЂȞswUl}99l`MS^:Mkg.vw9'sT,b4wGxS Y`5,05HlˍYEѼ$&/ X""hER")-4Ԟb%i[$ oER")mHJ[$-ҚLLF7# h+&a/f0BAeq-" `BK 1*`Yw(ZH4x!&T)VKULcfZFַ|꾥~oSIOt/5 of[U(kc^+]$xoYߦVW&9.j 03qʺsUF[|W ɮ"i}ZA> 2xl4)١XhE|aBN`&wOuK̎C:c{sU籘dj?w?y@;_r'v|r^4_Zʄ2a^!X*1j 3PjgRu$9|;hN4Ov} *uaP(ESޠ`pnp x1ՇTG~ t2,%75 Ωէ8nvVrb;~7L@қ׉ O߭'|vӉ+LJXht.aU'Pag5ѕE2L ghrog`/Ovk˫tV2@֒^Cց+xX?:?r8 C3J1n6ܸO;>w"Fy}f ÷<5CMEbDi.SRsiʮ7预*[խY "¸3LcSoF 5 K?{չgLb0b| ?U36~ ߇:Uǜ؁FwCtJkl4!&FDjBe.!8XIb*a6J!^ؠLaL i{(ZzUG?Fp+r,TU+uXrELs6?cPVu<Z'Q),|VĜ)ϷB 5i`yYJaml"{)',e.QgqFA(K(pRJSJ$Z7RGC ^l4E@F[ :دBs4f6X'`NQ43QW'4c4p4qߵ{Hӻo"k^= ʀȝ[ G> FlQBB,$#S0) ֈm{V+)X'^V#'MiE/ -b##4DMnsU䪀L Hm׌y$MT8 .Lvk|onL>~K4Q9u\KU*S A%OctZ@w̷̛ykף)6_RٳYNطs )>*WPUq1O J]FGWkR"nk*knp=LpR,̦@FO.R DxYFy`n*lSAzipf'='4Y}UX28չpտzyt D"%T(zb`,E+RZ3aN\PƐ+ 0E.(qCvYI(J1,,VJ 0w@Y.rֳhqDI l.eII63GyՋ5De3#NO:7tR9Y>SL#B \'5*")mnu`'~=yh-R܀P>RL@bh0TcL![2f#gd\mDfYdneNekttaŨ΋'jP^owH,U (LQDB:`^eO ) V"*O6wrX+M"88fL r+.#ȁ198˖Z{B<FHHubKFbb$-A% 90 H%ZNk9]rIs"c]pb\WEE0 Zg ߧ|_}YM11?:7%ܛN??/nlƿD+Kn#Xv]s3 [8޼ކZqx6τp EI@(Ko@ 7uL.,CUDEN@ġ poMGR,o[iTj {(2Tn\睷եRw;JAۼYn Q\ sxA75Mibmp"f*֟u`7vn7)){G/=TiyB TJ䲓WZA]\%*%m3Wcfr+Q읱{;:5F6_L̒trی`ct7# A/4CKP(nq]\{KL: = sB aU:$͋pca(W? 
<Oj9,o_-dty ,VY[]DӉ2VqܮrUPNWZڂ- m|,R6EOKIRJJKIT~^ʳR@X!qJg@0dU"SWZ&]\%*y+#OH\;稌*+NƺjJTъg(PHW@0%=I\%j%?vqԺWP\iI>%q%Ŋr a"LJTV\=Oq% _+W~ɥiSݿ~ǏE@& V3| G%x.sͼf 58ŏsVQHPPLUC HIQdyuҽgvBl>k,KL{3'RMLTUր6 oӞr= M=r(y|3^P]RgF)¦s1LTE ѕK_zEopybU$W G%}w1=TM>%bӏA^Py$x'OS/nd<)VԔ1тFQ4`pH;^'lޕ]]hbFWIn9xLMr8mR/lz*R>8Fe-ߴ|s|JO)m eygz"1a XnIHb-hhnQ`?Z_w˙ 8e̖1ǘsuˡKe=3PiɖT{P;I#A*5|e ?L6K<]7 QcRx+mZ̷]DBW!ft!+&f ÎI'dޡYmv7f GWk[3;&FoLPj^< ͘1Em*<{vŋu\{'49z5[|J@xZoP5n8y3d8}|yՀQyK#rZ#\F1DT 7Sɤ6E$bּܙ909&| eqMG/ȜBnCWړ݅s<;rSϭ7 mT&4})3²P0vY4a#X%Y@SJxx?{WHP/ۍ٬n .0/.OɒKRR$봨#m6\LdFᴫ9er*7 H, ~tu bͶ" !- `dq AYq/s:_'U)G-}ꕦ2Ke|Ί*9<m)QD"H GF)qJ1D\rw\2MF!PE=\8=$!JʘT&HgSG'?LIEx*),FnO!jfIC-|FXqv 0nrs;טf?w }|VUgH)BSIūD0KW=\..[0jŅoF꠬V}׌/0hO:]}>qܤ? XA HD7`ku! :S-#(L|QOV D' Je)#D mTrگAZe0f'%e j1WY-3ln|]}<}yt'W8y&0fMZj(TX.~/s 'VXs;e-ecLZ=WSA@R*cMBr, ʵsɉh4 cPF0$ E(Ebh18o3 RHF%S69k5IXFٳL7#d~x nvo,jw<kr01LYv4M1gF6cvd:l\ҝك+Dn%mkmBP>[$1dyaF0/qj>mܹn['3?.e=yav~ [ݝ2~ HD>{}`'|_rћh-X%||tz:7ĿmGa  DһOޢ&{?">7.hjǷf rq\,۲Z)Ikml/) ɞ1#tyo'+gA^,fdfdBh7%Jl-߲-o(s:Qe դJH6.g2*,jRoyn& /$|97JZ@ EcH}HfZ)?4-{J[mgnTBTSekl?:>R+ZT,yIcl+h.Ro8m%3͝\U>kQfq[^pֹ5+?uyA[9RA0LJa Oa"mr1M>8EW$g+S{55vo9&ELCrmwJe^{i0$ݕ`Z&4XZ4Fh>W7T5"X"mZDū$LUŠhdíp*EˁELRZMˬ!0$&CRX W s]kO }2w:O3gL͖C!3%—ta;7j]ƘcDӶ1=K࡙%@ JuMb5Ihx' ^F/AVwa[jVd[Ic;֒mVZ!Յ UMCkfFfڼmPS+ AMp7.И4DkZV8  C$$R{^u9^sfdv,3jax"!"XXy+`$Sa[1r pMjyK.^ YM _cϐe `ALū«B1N\z)^m>t@lL8*FrfIs;΢z>zQ"9YNh!^J%IAV83<&"ER Q'/n4y؃r^6Wqݹܘ}.Η',8煉NV} NT(:WY <%o4B"ۢ>4n`CVje;a .yj\ȾQ :q4f]:f~< 6ND 5R4mI殛sScw~gIcE?̽Uy7wi~A +q`iOXhA;5\`Z=)܆տXr%I`~xvp;>'( L%DDBqI)/ dhϝ'[^'4Uec(tTP$HBS(t"81-f4 [hF\v>Y #v'; !oWޒ\KÛ[vdBQۡ^x 9!L;S(Do3]V,[W^jBH-,n:$Ns 6X[J؄Jf,Fn͸\Dg:=EtpSSv!ontm|?~{3kIܰ dBtB@dbt2 I4k0a(3:2$j@ڲk!60]jK%8$T1TBsh02W.Fn톃`\^.ڴnZU#* wZP@A4p=GH,D%L{^.$G{7*kK8;풪2Vel%\SB4NM򠀥,H.84)*h61P* 9%dlM*iUN)iC #Ɠx7UQ;mN\wǪ{Ĭ` +C53wf&WӮu5JFjqX\tH]!XgU&W𮨫L"׮2Uu hC*|RJ]!L-zt+!Z+䶟;'Apï־\F. ? 
[\EUJs>X>}?TFUoS1{r1jt3XMޣ󏹀>~K~q|9_xx.@$SBe80(3Y`jDzM'_y_OmW4|v;޵Ll63kOC`Tk]VU * ِ=w?9Xn6>M[oExiPn' wώs.9ݓ~t{f;$< Ԏ;&1a?mg̱Yq=-_4ߎ/Nq0J u&:k,ٻ޶$WlI}&`b1諤X"il߷(ZHjޤD9<<]]UuU}ootx1k:5GKmlZOkҿ bbBiֱ+j٣K׃>៕1&)ώYi(5<ops7psn x (ѓ=_J'Yy&`Luz\vuvKQ?4Z%!Zre`Zak e䣷XAKr|{wi elӛ~m.́\v'd0z|H!P ROr~3PI|2xLZ4y-ROL\~Տ;u9 R ĭdJ9ìǝQ1r 4eˀ1ylX'H{-i=0V+7̑x~o5#eVͱ#g<4 '_d2^I6jVԓ' h$VK rpG~K;vv"3~I!Hx3bJ: .lcp\҈U_^TP7Fp<'4㿽 ԃmjDsQ'r?|Q&ƕfcR%1%\}QSrrT~"VE\BGe=y$XJk_|VZxvX ߲?P`MAA΄8%"'UӉkN*HDy\5p8w<< juYN|*F)E@#+ksxH 9S)aN{jS}29kAiCs ,@UURR2sC7̦=>eLc:3#s;+^ĪZM-g9{/ #?}bjenfta<t}7Dy뚔͚mxfq`gGv=Ng!nIȵM{ms;f!t}[yr7]WvVw]6c wOF\`n-xWٮԥ3 ̲[nUl#Dl&E|yRI$,H)U42c)DYrQ%qNשׁԨ%>ŊB>sc>4Ba|}\x9i\/Px TD0*Bu<V+M>%`sY"M;eLYKI(3f"MNi͹ J8\T9PWyWg ~gu\ƒdWySWr_<yJkZm}1^9T 6jJk3MM'T ` 3A@g0zl-8|KsM[>I%I0pQt`*&dVǐLV2&iR V$l-_oi[?\vb[ )KՃ9d?OyY)TUe7&rbDRFk g)DJq!;tN,ܘ=0{ܢO >Cu/`1"Ku{wb'T`+jNũXiHh+x!=!s *N\q9;sU+R.l\s%UN\ڜBq S1WEZqHih$S )&We7v_nX|78PSPޠp&K<=e DWi"]%2Ub%:ċŴ+2FU~l2A.2ņ=?.C\f5L%,_~9^!'"LJ8~Z֘'Y9 /Kd.NyImўj9zc\!!{B*4fʞ8pWSRRJ& -SF1WE\8#"9zsU-͕PzBlN\qĀQZK"%o5+ \S긊~:檈+쩘"=#+Ҷ\Y+ay± ]Cvu[%NOg?82t|m{s|^R9!!KM'  ,Rl]Z#)s&*Kul"qTY0jJ1 M \*H(gPΠ&, Tʏ]V6"SrG+ZWn,zR,_kT~ʯQ5*Fר^4 L`qC,=N\}`*GI+);dK,;=&" QܪSF/uZtuRIUʄv]˝?f+MjftYit}9)J_\-,n0kFPΘT~4NF_{0Mݴ2{4y?v-Z mLWv?˦ pBP`.4+DQ NyXRayF+8j1QH(@&UpZI4%` w2YNpsYzW>by)'ba'^w7[޽@꾛ks6 hkd+X/=W(NTԻM!-kl{(^d|7VIw̏Xc2&FWplLB{Jgs ʻ,-E7\zXNRn.J#.`i۳a,jX'p*;{ۊ?X8ە|<)Wt"ɯO$h`szTʛtd0`)JiSP;@UTkqg6A<uHU] et1{Ot hzR( :f2Mt)2ږ8-zr[XM3h U{O%0"F;pKq?<\]?3f;ye Dc&FF-1V,Q"dcFgzJ_R})F2E6A6y  *k-K2)y^俧fY :Wꭚ8ĩx(xE-hYĕ$J(sPN99.K(Ƅ ^"[[rI䘵2I-!餹Ɇ@#P¡8HY[. IUHS9A9\a. U &ٖw-.63HCsIQhS*Dֶ!OcHs k%m)om6lktAmI; -p˩1Xҍg ꁰ>o"]ϰcA; IuIC]b@~8fS!7Xa&oL@_.b0MK}n>D|'/}۫8IntנcsqW f6n5ܼt ͻywO91_#ٲ7MoRo֭% ioo PAwtʷ򖿎 W2!r%Z4bmK$O v.rf>M1bO1xMCջ=+VS+}`ģ1-Yݽjl;e+]'ԋڰܯv6t?-n\:4+wHVAYg\Ap2{UmniF;(_11cZ,B}qNF; =l\ο*ܲH},]{/5w71y߱dNzxᾭv/^;uf]Nz-o <ns\PEۉRBl a$Fg,HcL*!&3v0%8! 
K)}Hnu%&(+0dDW[CX6:BIzRjK8mY퍀=9F'N^zdtȇ__﷙.G|5Vm]_H>Zގfן6Vr]igÛI) D2$IVvg2=l U!|$;4>.LrBId}rKH=Ŭc#}IfI}mA[z?LO~T7;`fLwnnG}O_podߎ\yLfN6ع8]ݏ'O]ݎD-kdG'iqkEܝ]Ҏpw c :,tDB fIi8W]AV&tTߋ _]PEwNZy`Eq39Q+xĨ2ꈊ|M)_쎠\\{n]{6p?T1JrPCv:PǰHiYs ߠcJad@i`Kw(护x护Th2Ooo/{lO2? tpŽ%M!DG~ru^)A@ft}_~\ޛ uqz'(~21??__ٔx_ G꽶rON1Ń]It Y(肵©:`rc̀ZJ^[Ur({-I }-Rö׾V#9$@\\q%**8tsU47h ѝ"N$ v(Hk>Qx cljH`ņcb*Rј+W~dL gsq%c{w~:~R/0W׿zyTyvvuu}/?}87@,^O'XTФ9U0]ɂ s*9B C9 wMaȞBba K?L0IOM Tp,R>YY?O/ڽN-'p^%i\ R*^Ya Nl gl~6φ!-ՑV$dU^MRZ'wӶ YpjNz zU_3c۪a,+ (S4 IpCΒ3t:NjI^^ja9tPjQ*ŅYVʛl3p!k sֆ,7Gm̵YG$ -XCʗ V+!]Л 8Gz411q\ޝ_5w:\\ g4A@Mjle;/:ׁy97uyǎOVW5^̵q\S,FḖ.+=v2 pC`ZdAu>ȆઇV_>U>A>WC{kl p{L9JZg{9Pt>ށպsN`h Zhd cs2X y&22E*'\ZW!6I) hRr)ie[)E\ň&s#w;$NlS| 1˛ѷe/qo]MƳ$ygpMiδhR+gA6.[_]L/NrBmZ[]Bs?$ `e2X*Xjv>e;ύ\FWW+7t}[.E8r"iusO]uKsoXtZsk;7>M֡x[k퀘I`k L" 3pЙ3dp%]HC{@T8ڣ@]F9l;%l49JPc 5Pう#sϢ:霒"LfL-2o%08J̕nyFmj vzgǫ ci7Z֝l֭zQ}|Z= 4'Fo x᥼{jubSV ɸwiA* V i^(Hb|tH"{"KB'bx,hKf9h#)@rfr(SFxݩRC6qtu$ljfwA'5O t!Aa|֦D;q-f:Ҷ$ʜj$"c)oۑ4O3XRf7?dwEvoqX@ 'moߐ2Ԉl`q1Ik9FEJZ#7؈Xp)2.&#"`e 8ea~Y j{#K$;/WdYQKLY(j]|OŪxQ؇q2rWg|͉2й`uݤ.~xug<>^D}|Np?_jGq[ȅբy-'nZz`W&|<gp>ɛ7u$p)2IJ'9y홡 A4 V*T>d8~4 2O}F3.1*',}18U4 rrG즵**''|*ߍ#sCˍ,*BQT| Fa)!zŝ5) h.q)DN!$爑 '@C`RVǠR)lB*ٌYA BpXx/e&c)52:5ne]T]1JG˾Zˑq6V2@r)ζ.]Q!ӯFy`L5eE= C+7H OrAǘ ~SmzӬz!:(l#A= m}Wۿqt~iQc=٠!sقB] ql -']ciQ8>`~aĶykO̳ǍR;Q{ QqeԲ_FX;{~5(GN| ϾͰ%&wTm:N]ϓ)808]g'IW9LL{f z%7~*(%%%Lwľ@7}w3ߢ08I4OU`Ez6+]ފyOd",Ss^!2EmjtMrp'Q S H3̕7LEnPM{ cR7yg O?N^4Ҭj=Ǯ=NJm~W+ hT+_*k صfJgۼVBk55fIMghM⺎^^L[Y3P_-J}AIL)0T\faIk 򕮯[ c8TޞJ;l/6ToVCц˖S7:k;@kJ|-!}g}n1_KJWm(\XD8% ZhYkG]Z5VmVm^R/C+$0$]N.vS˩*T@]N/'y+~JBdGv8GFL\cIS8zZӴW?ӝ=)eJ9IURF l$n*{FsBM=Q,1,RppP;))\Z@.AT%gSGC/JJ'7+&f?e:"?Ս&mx5ƙy=`]/ȵt"QQfau?nDXiUhzJ*kU4Jh\#xGs?ؕCRhT$ -v,%ʍsh҆pbF$#298ʢ"SlOͪ332*}x Vп,]um3Z_uק):t\jJs0Pz޿t{KVXrDrjuW5\4y5/\?k{}kOO.զ9U{9|CHK c-~SsYo9yW)˫b&4|n-6?mΉFi_ ?bS (`+ Tl+pTS6HGjTT)F(R'B ?K愳D@2 V Rv[S"KzA4D 5N$#'<'dRJm*3΂@F.(5g&EKfLz-*Utbl<q6SyWg9W)nWt,KJ% [3ݿ#w f~8Wnz6 
;uAIk\eIS[ovM®1}~QߌUD=aJ;}|I3NP/j(Y~K!A]j-37_6N]_>E^]56U:-zߖk/Ti:y-\ٝ(i]gW_Nݽ_껮j\WnFqgޛ?M/o̮!K{ [=Û\::F6L#CV`a[&G~uݷjiAe'<~2޺xԮ\JqWB'a8 {<DZf9švVȩǠtd@@MR2Di @(-m0}+evXsqJvOF^kA8nNUe3&0ICIHXb@*JԇTT @BD .0 (͙DELIPGՎJ,YA/}8xκ7f|5eu/@v,pJ$ԯZxІ mකe?k76KmhK*;q8_5DF_cm/ Iwr1d9yktꭑ`^9.^*•h5OAP18~9g7R*nBfHk*=xj"1`ah *+Z<f"K/twru= azG+sSe<}Hb:z *~`4{NM`Z*6n00N3  E^BzS(ږyBvwGݦQisgUXZI>|Oainv]54B6{j3HS^} $G2DJļƙ8 C%=l_ɒL9 ^)IAWV:"8%:%<&`vGE2GE(w8;ҥgwzhO燠"Yo'puG4ʣɗNOɃxCEwBM[S;x,y:o 6GJb^o-mjZ9f :pO?Ǔadځ((*+l+=.7zslIԱEc_ϹOdoQwAe@k89\N츗% HĴHZ{I\$%@ґRxbbJy$^^%*F qu]Njtt_E?}xgua(*o(^z#0fΚeDyNv WvQDǂ|ʆQFs$+VƄ)C>e11R)lB*ٌYA BpXx/(&oͶ6#<<e;)P~O!)S"p`A(B hTpQI4#$@.Ch{Y[8I`7dF6^ jA'4c"^ JpcAbcWԦeQ{Z_mӰBir 3 4wp4[&%%!Z b"2em$_OZ @+sԺB$^ȌWH Bx40g3F2 b+"ˆ:Dqm8SpㄨVZ9 *hk{K E@8J)$)ZqTKUTThGK ͭ3Ek"P;,1u%"/:"TDyHTHxAY-DMR)jd1Jm\v\<<,6;vCQnx+Y?|T:wC23^NqI)\"SBC"U‚TesSτНKczZ*M/=mDj߷rW]/"DZJ=94H?y1/|2*D"θ{|rQ͝=ػ s]y =䣃#!d׻kQaچwA {˲*i0gF^Ιq޶7`rKU"UtP%&wmHW~6H$اbyX`goa{Z--R|2%⪤R̈u}(ܱNpSZpd+ LXDWI>\xNI rM̓h{W'GՆk=@:X&Awǚ z{t|(mn~zDOC,:}\OC_VwV;}}cu/)6ou1]1)tZy]Zzqx @?at}ګgiAb1XDY(,P$|Q]˖cr7ro6ϯi8_Oߎ(!kgB4I(*QKgP4DJ%2"ArXyO u,IrfS"s>dV:%4i[^yE1^F̚{>Gz7c&.Oy=ݚT a A8qүŚ9+hA*fԫHQ2n7 # ^'p?)@7o?8D66Re MJf׍Sd%M6FoOޑUi Dt]̻#(Pߛsuzy;;3cg8 H~\yN>F>H&8!j5'pRTNZ)a 'J'T8a A>w~6 ~_|RVA8b2g vv5-1iйW8Ϯ~W~Kג/8>Mu~;ry}헒>gT.YG f+)uީ܅R L-g;CR o7_0a馻=xc oqY0~t.}w}Q + E)#GRޟJֺ"RfR>c®>!;8\F ^6Z nOWN9)k6Ws=j9tV='+Į(O֝ \NUR> \ᎏ{|WWO_Os_~ЊyO~*.IӠd<c1 ~]8j}D|6RE4!jm"@R!1V$0^r_g֯3׫WY;ʡA?qB>Ogn"?{$~2lA,%NWu1Lğj0I' )jO5@?/A("5M,2fiO$YmrLa}mq3svHݿzYr=EVoUCmetIBu4uGxL`J$f,T ,x_-`=//owwo-d40wNt$TP eMI[گp ޛ=S+(zdp6F/bPPe0(4HaB^94aL U B^a&p3G?Cr)EuDf:I7%=Lޞ} j}9Xا e^df=oCg#Od*Z[Z6DmfZkZSwô߇ON]uz{r%V%쓷d)ŋڰb~"lkOyST޾IN{TbpTLk#2}9*',"#:[[ (j9: b)䷙<JvD:ؒQ") qfX3¾q;,#3}Oۯ\o;.?,Nfob@&o'KdR:W`1yvΥň9Ɋ`Ky28́VmpJ BVbF hȆZ%4GJDk+s#vs(;Em=2uAl˜1G<3Bt8lU+cP2krZ^5h rmc{2hC⯗#dfRsg\DBUKđ1#L&AumAEdơ b3UcDT#"㥎P9#13P2:b(&y*0%h[4$`5hJV+83OK]P\JI+m!XSk]4ʜY. ?ȸ8[c^g3/1.ˆ#.nE>"Ha2u5{DML2fCPw싇1q|ܺxOkF1z! 
n~|GU(o)Z_??ϱxX}8:;t?/:~[w#r24A$PLBE ba?}F290EZHtؖߺ1}pzQV:0i9r6k@>c:=qu+B.s*OM 銿/ߦLMpj$I^ DYi5qݵ,G 164fy<޿VXm)5YYh<_lGNL @us22u\"SWuz2uJoFO(S spBowG4*H|X'S CAmw%I9Ŭu@YsHMkD^vch&4|q *\*%⪻G9̻ѺO~_ck̬ǻGLJm5a#wLͩݴ[aK3}F'tŤi!C"I>,,[rԁ7WɡJA{ mkd!q,* HTRM(\A7RBFPe6 t1lz I**hU"] o9̜nbHҐ0\Ώeܬ5c]FŒ 7i֭'uş//=\Bk'Q1-9i%iQXULδtEcA$鈩d[b(#"/n<$1g IĺET$H`s9mF+Oȴ-RӘ/#fMa>#1Oۧ꼞nVW4"hqQOx/8x! %Y \؍%D2'D1/Fijw$$ oHu:Tf:Y=fc&[wK:p DG#(_ߛJsuzy;;3cg8 H~\,ȘhzOZ"iBMR"(sT+VfnG=o 5\۰E?Zα rt& %cD1DcS̅PRZ(մSw\rBceX^_${I>%OD$%?PHòDVSc2eLP8e쥖93 Bz!Nqk~sAnxi0k3L-d۶wݚv&vͱc1J(!g 7}ɗ C1n0LP+NBak|2 h"KṛA"pm䬭d$XZwmSW}[&ՙL*֜ {u OӚ`+u :PGP''2t} 2Q&$Q,eHM`Jn 2U)'%e 6Xe-}^xxNX?;ci&ؚ>g2}8rp?Oĉְ'-@KYX+Sa𢖠p୙ӾҾro@R*cMBfb D9kr\r"Zc@*M8 D[cGhc"1MP )K$)P5$"5[1 yG{[mTY{ͽΣ鬭I^nɱ4U&=:m3[6j޽`\麽C5ȍ&Y3b`!={ǍEקlmN6>[5-|j.vixηhLc3vIV?Yk_{b[5t+-)槉 bCV4\??%&M>\WhGO8]?ӆߏ,N>.<qp%{헏ǡ5h Gd(-q]$,KE \g%xg𯒎^7s e^T}xo >fNNOy6,oxo]*m7_'O(;Ww~+ O`:QӻI=v~M{OҙCߴ`ۥ}yAi|gPagqç c'p%Z\ZԥA㋃:ʗKgY4  /KL djl]B]2G#IeQP*"I`%3 e,jzK{׿K7rmݍ'20ݪWc_W%wh]?@Ah"qŽS1w ZY'[o`46 Zo7T$52Yj##Bښ"E 3J1&Uּk}^:ƴ5:69Vsk=hagrIjTKk3[bxNu& ˥/5B; P t0JrruR_rss˃- <I6\2\gTk0T')B:z0`$bsKs8tK~jmjg_^RgZ.8V sZ&h E0Xb@ EEU>GE1oG9^b~Euj]{l--w=w@lRGlXc8.%)A-đQl8^aD#A*͉9pnE^/]]e@Zu Օ&YWQWiL,T_(U+n(̦~ܽI9흑6#rԿ8PF\X39\?/?02' SJ)'0ya7Mw0U!{% /TjL=vʣyޒ ޤ6V1{mkdSZD.}(Y:kkU9 W7 ZEUFZjՕ) u9ƨ WBSUFkȥ+D D+7G]!`AC3\*2Zv 2Jkuvz^eFNXxt $ 1K!%$FP͎H`s߂a(4ByELLL"`, ȹ}l #TRA(2`yV~Ϗ:wY>\AD8r/e*/q+|WyR*-Ce,d…/0,( &y՜S :9*"4)#֑biV~8a~f'Jb/Fkw|ihGyvM=~2T92iW}/7Osiy|w Sj+ h"@+~'kq~|m%T%X3p@@M8P3Zqj{^GB/V1*=࿌r!F^RFR RWIup9aMQW-2Jhc_(M CKWWl_;N~Kp~u` 4{ B]V)-^(٥gPWUWO=3j!msyP;:;eě%\ä^YG4PWN~6VBE@-\L2 ziqů;B9G1{nW&:O? $KydIʺ#:΁%$1)EԀG`ds"u?EGzg@A(=kzuzn/2;.űsю-^>z$X/4\qnF-h !;qsEh[%GQ *TUkG%-j[osE͝VeK𪚕6/6k4QT&}oKEFK#65w/dyȰAvJdQ.Vk֔z.hCLD&ygLD1TwAl/軬.o%{[ki#xs-\vuwN1n\%\m6VchM5}cP" ߛT7{u8V h) ke* ^5sV9 .W0Lvz9bLsahTZr,fڹDƀTp1(k#Z-@Q4NScp@ :g(@,J@mrpkXlgX|A_%{χKj{ͽ2?^ߡYDFGxWd;ܡlV얮]xq'! 
r#QqFW l66w{ѣm&:.uu1i{hK6Mۮ3=<=tB4?Z od#v,}\ܐ +`& rٔHVK(/ō6Y")|A@ks%I5t2~|q`jp5JgY4  /KL djl]B]2G#IeQP*"I`=Yf(cQ`Z] AJS ĩv!ă2L4G [`jDzMj ΎyBnƻPxؓwl7j{|(zVovxN"A]Dz 4ީ;,Ύ7N0R7\* Km`uD(V[\h!xST)ƄJؚWpmpv Xۘ8ZG&GjN1|#L0'35^V[bxNu& ˥/0^hGy*YiZny"nyP|srK}iܲsa[9x R%l6Ld@Ψ#`NR(t`H5ĀiKpn9rZ okBPzK4%( <'F/EUjLA`4>@.ddH}H.b 1 %43řęqT9* Y@ml_3~+,:lJ$֮F[KD2Nhs#|ONƮk2K9n%Kk:Hq,vp埾^[#uTh#arQgj{8_![/m.,p.n7w!A]]MsM IQ߯zbREjJ=˜LLOu׋Lً(4p^h!GLZ$R =~I.\\NLԬǝ?Fh;\G^څ!ݛ7&vܑH%>o5nicڪzwM_?d!o07ͩYsoZy>t/CJoBm4Z(`w?h J hkY6hTqpQ& M4J+n5yeV U \:ZN.2-ɏ&,`Pc6&8 "8Њa\qI\Z9Y[}R U />xȎjX~au~a{'3|Pp! "\uR2IZξWS.C+P&pu ;6 1ɢ7S?TQֺ&hS.CʝnZ$<#euגȠNQniDpș7LCFU&}I39t,H'bbObٸkt9k>zC^bp8ο;eQ-)XDf ˘19<{Ty`Z3#`]E$s?/[ͅC6w㞮oA;qG7E<&w艘 : JHׂa22W9`q_fq1;xURx>]rUjzz%1t|/Pf,97ILW_h5z)?m>N5V "O?6(,$V^I^v_WȒIA3gc e=˰lr :2%99 /1{+Pa":+ʮ_Vr|Kx~**&;H>Fo9D2Iىte97䎥x}!AѴ;jlrSj0b"7ෝJw| iAèm;57'[n >Ccb(m5H;\ʒ3icu*N!M,{+iQUteYi"KLf*k[>#F430wY$NԕM%|F&V[U4ɲLfj]02pSg@8nf)f=9Q&rC=NC)+)nNZv6^-q*mRf|XHmVHsjF]Z_T奪K<"0=1:)f@2kci1eS6nXMx:炈XDQt!=T'-'&3IfAU\Ā^*/M',ٽ)]ݤI7F$X.U"RAíD@PVv@-qfy#Y9uVӒcqQVEbF$M̀ dDH&XR(k6䔳ωǐRo'$Sa58UeFJi|H)! ]sRگ݁kK:V41[Xq]BGE$wQ-ť}ocLܿP:79 {s:̔hFHR^o[ޝwŻq~<4|jvToGF,omODhrk1-x'QS%Rji=EJŻ_ampc's iWN2|\hJeP7 *8:* rK{-D}.+"ZtE9:C0QV{ 51l.&(O*!t\,S2FK])j׹u\xJ힓s2ߏ^-Uh1fxǶ0w+yH_"`; |FP7!4x>bCct_aYbwveVSNNFS%=gDh^%G"$oL1\dcei X q9%?baGfc4>W=Foxmw^}1?xZ;+RE3830뢷yN樭d` 8 `vpD(Ts:`ZAH`ye+E*`Y`"2gs>YZ A&_|D1FNwsJ?ΈûzMr; v_fYtDtYe^,x8xW!%1lıkͬGB2O`3lE΍gt.uгz:G(BɂaŒDhFtm!1)UF.sQUO<bFG)V`G/x+^IfӛbaN擛ʹ>׏!`8!^ &~Їi/*eoya1mkE`9+!"[CR;dӇoetD_J* 7bC[Eڠ"Iwa9U~] (@\{11WO^a?bNja_zeOf7@Av|_?_ 4D&2hT4&bLŎT$bϨOrW8NmSk?:ympǎbɑ d$34Zrb*ϘbC^$^xl Bȩ$U]f&Nhky@B$ H  .qEPhoijzgv#jXh`aF)0L׽ײ4 CQz,f *%rg\ҒgܒE` C@C$YUfz=["1ωU 4I "dDH&FZ 0&5! .Yw*?C/55T߇삨M <E- Vrk彜T',HɂV,|gHu }jpkv?aol0q5wK?? Gi"XVD8M>10%?/b*۷e fiHҽDV<8H<Ͻ;doeSWܑ3PK X#pf3ٜ۫@ GY[ àH20V 1og[z C]~)Zyx󟓐0Boڕ{݄HOWZ^EoG˜6tru&ɮ]m?{W׍l/3\K = 0@jXVE~o/jIE"] +K)Ί3:BFjWHۚ9>]M>x7-Fi3a.^?>Vhtզw=OزZ=],F,fnf~Sb`rӉٻ] lȔYr{thm0 O#¯|R,ok~2v1jT\Ѩ5^I]ro?ݏ?T~ϧ? 
{N:JdX"Q"1X!ͻCGvMMv}9ing]fgMɯЏ?F8^O]ېr ssK%W?鯜#řz0ibɄb ־Δ"{~El!Ē"P-4y#&vWVԩf F NohO鲋?ggT dd1* wKC7:e0+I?9lRdک@L%v'uT,|tȣB!ULiCx DWQLΦh!YX?=~0 94- [KI,a}QX6wɭ*I#W.4-ZKԩGiRI(9zS,Qi^Ұ:@ n| ׁyNzَLʴtdL\h\cZYq-]}dQSr L,ySN]:2l;+!(jN1N0[hvl`X ߲ߓai!ة>{R}(4G@?X|ݧ4w> /Ч+Y}v霾~3|0kKCdxV9C*XYWX+,{e!98)xedƋT,\ZW!6I)JI,T%&%VYXfRy qYh29rCPyMʄ8fcs_ [[<^{}7nz=;mhz ]ES]Nm1)yӃ˫MS:nc!W6M\V7]ZhGI r- tCO/O7W7Oit̚jh=z6Ɇl&ufWMvo^lܓp0XyoƼɚ'm{7W4iOl3)gKs:飛^ԕ]֮A]twP{Ũ8:SLFո gO2ݵG\i l%l49Jwܽw?P{IQɌiE~VƒwLKr!)`Ue )Kn}L.hPЙ"9+sae^M л\_IDo.!6hn!69=~m 4 2 AYAH#7)CL ,W,H6WBe뺋*{ꄜkD]l"Qu}!asr!Tв΁;8wB=BRQvuqs\)Tr9垘%2PY. a\i.- J ;mC7N Xe&Ǝ[>ʿv摼/YsKIrpQMb*.UhbHrI5KvsKܞ[^~rw<2_s,N~.%.CiZzomH REL, "c-n1W@&e F "!I0R(HRhύ R Rd܅YgZפD{9ěx5%M)?>$v1:]\osn[:*R+sNV\0cό_G#wR(]NĭlQF۝{M$Kn{y4;r{\uwkO :jCot5~Ӿ~Z`eŤq l;MSB#KR¨ 5l.A{ >@PIm ޺`d A /R@ fzEav#ԉr}*`6!)f$}*zR˴Q XM Lc\~a6WS+xs_%pSbYʋi]o~vs[+1/} SGHl$%q-cI$[vdIJ]"ٗHVh#骨:C,<)0.C1AcDш A$!x.4AW!"e)936u3jXهTR@p>dz{E YD|4,656{ fsb rSu&`Di5퀌!l%x)&'WCVިYQӇa[4[GGok%w=>j<QVE/*X@&cJa ZzgsR3#b(zWtnG&!*GYrAГBI21#R93)R[s.HFjGzJ5,2U;,|R,UZSь=_nX jt= D'@_{1b'OJ(QDRv'DLhOYr6#2,Y \]ĖV =8ؔ 5,u)`leĮ&z~VǶ6jwJ &*+1i{ZIUJq&xlb DOٺ>+m\9C&͐ ie Z$a&ő0>p>fY R]F8%ӋǶ(*#CĕK&(sPN99.K(Ƅ .El'Ič1ke4 ri[BIs# #P;Ʃ8x%z՞pq4-#|Hjd[\qQvJ/?:Β 2KZ2L!jg+k3ȽR UOVǶx'Y˒wmI eEF0 ,ۉMqC"D ,3C5E$ÇiNWwW=]]4{佧{q U_y?>#EL3h]~`#⁻N;́a񚴑Z(ڜ-rm|̓9@#r.XI7γ~$z{abkЀ U cJ#68R@i <T(>8vJwCTޥo}Lo/'!aK!3VGڥ3>a6Lr;~Wkt4Z&ĵ r BtZXL8P=d:FY>a*%tvte Ԭ{Mdž#Kal 3=o<EnSIcnV4ߗk$4:,Nɧh(%>|Wln hs߫{Z, |<qcHqH=|H=0J5wu׿6ӤlLڠ9zjwÚfd V+yoM9$|N~kc_M(1k7JJ|:TT>_@!(΍ k>._nuҸX>TSґuM\e(b4Oju<2 ɶ+(M{yte-jw۩uG&I4K0AuAoL;SQ`%thVt43WCDӝ`"Q0ԦP+L:v@/qӋ]I o{9vǪ=wPJ]3EsGWi2ht) x Dl}ET.zA`UJė7NjSN9gV&_ I؎\p~v;j%\jgAQuT|\ DK{7@t)g}y˰҃WUPwu*|PIQxm0%³$nz3ڶmyWeɬv5!V` tD0^uRPx;4E갘g@0ƥK %+;./FEa4-4l> pHY=~_D^gŏr")SCV+rFAm$_R%0)1uV5O"IEbĜ/Q'R}= Ej/e7'f4a1z'鬩fY> 03f1j6ut4n애սkX5R{-MiH0+fq6Rp ^*4]R1c(qc3G2zz57]}%?+à zpѦIMm<{b|F X*,Hz)GQgc$L{AL3Lco,.hܽQ Rwj8ZGQ+q+Pdo-豗YRscKv.vʙ5șd:FEo6q x{A';t=yC}+?*z$,eIk]2g1Q*\q+;Bȝ+;Ɍ=VMhi Dp9΂%cH1FJ+,T.`: M(rn:2,"AkZ 
DƶZ#~=:лupdPㅈ"'۸DZHUeS1Gd\N()\]EKܗD-Ed@zH v W6)=+(.a*.]ƣd] fR(uzܪ7[VXDkJz1Jۣg6\xT+׊܄@ LZNt˹r>P+K`RGSJmΌ6>)u6+K`2)np0I~unyʝm#V6M>Y)@FtoD.Ab$ju+%Q{R^"\%Jj/p%$*YW/kw+ `o*žU]DB"^\)0{W)iB{W\ymWJI;zp%x*GrJҝ?JT.jW{W"8V2,~uPGvn&EbKxX|\OC)RLi)rc,<%J&ׂ< uԑ zz;b3/Eb7l]7?Vdӻ]E=gd GVÌVS *)OjR|*mUlyg4^7o^ܽfjgrp`z$p0U YWg0l[^wClF֓>&-\yp3ƣvҽA-vl!R|&fyσMrvqHnzg<0)yfK*?@*)埠mo;C4zw~_N˶ 6\!r -%& ?}zG(&Ŕ>y#k?RkNՊDjpacDW-b##4DM͵Rl0q2J,[9ӛ|]9zu㳧,OXWD弶q.V)W#Tzaw‚DPx&PuVJBє+.6yb/ >54ߩVQm" E;?FhKMT_)fVNy1,[0l,TZriMnttZw̱bwrlDK6wULGQK2P04]d`DDH2Bj x$ )[Dc&y-#㱉hj4BZ"ZV[#g+!{4Z7mخZd&i8e1 S[I5R:(JH9jE:qDva& Q3' q@$*Ű#+!#a88s6qbxHCV_<(Ւ){C8Ɦ_WUWWY@`fAFvA 阨ZfM,WH5HVrDq=gWunL+ L<9?[]϶Liձvz'=_qFĹNPkT 3$ER۸I!iO~o{$hVdJF(]@JH1 N9ťh0 ,3r#c6r#c>]%f=X,|P,\P6Lq۳yX*7W4(x<8U/L0O5\H'h)R 58!טh52((1y* 3 j-$& fPIlHGхT"g;b/ٸc_yQ vcasEu 1A^R8M8Lj`*H + T^>r!1 4Cv.$(!UbD=r`TiLfCa6Yf9TCxd7'骀/]͊1h<]5.ޛ}mӧl(GL)ȒS*6)B˱H*%QZ04_ .x "<"$^n ` @!+QRb̡Hk.Ji/0c13\s^ 4ywC4u;nڔ=Ӳq>Y:530 3؁S;HR,v{'tBƍFl~酞y8! P`$X62QsAE"8Jp#(C)!Ÿ+KҫA7N||48FgF'/!5 aG l1s)Q}\x 1"2Jeo ٤~ j#9 m#NJQnnd؅Nd&})'nU}0~ߺBpߪo8n+N#wUUq6=/ޫZ8,OǣIì:ͧ>H?&7 #kʛcb8U1~WO׏^9u?^ s9.xzfIh=8ABOM@V5nLNi})(~_CRreI#\#4B+ۤ9w_v]UFQd](Y3jygg_Na4 SZ>~m^ %bƋR!_N}>Ԓʻ (ՅqaGBg*gYfo4.oݲ.]˦pWQ p>TqR;ɱގ3#݉ÄtA̙q 5R¤.] f7Ufeg|k'H+~K ~nzVHnss9awݪ*.Ѿ-/mtKž].܍sG} Dj.sr齯G쑮#ʯw KoTޖ6!Q)W[oyv p1}hnѶԦ,Pj;{oIo ݄1)U&!H›(hжmYŲ|۝n³&PUcCvz^'\_ԊFt; tt/{Z4;٤j9 PǢT\LQ h[w|T7WقԋhՁ|zʿ!~dN'#\pg/0/?yMrV0#*)p%3ZX&mDcrHƳ&O4JwIJ'+3a&:L{8`k U cJ#68% ^ wE%`}f]VoP΄[Z}zsf-ņbI1vICmo"(]K^}n9y?~u!RnRtwCyZSLǘ’WL#,8Z<ރG׺q_՟Oz\\ɳ2>5?PԻvٖUvA<'tKL(x g@ +<HpEPv˴s$'Kh.~TWGAt1Jk/-N;Pv Ce'RÔ[kV㫨PlQF EB Ẍ! gLZ3R.oL!,zZ& p)AҎY |}=a8z<SSg/M|(VX 4W{]l.h{]tkѣw.!J:*,ƈČԁ(!JQʥ<[́J/:|Pe6(z] H)Rɜ)^ɰsXDI$ۣzcPVF=,o s .WPThC^ㅵNs-h_q\j[utb)ÄeH{b$*%3A >6GQ 8SP~kX56 {96$K#rc 1$>5tyy|0UJ:4I Vb<*ywŊ1Y٭ vɗFBWT ݻ#SU܎9Q} gE-6.<2iatHChR+X;{&'d^gbu~|aIX,WŅ]%Xom5=~]$R+!:U9r;!M\),y {l!avrJ՜ƜE<ɛYel 1\0 |Mxtz֬-k(V|9_bqa!7#1~aH0K:2Aa21 VI?ct2st>oƽyn}U#.b*aNF̥/+Xc|isy ǠSN?FqFg'_}uɿ~oO0Q'{uKu+0. 
ӕ{}P߃[C׫wz 94 M[6; C-(}W|?\|]io_]iw_)4iYɨ*\Eς覚pwѲ'9_Dϸ?/} ! tlMɸ5x0:[RiCNv2bc ~w0,I *0wDr'5Y?3Ow@uuz* w`Ю#mE01U0]p[Q;!aY_!9TG0\Zh0*x"y% kE#,57f8suP -^ -@vp΄ I(SD,N{FW&FCxㄇ^Bm*]Պwp^Cjy|[||vS$U7Miwf9ȳ̏okz9x(E'^\^ V]gn `^3.+Pw7)qoɃv}08]yk0zd%*n+ũ7R].א H-bV-ef6ؽu[A&y^<5LK#n^SyanFK,=rx@w"]YsGs+ ?{>Q}zD!D;{$fQP5=U=9֟'sg9e_n<DN  [k\ Vv}NMgub؎l??o͍|FMW z{ p鬋R`2&j瀸)YSCNrKF.Ec!̙tlVՔj.1lUL\MtT\}[;cqNoRxό6" 5Quj.\ nFD`ɸB׶W rrهljhSO/kԻsfc}NiU]CځZ&nJ0'Z ĵ!Pdmxlk#Mj15@Fk$3%2|ԯXR+k@B۔R:Wpe`!vʒ&Sʰ9w*Tev^w1Y,c%0z+ ߨHN!yd! qBqI%g>#%"ma:X,Ϩ)1gJ7c%% ܼOw!1NT.zpNΩVRXWmm{"דl.1Шc;SYKҮct9FfcKpdqu4~kc0܋h3!Xμ5X'-FR5pcHFu ^w6Te9Es] vfV4(xԬ;vBi~)M(VɮD3[r7Wn!adyQ#bņiD)g_PE\ŐgK'Hm̭l%S@\ZlhN}zj +3\"5Pm2nVnݰ֞;C(d@%P;wVC*)e-XE103؎qBBѕU!] 2U*7IQAddRDeCІj5ؘΰ5Xm#dsm v%4Ϻz3xWl%WG( 4[QA `-AB@ r@!M=+Fc,d֜M@] suO`&SDa *!BB)i480)$ЙHK@T::M]{Yi{S4ÙR@hq(hq MYR;!K@;hՆ< Buv3Zq*.g-=uE}J{FyŜ4R&j]V!/u2`5>!2@bۃB6z/`*Zg$ Kz F{3yJcŬSBs|12JTԂ*a:i "0cT}-sk*K@sUI6Qt}- ZU&2rx Hփu䋦 AQ$rDOk fC֐0ӥ`)ͭcDF%ȁ/CPF[[@܎mCÏ V"YԏoT?/Ѵ\żs H6YK!VI;IƇ`El]Fw}4.\t\&c%VZm=Rɭ_b* 7DR$.R75l5b*H`]eے3-hg[X+} yzr KWWiT ޲ICyhp;#(("cQ8fWc݆`m RgjLZ5J/?Aj8x0"TpV,SaXYw 1#tȳL+ (ZrE|An{ctb84X#YQ ୻A)Zk7UPr ho*mU MXHY/>4mzt lQZu!h>pw^yޞ"rq{-ǴCnj̄xB`.`;XՁؾ ͪd6v Ōg609k ] `L ʷ 3Rgc  ލ%9rE7* E܁D;uRn8t*T3ĵ:[.2%etigPOt4=vP*'sW 46)ծ <@Wa~G?u7, `\8˜A 8FBr#5H*ԃp]oL]B*B<mT6cR- iko֑͊xv=Դ=ɗMLZL`2bHR{'k9V]vKxs(TSU#l18 bQ1*NZ7telp AP+ٌBj4SpfT8IOf+Q`cJ C\q'ɠ\ .JJzN\,jUY.i \%tǬʬd n Z!Kք2`4Xd*BN>z cv'^|zz#_Y u!e2@d8g_~n7u,+\* b KV#݃ٛ=tX?~Aw֏sHg<\z97H=eDi8/F#bS7Ɵgӌh tUfG_!~Ss'q=g=cߛTS΀ӂTmF"_:8J[m l۫/.LU2Q1>-ÊqT ln2^N"ac2Za ol4Ѽ4>ile7%~UvzECO$6O,\"6El.bs\"6El.bs\"6El.bs\"6El.bs\"6El.bs\"6Elj(.V$6GňGv楈w^l(ADl Vt۫/yv̒/̾/!MREh.Bg*4ƊSx ٴEۘsϮqJCFIUi|fBio]Ab~ryhM^>ӳCiƧw~Nxo?\Fׯ!߶vởrsNn?s{C8-]޶ſ/}pU{G/8zsp\81} >9!\pֻv~w/>;Gχ=thwyIܞ+v,Wh~]zJ˿>tɿOcyLũ:au27qx)Y#.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.!.z]:/{meA7oV<`{d>'T1r ^Lb6,^d" P{Vx|ok4: G@76ژg,~Z8wşg.'y::۳O .8nBFkl qcF _4K4gQZ.ʏ]:@OjG:gZHے3cMRswWf _q |+ Pq^Q>QJ+DT2fuJ`X9Ћ4{ шkVNvc⟻z(fGNj Ka5L,]bv0t)O@KaZ=ܖ5bvX/Ԭ&UzJ&v+ 
\y/m.Y\8?5Ŝ-y֡MxokO&WQEr"}$4SXg䅧3ϠUn`3'23p.8133133133133133133133133133133133133133133133133133133133133133133133Wdf]u`~}o; .Ջ}o62бg%y/ D/{a-^@&<{Dx/Fe{/\5`q1o8e``%ԗhD-E^]VnU}-p3_d7r~`ӽYWc3{W6׺ %6 WG=uʦW*7y֘[5Oݕo_ޡv~?.{V%0/8p6J6} @a۔yܧIWuߥݮ{9<:G )зˇٻDlewv!;DJR&Jzs<t{y0 . xz4T>uwވWB(5QMF^ d(E}Jp+1 -Hz H҇*JDS^~xz<Q9_P`C%&\T;n}72{,8Mr]7u,l?߮g܋@Щ=亻~g;ċNg۸ȵ>?/?%ؕi pϞVw;cze ,~kv`ctNm=J[Hu$HeW.+3\Fuɉ`UŌpW>;HŸ^m r)ƃZU%[(SKҎ2B462SƢ lw=@K W)ZJ 2#gi] iQqx5FN[uqjj}}v.GKG2,m>[s#Qh@&XOh@&Ws ړo %T_ah$bWn'^~ԏ3vQplIx2 \̮8$3CAS4ނ]flM9"~Ȼcw8f:|ǧA(`H{9"/pZ")  YRЉH'TKڳkۃt:h6`e<ϲq{LIs&M"H~GeO q(X]={/-d, 8LI7TxO՚ǨXm:[{ eeDwby3~%nju"7^xJtD8mA  1pm&vcuںOǚq?@hwV)mV~ͺ&H@)Y/sQyKeG$E#Il(x-ѴFb/@}MAs9]̣x]TGK%ˑv#HM WMQn 7T̚t~~גŅ *zn/NM-,.2?3mU`(I:M3wc.0tM;&'N#n"GgM̮\⨓+2քQB95$ryÅ/7гFp~S w:m\eԂx WEПX!CY\~ՂdJ[ p%br_)o5ŭ[<+9G?:+43|svom7gi㬌lu~M6ݫusȴ Fv+ur%4*A[UOwBwx~~û!|տ~ǫ~xEzwWO L f/_D¯/!M?~%©q܈f1uu⬩?:~߾7d:P*g="z#\?ӯ 7S_Iw !d3TY`ȕK>yh^4o1!uDBC>fpDu:ұ^1[S A0%ov=^&_ q I 6A[Fcȍ G׭Ir2SI1%I O y; RW:2Uտ9yL',0*@4"DI4J$ÿ.y)A\~2εN2N1Ln/ds멆b ×A8,ub쩻 e7“r8~qgf7*r5K85m(lf$EQS0pfb1,B>JHar<([ Џu7oΚyl-E[JSk-3^j|/4ggMGkB98 LD˔1Y }nk"KxGo3o$g'AWPq);wm#)8oH%cm}S2Y>G|4./hC%I*rs$3ʃqWB #AVިr?,C_|gNz >x"9*;I !) N8挊IA Y*((zb[7@DN&u4qZ0|Qy 8=Re:s®b쏨lo#zW_E8{e qvug,xyanR1ա?bex88]S3zM[|x@rK~εc1?{jϭϔMu5-j)\hv`ctNm=J[Hu$Hpm\ * )Z:1;*Wiz ]2Kx񝕷2Q$Qe(ild* GU;ӄiu %׫}({^B'deZx*oiWceĶƸg0 _?ai!B#Aɬ8m\n|4/ IQ] Qx9s4 -j]muY+YְIbXDERDNNY2mZ5itQ0lkG7=\|@f>~r zF}k @HI]@ !E.$(e|L2J5hWyV1rvZMdx*ˆXL-.~j8Z:bZ/.¸{\qqcP2N^'*0 Iq)D`[ X;H: YbR=.iǾx( a˨}ng#\ X+Ţ ~XBȜTM8+ *S%+\:c !Kv{wAKtjϚѱ{BFpPF'=3 ly`ByFAD9i p|٣y}IUL=0%$jLotٓ]a U+>gղ!:RbJ@2KyH$dHiA 5,o,{9hEm弻ݭXwteC#,tx~9Oo _p@Wd?)&0LY8O!oنlׯtȬXBג@ "\7.hQZXr;眙tA(eݵl&?;i>Rl&Y$f((#-SN8e%#)*w-JEɑ@*Bxpr'* he"5K.ǯ U<-JbzJ|miq:4V2>,H{un VS"*I"A(J=` Tܓo#ϩp:$K;(Oҹ%ҚH5h9QCJ9- Ib Ok%zZ@A6UJ0= .ct7l/m&WCd<ղ7ڦl̀* AœT+5pRs FxRɻ#A!roC  #! 
4gz-f2H†بlo75G#z{ݡ~![DpLsw{[lͰ!^Lk249'xrt8@w) 0yƠQ>C΍%=t~{鄞y0((ZP6,O*RqJA'nr*\$d: Hcp*C0$ ^ )/S+fªHm@^zdtn"?`:82 Q.WSOcL?l.'BU^yuB;NRݒQɷpJ2\_瘢?}~sP9/7]|a.瓕Itx}u> 8lZ_^?!ZleK5ˁ$:CҼݿGSrOd_-ynt}ڡ/<ԯ]dPfNպh}5.s[+vY!}?PV"q|($C-m]̮]H[ >C?Zѹ;!Ȼ;NW;pOKC h}G.;8ɱ~F݅ #V|)̅Q K-AQҸ.nW??kܵOnս]=# sDG߆ح~0vu#߭Y߭ӯ۠YotKj.YGJosy#XM̞mztױݳGGI{+ Zr9&3*ɫNMSX݌/T8 * QbXϬwl6=6BZb~Ymx`ڔls/mz?noc&FMLC6-V\];eWWc%}W7Y''Z}ڎ sks|GalB>>}m>O>;W}]f-o|?;UǿklE}&whZ-RVZ:՚9L땘2Al2J ?e*#yRX$tJT@%V9<Ng]ou=V'\et \%5)pZȫzL4qo m h%TInɉq{Uk=ָ7@stI/&Y_1cwk[v[={%mbj_4f)Lu`ׂ] gMiYr*rxfT\'hW?֛iz\o?FHzBp6d*kJ+=vR\-?=}G!fhr^jZrp퐟ZZ೼u.F IlصG߷I?|||oHΞ1\N\:|c VBG>=RHjRj|ٿg5h-rBQ<`v;Ka ɣ_dgjS7/.\{lj0@cz _hkK׳}jAxG}pmނ"kGSvXmrоi0{,^~o*oT;n*oPb7#o%@1=9|NʺJeos. u0[v -[2c 9"MAq $Q!Ӹld֡K#Q(2*sI>F)CI#u2 p:EEwJA42id,Uaa1 ۢXxľ;="CfwYf/iPhyt9qNI[JR*:ɑʫ蓶hF0i),r@G˹ G_7qa+(f6j@.1xa$hS4b8ǂŴc_ԦeQGv^84L`@FFͽ1V,FcFUpALVoRiT̐ G %fB?8AFЩ)] a1qک_lDf`<D,""+G7i<3 =1NrN9ʼK U:Ez| m&H?e (s @1Yp1q Z|GDܯInS8JFYlNDA* R6 R5@BxiGUȨN|ݶ]O~rv3bݽJN ŝ8/g6UHµ>J=#E])K?dm2ېOZh\ T+xƅ^ZXr;眙tA(eݵl&?;i>Rl&Y$f((#-SN8e%#)*w-J4]lq &^,7^9|ԤmOlNt o'0M„UW9ڏ[d|6i&gOZQj>wm ew/v~?  _ǹv,K )R&)[=8$M5%J%ΰꪻu[nP`)A~N5Zy,{MHPs1gQRp \7dSexByÁݜk x5k ܯ>NJ<i]*֧l  )mNr,jI +h\RpY႗[XˍSR0 F!jMFIV2"QFzI L@cTJ*t3œӎ~sXo;ݡ~w;DpSs%&;67Ӥ,M= ufam"ֶVL—ܸjoh_VUU\}[nnz^&}$M 2+-c%R{.vi㝆cM]- gtmQ~; F@p`^Bk ߔad)y-HݼV63,:%c6(Ĕq%Ԧ=(Xre%"xH dqO,R<2KyyTYrS/ϗz‡WR:}ןԟ~ҜK9?!`է TxkpIB]z /_Ӣ[^Hwf:Pmxrs/cIדp?q{'<ඇV'YXa^T DAj7n0<%q__u,P6Gh,|»VSYBn6.^'<E 94Jg(2U VU4F" ZMo,Ф C1s DN0NJVqv.`蛓|6r#rfKbQJX:՝nJaze\䃙F^|O柿s ׉GP0vB#L+tr.݉NåޜLg`/OP͵_%P0 4+:mq)C H ˥CRqE Y[>:?KZF*/c)LeHQ}ִWY/ok5Dj%$R\'n˶uFa^)9It k,!arv *aCLE}{r3¯ʻEEpF. 
siFq88ǖ۱]p~U7f4abu$xgO鲫͂v,a"0W6 FNax4izzr0pwJ^ۇY5R{=Α1qŸ4cǠQFĿrW޾gwg߾?D7g߽UO0nX{}PC;]ϛwz:ܳk*]uXͻךJ(P|0tr=#TT H]=/3oRIpM9.5"0$;~ݥL Ȗ8]ܙ$mIm~ֵ1, *0w6Njrd9φq0S 6RHH*gXB].e!9TG0\Zh0*x"y% cE#20#,57߄ʳ uT'ij^'a'VԑwbL W|h~nηni؛pYRhSo fwgݵE~~>8~uH}ܸ 횇y|X?>Y3pwa5"\ԾdlKm*9Q^Fό=wrw"O"beDH FϜXf 1!A)o8CY;aR/1",DDꥦH0<)c"ҙ{6qvk\y5y&CroOW=l f׍2w4׳l &c_"zjÌ+$:+Cd"CΖ`=Ӗ&rL`1VYE 62b RLtd>g0}]1kOv~qs\)j*+zfHRq*ٙJMݕЀHx"eKC+&TӒȔsqK[c{؁^H4x!&S*Ô{w*UpL3-PF#ϝ[A/a ܴ֯jw˓ɧؚ7F3KUZ<{`k$G&eǬF:G]ͩ=UO:uo.:"L=B 0EOiȧR4IH &) ~H u}3h4UY(F|FS \8K/M w=c%w`.Ob' Dr :g F;ǝoq<.3<}+[Ԯ|30* 1J!^ڠLiL ihWwa}/f!~[l {e%a)FFam!F g(Dڒ[|sO!i#68R$#Z{A KfJ&e<Ng/@LƂGf2_ ݭq(.=:z$tDćCG~`G0ĠsT'ojs.`9V&W I{2@JF/tJO0{t$ޙqd 4 L|k_ `n %MisB Iy ӤDI]ܺR^u\zckO3 yX[7N}n;y3Z};5~_^ߓ +l2CizՑн,is+OKUڼ>}s-gu ᫅~vz&CՖ ꑶTќz*MMTJ1u3"/&/m΃bխ=U ͦN[3 Gsw24bW`tp@V\&U@E,;Wj4Pw{}'`\߹ C-|-~_\CWE`?Y'ZwymT+9w)9I@lRV~r)e?W7/P(FÁPPʔHIe=*KFX~}}}[Sizc'KA`kRב2{)HUtb;EFW++ N'\tQW"JB`ȃtAZJ4Lzc\3e[U0e]᪖[Wa``Ͱ]]s[_W#+ĵdtثRrtA] OYWW|C q +=<JZWԕ AuL i]WHiTUu:+ĵ ]WH:ԕ^to辺`$]!. :RԺ꤮4`C<\Ֆg=GmsW[29Bu-[Ե* Z'@JS]JIu KFW+=]!~)k5VUwtF,2Fg ZEEW@˕]W@)D袮b\RZ4ד <RNI+0e))m V0\iUm[p(MdedծIϕLkIFWk]!^WHuUu% !]!pp+mkNIWԕFֶ8!N5m0;?M5+C&XݺЯ?l{3nWuL]WW*іy5Fx}!T^+½=˾x3qY~EcVb\Eχ`e9}^ٳ2ndenVB/S~wCՎ ҋrZ$@U2rf?\?W^E#&QYzE7\dx73IuH_ŊKeo~~%g7Vu(θY$u2^,r;[W/]c7]j7V}?(nzoYkRkBPIx'Ӈ%hw}dX:BO9w=orw~wJ-GR=b]Jr'aG US뼸hXj/~%nomYl٬e%?gu+ȵ5voUAe5 =ǵjES|s)uGKFU(uѱ<'V-:w~K!{z!j/7nU C{UE<͖ rs۟k@VoF<3rPf3:X?.ןM1nr{W({ =kg9Tn3!VzY:;7 Ө nw5V`L\" , i})ITN4Ï#M|{|玃 Hhs^vG!ckgG_z'ڹW1-]sNW!nøh~nNަݜct{7SR=F<Oǣd .HU=,@]\|0Νo;.ry}'0ft5mİrmމa^>Іf}q)y3S>=y7'l=RV0R"ew(Tn4 km3/9*^)LRaB+B`-]!$+5oGJ듮:+kѯ< JW4]! 
[binary gzip payload of var/home/core/zuul-output/logs/kubelet.log.gz — compressed data, no recoverable text]
0bɶ2^2fӯFfK=d"0i#),PrMNI(jK'|&&."c:$] XxD58[y^Pyr`x;xr|Ȉ:ļ]|q6Mcү `7ÄW J#[[rtݻ Fcq bAZsT: KjMT*'k< )P-ыT)8 mvY&tRؚ UlS&-3B"ˬ蔗>[ J[Ԡ*P9R>( uf<&^Un jen}[eA2ŻiNў5~Lb&}IdΕ2 w'ia7Qy(eh4=o IfG|ڧPRѿy&k3檼)R:RE.: Q/ {C?pw`8%YN2>iue6Ih6PZQ`hK@rDz o>JYUKN#cA"KS5g4 T|&$&3bM_ r@&Xhh "ԤG_ -]2Z-t@*F{ILs;F5Wgi4xԳ\jbP/IȈT̖| Y,|-;QIQٶzq *EB.a-c"T^b`^;-M, Y3qԳ^>5C /JXqȼgfӪQʴTXJ6(`z5[bTjT*N?wI(/Ar׿uI:A2:^J%mF j!-oYG0$m哯+%Aڂw$%A9Fэ8Vk_YCV۴"ޘ)*U&k|pP1kTYOfQd:j[tu+OTPkDvU>XV[s@&KP(}։$띮41bǣcUk U;vxz=k é¹I7V_@|h$*dEH&D6AF#p2!>K5>5tR٫l~w; t>d*/PN юL/aTTY5ol Rֻll~:[r9O~w!zx[Zz 7Fk-jdD9)$b:"#\03 y At9- $ܡe)<6>IݼFe:s!%Lnݻ/쪲/ew޽^ xv1Nf'~r1T5feý$s[.|e~?̓,7*zuux=YvrEHVsٳ*5LBf\78?K6Ce?oACΓUi>_F^uʿu]}u<9[qXBc09]vI)"& 3SO<] |PjgǒL=m6Lj?|6\u|ڸnlVmaO9ہ:Ԏ6 •>#\6`o ׽~}Rao&R,jz3mwbc* tbД6iϭj4  O>':"n}>TzAcj"7ƿW5$?5d} STlHU)rȭpN/x=黨4_x- ǐ0~H_O=:~)ўe70\7UJ?a .al-f%e/2J-|ﶘ*rp͛ΓȠ;8t2QZ`E7gʂEHހIEj*Co|?{Ƒ¿.FGˀuK$w,p!Z"W=CRš(r$Qv~ȜttuUףg+Lv.XLȔi"c㈚48?{m\M +% 18vhrA 4g[{F22QB9>jD>(N&ܝ]MϏ♩fjI䍣3a#P([+\t7\ Z9PF0.YW폯W͞Wk/*har#] Ǻ$Um8^q6dnivw=𿵙woٚ |љs?Hgͻ.HP?^LoN`r֒-Xy[3%YmfYXz8:j|88ut mmͭjX ,Zd{=ʱ ˖czD{&KC*Q&~m4{?:Eq/?;y߽>|wɛ9ykuo`}N%]yhB͏7lۛ7h$جb/>oO˯b)@>|E%3kzaYR B.b&wN%u9-~{]Ԯ? ш!vEͳڙh^< Եv$x ?"`M!&*I2z(ނ:}4>;m܅'unp+^=aQrhu M-$ 䈣׃L",֖snuѨ3ص55mctٗ^tvYӸh-D`R,񧑺b!K]$T?+k9\zzN.R|mշч㒴S_D &(Uҧ)1Vrɕ;.YQH/M!y@.ljT&Ho0⩣FksNៜ+Եczgݹ'nhCJo?w&>k'U½}f'X75o1bN y15'+&vb6biS\a <8Mhc^U%x>TRj}5bIn7PJm]{7دs`;Hpuź/=յ9eQ<٦zgR>7D 3B˸%«&G߮ZcH7Md36狦 𛱝0eI%ޝaJ= ;:]ck5F.d~K)ڦ=DPZbWKj]-%vĮ_ZbWZR]-%vĮZbWKj]-%vĮZbW˹N]-%v,{l IKj]-%vĮZbWKjAL9t1_ɕ Uj蓈Em+'AQq \>w);GGf{{e5D"A;"F%! Y-C`HLƱ=1MךROmi.*`wItc+#U!2bu֝!֪BBZ-s|_j#{7MQ8|:OVtKj}dU%grOdXZ DGJe+3KCWQ B + #OY'[o`46 ڭ-* Km`uĮXmMr"BLhR /yN0<κs ^u{j˞lWazNPv^Q0K YT BFg@ x\Opfu&*˥0^iGy*Yi | nIɁsK}hܲ3ݧKܒ)c*d30"8\:Iу#I֠&3[~-G5enf;g׍xLzI^k 8n%i! 
h"|b\U:]TAB0 %\43чLBL p8ٷdZߵ~gݹ{/P, }9%Ldu/thpr9r1ɳ^_7s՛MRxKO]4 *`>7f`R* 1K!#x*YlK,ɽ)أQ{Xz+$a`<m~鲃<nHA-AF3Yڜsa?e]1rY8HKU60)yH9&IW<3c&Ge6@(QԘ.6g%.7Ogԭ\[o~kJ2\JC'i89u.e06U V¤ۯ-d=9a]XBge-U&$>e#UlFݳL3"$`H5'2feDZ $R^!Cb&%g56%ma-`Ηj\{OM7xYa,}doΕHgY{^ [\ͩ5Q6=cy0&ƻbo/$]7}hߎ0M/ ub/O7^^C^0ON'l hS'5h%)u6Z5$]#*kJ)(с B0BaQ$E!JeH&dKB +C׻}R>"F)i8?MD]a")D!&#bFAl(Z+\1 L!$iLRNbSC- гelS146~%$Sa@v&GD6vC-'x3ev1Db5ӓ".GbguvZ^Yej{FO&9B %Yb2@x樜|V>^m5XTBDLEeR1T_c&2,J1HFf܎Jz4@,N_ /+3ޱgyZT~ 0OO?D&e*xX ;eDqVm[;;n* Y k4ZMm1Y,(*طΓ o) ]cn&b j7ӎCQ[Em)썁0lL٘#6LF춪1R(Gd4Wkۓ?^{ %4[\D(,G("L&Nuк`d3qnީ_x,L?EDՀ"XVV9=1-͚ &!HC1g7:!۽]CNZz(uFPTq&f6K(XؓV:bj8{D|(<Ցqqޭ:iɡpqxI(V ,Bm@jD &&KJIrjys. 6ӎC!aN NͿSF޺,^?ߋ27D?>MQһVs5([NR : JT$ %(C~kc T_8H"[%bѾ֙eP4`I1Y(^tw1և X0&B ("öv$+rj; Er m?8-m,Y3qޮ3uzrc)2/,aj%3Y,i5 }3dAJ\XcTj0& 駮HJDEiV}R!t4X5%cՌj?iۑ'БX!%Eڂbkëx5Z9f 8+yw+xɻ7v2HGomAZB$ EP9kTYNQEY`m wDXYj9JE5>ֱ{sLH&dgEyI|%wJ _8l74 ;4PЍC󺰦0MIy1z"QP CR>X*|N&<ħiI$>c:)Uh]_}3ֱmC3 !}(约U$eG_NF~]Û߯Uͯ^}7]lưqNQ:~?Qj1GF&IE şRdRc:'HlRBtq'v DHj mb.O(Hq1>Q]A/dvu1+B.=Kz5]5rˋ_&'/տQ9«'=]}_X2v@y4%T0+k`n|:.>ԯVG:g"%0;IP+$|M<zR\2\Ug-spBaxW6)ֺz936M$ 5F{j=Qh(\ո#4ܡ݅<̱J3&ߖldzw&aKX?Toymu~0߀щdbbR@y<]eX]EԆxU9A &YEryADY&'7$#QIQFkB!FD4C˫kECCPHKs ̦kPVP%8]&B y4gRw6_Nفɍ1>7>-nfgO_3y:F&uk>4F_X!慒9iVh$Z.zT[8M'Mh؟&-1.M~`PA X>"$A+*SDΡs>cB R Lۀ$8 qvuJ?ϙɧ{,-nͪjze}}aCBKx) [a2ZgcBQfƋxt =8Q($[P5X[j[- "( 3T¹ AJCW?q9;`ex%iVc)!Ւm ɇ!-V*!G?X}3>cH-vqd6DZ{vva;و/jFןODzxم2Kp"a$wdtT!Ȭa<0~w?}y97}׵:W/'HGWYXLg/ys 08 ?hҽ2WQrg3cVgkJ6-s8qZgp%%5GshB/y(zGT1UhQȹcヌ1 蟯>q[*, oE񚼾f/O||eՈ_|R/j@k.}z>H2[c{EC0z_$s%6M~Ow^Z-7"vzP3ok_^rbHD#I=įN+OW#5++agD,?tz,F3qLknr^VɫT&wG(}̙}x`Vv@ ϭַߝMbZfWhpwj"jέUKqXWRMܲau Z;Tr-1}CiV#^9 ϲA'C"jkq H pD01:7gd3Q(-GJT6~Km%Y?wf6V\DbDB$EL u"x0uTCPdЫ;tZ|] yc3zQ祯' [=E+c7>y k'Gen\2D (nR](E'+Rk:a:6 Ŝ'u{{e(KFR 6=WX gKǔ8zٌL(3ɳ$wem0GY4lt>^ku g_o 9rֻ(/ze}u&s,!N) ]%t*@ݹ&.źǗqSh21jV$zʲ;-h9Ԏ,OA[,kD,+ .Ik}FU jbSYw3uڲ&F K L|AAF (ظ%J5 G嶫f{lY Y`gs ^` )*"3 ieWZxIQa[; J+dE. 
, Er m?8-m,Y3qv:VEI 1ya&#١fN}a^( RL)i#QZ痢+X!$kmH _60yy S⚯אٴ(y NTWUSUAK&Epʅȍ EʘLj$J\P+r9?ko]JARE1 EL?iA`<jav#r NiUk8*ȦP9K(; F{m$Rx^0pK8kGFY)M¬x +9a-cD$AH5{'ȡd^|u)>=@8$Gi*MEZ{̋Zqp& ahP1ceudӐCWͼ:ՙޥ6Tb)Eyʼ_6@" mQߖ>WdX~ߴQ'kO_"Ϳ 3s`D_)b+t@`A.Y5)Oc.*5O'cmY7 ۱ώb~a]߰ٶHi b|2uHѨkru(ԵXp幥瀵L/k.EbS\cZcĜE NVFb})Ce:H;~o<4V)B˱H*%QZ0ѦF'S̖ZXVȒ0 F!BQRb%QFzI L@cTJ*t3œӎ[<\3#XA/`%;JROh0(J`ΑVaEJX'QZÓ [ߚFZgy+b0ㆋD#GE"8h*[Pb)e x #if9crSԾ,|P#T 1(Ĕq%Ԧ7gQPKD(ZRb$Ƅx['gD^^~Pd^缻bRD-ȁ1"ed>C{ͤ3L]$ &qa&E1zI])Ϊ{p68Er-;EcF\xl2ΫWTnOjʟB VhI`) ƴ+ H8,4|'\B .<-(LU% L##5x,RD[37QIlBwy ĠViǬQFYJyRSZ"Gg;g25=':ՒXh`y6D㩲bؘ:pFɠ Tڀ0o@jGrVkb|A&5Vv Z5xb)BY WApKXB$]óT(Pܛ0Us8=# `%B  zA:͹< Y^D͈[\_jkګ(Rhl7vf}֌j2! Pp8a Bʻ m5 I&F(Cګ MD[d&9vik_瑆 ̱e*!{*@$ZH S.0`I:Y'e2b ߽JPC}DX~%Ao&AUxN: h>׎]`@ p݅LB/&S0X(8R!BB&hLx = N9qh|J,Y ;#S,~_4^i^GDj%J%u rk`om0^$L^uivv:=2Lsu]yvEpA. 7v׽ꇹ] ~Oܚ|r{a=Ci$Gb|HaH0 :̢)s0,m԰JL;pл'z0fߝn9G%hM6tqNR=k̽a8vw_"wUL~͎E[.*@'Y];w\wt _~|޽M}ywkL~5: 7 9:Z@!|kCz:94W huuuX͇.5BQzĥ{-@z𰋀M $+ yJlZv': ќ|35 0$hq¸KC2%p1|Ue4(?4}~ 5: Io#(l S K 3Iq)gOSNsi8b`(2d෉Y`=GD͛` }0)>U`O8rsĉ(2hR (LA[V rO 8BES9# Ym9e*0KXp%dMՌmaw~+!s?O8L5lu _7Aq-Ffӑ I.1Z=H_&SiKN8qgnT7=ƕ'u8?0a`0ߗ0iv*ޑImXqdc>INJ6#M֞^XSqEȭMt˦ $f7mЅ?.&m;_gM.3K8k;vۦw' ­pv}[WMP[K tW_qY㉙zK#jַ_oíf}YP󇞦cmq(]r_ô' xqug]3 p3DD4sQZhU]ko\7+?cU|`ݙŀ,6É-YҨ%Y}uɪSYޥ{\}حVvl0պbm1ՅBo[׭؍ $+~p&Ns-=+qO͙K,G~}|~m??aﯨ`uuޮyPV Uʖ}2E[g-īi;2TJs6wo箥XK*YBkH`JNYzi A]-hjf|+ xk_l|]vMuNY5߆hn8>pR#+~p9ȶJ~ %;gv s/0'"?v/+1GV@rq1E"k&ļ[V.Y=ҖobuU`m;Ԭ`x껔X9nS ZVJb-z  5KZ2 ޙVtq9h`E\]V牝NJt1Y[<}gؐz\%1;B|.'krmR^ɟ(Hl랡"%^#F/ۨW+zAu;I1:Jp]0ƚՐ}1LޝPDv+kʮ,`ɨM뙸cb%88Pm]o>|nw~?ߖГV)E4gqRkکR$N6P &4ԻJȍD .x}%gmfYڷa[o7 Db fg#"`yNǧg ¦F.`rv"pQ)50LR*j6uƩD D@m'ގjN6 st)%Zbu9jd XQXYWb}VOZ}zӨ/05^*9+db,iȠ9T' sk)q D1g$T;Yi:^6VScat(s8͜qWQog< 9 eO:fasuMN?탿a~?bgFV@WW}hD'W0ZZ@0/I.cZb"6%6^Ph6)D \`htRc3Ici2bO3~.G-=;v.j[C"s>P4/H%%lu*{kܓ&5ܕjC̊ fh*Z Q]mFk.iqF\j(Duyvi܏KQ"r 4x,"ɈvC;B.DA1RW}9[WrbPc.pyOaKs7 $ "C7dhFv}LKn%U(iǑr3xj S]ݦis<i2.wxg| LPM db3Դkk1ڻjK-*Ȓs檓Oyc'!?ƿشCY;\{W_UkvՏ3dpY>c0%ϦG'u@{?=#c,g«x:mëw >>vxt 
Fxr>B<˨28?ig.g_~YilY5yVƏ.pK˂!X6e4V/lIsY.+^Yn]}|0hఴD\E ~wK{WZ |j=v2ذFfGaIbV_@{uquow+}/[D{{o?|_opt~riF\>[Syޘz7}]feqϺ@T_O^l_ :k},Ko4Ba?9H\>-0OK5g nvoNRJ?o+ODq%RڬD&WC86u9ƙZԐsgӦ[s`f[؃rD"ğztxxnK{8 {c\] &[?X'"os3 &Zwϵ=3`4v炅7cvA06T^r +iIM5gpG[83 OբN&)ΘqXij 7ĎM;Ζ1U@#cB )mhD\{O9(T+I=SN7{@OҺX+fɉb̀(eiq7pB ?{a L|ЃӱeZ"N] Hc6= P ǀ 2,m >7` X̉KjOhZ@>DT?{ȍe`, &e0,f `g$!l"Y6j#ݝaٶdr$*%\3 d:_rr^qd/,~mZ@Vǃ#Ʋr۰ ~OXTuW$ 8˨ϣVwe%,6L1*e9ӣl' +p#,h+{uJd˨[P(yTAJD_#Kk=xP@ʍ`QJ1K46Q#*$O dE^H"y>5\N.;$VV0ZةP/B(R:+:#>8ޟATSXD .1#d*І"&=zVNlD)XZ#eLGj Fk*@rlf-A I_6В46µGH<kR#eqg(tB%!w-1`Qi#b (R!B:J&: (+(߅XD.U2 ^k>V-iӫ͆(Fӥ`eX30+&cMHj.X豈K1[ 4b(u.U~UHX;7δP GRq3٬ŭ'~P*j*V%]U)DFO ˗[72 :W!N cWB@h4XR/C+P6<F&P @"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL TO t@aAZcd-"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL beo:%&VaAZ@#d9Qt&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b#fy攘@(;&Ud@֚g@!Hq)1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@zTJ.qYjZz.ۋ ^@ciXw?x%Ziq.?5Aqka#_f>*/U^-l Ϯ_hT (+C^UBUnj ?o.EŞl.|= <]6\/tp=IxkG*w[]yY%n V|GU۩=gl2b)nU\!r]X%{j*i}ՈlvCoZj"roK.Z(/l=VmoBϮ_T!br3J}Y\%_&uI=r$"1VFIҥbRZ839C ʈ]!54U8 qFqyt|mk(;# ׃Y ɼ<[ϡ۷쳣fk:Z c>Z[g|qT?-|pô0co]N俸.s,H/a1BtGN$])X.:w&8["Gz+s*GWKLzd uHP |d3;%D=v~}@I'jWLO~>Pk;|%o&´m>@v?,S=y1u^({YAݿ3kf4`WŮJ!+8wc&ṯ?cO~u3n:)~.X?%띧fc5oulrtcܨ<@H9 I8TމTe4[TW1ÊKXMiq^ZANuKɲvzŜμ3SzyY|ɎtFx(<뙯C)9HlIiE+.VuWt WխoOs [[]^sv,:[ uAPhqۮ(v>a]OpSYOtyS[Ÿ=Uεe':wi#x{^`7XNt?? @̉S)ٻ{^eJS,uG|sϯ=we;]o\'{1Y3|m 6OeEs|.⠰cf,BVBAy6 *dxl[Xs%(lMckL#L5kQ"MeRg(#%f?Ͼ/Ztg[қj4̮{2K~4_&M3P"@yj_Um67V?aфfu b!;b9UUUPEW9 e/wh[޴UasqE(N'u~kߌF.}h}uͱ?/7y}mo'wf>*:Bh9ʩo$ z&l$RAg8!%݌J[߷ mؚ[7jryqy<946K/umk%F}ٵ)fʉ}*B(Y7e,N1{ʆkK :nins" WjJBzM71Cc }!b)xZ|۬] u;*˰utt+$=֣6yc^`l\zaz1mPI~Z6u6p 751'c2$-ss]a 'sWg)/Y0)'Z޸kdƓM0Făʸdkt@TGd+ )L97Ι;dha,^lgD`~i33e~֛8}n _!!0]NSM1 Dx-VgD&5Q\$ ^( {V,$LbntܱF*r0 eŒ6zZq>zN毨Nzm1Qz=܋sn9om yfp<8"}hkDn6M?} l{i(l|bMI嵣K#F]bWܘ8l"- uv@?ll2ޚ|u=˞|5Cp4-Y{cB˃Ctk|p`Z$N6ZIJ@m=Έ@iƿ^WP8_o&r󉞝O,{8D6&즒tKe㬲R:`1_piGX*N{.ћg$ͬ$Fp!RT[H W'at(ns/W/zw18hhu9HvFU!I7iNua4*KZj2uЊZѮ<翝_Rެt(|: ]3x~魒hJsfh<6 0/o5? 
A~{=,G֢gu7u ö]vsk4ˢ3b_*M 3\,ѫQ:w5j_ALS^\rz 8]QqC5t[u.kiǯY.DM ͸MKsc!iZZZ}]1 ]Q*}K z|8Ly3g+5$STq&y] Ov^tC[!..(^;t[/X6klc|%yM\9%AFHEIOc(go'}_x*ov;;}wp \Xyfpvq{ wg7ѝB }6|۾\|9zW@n zv$WON dH УQW u5\tVȨw9JS pBij}R)çUa#Sq&kݵf4BI<(,u]{y!s}waݻkG].;sY`8Lf1.rK#NO~Ȳ<5E⯊Xv;={&'AX EʂBUĻb1 Sۓ#F:.j>J`7i=K̈́lBrBq#1gQRp"H7,aR/1",D2"5eDDL ` Xy$RD3;d#gA]W9:Nw6r]3ɼmO>0/>6=,K#3BtJ[0w l@ZÊ+m htR8l D8DH|j F`^ d%%V[R2e 4FRK'9#9X<.,Rg2DOL~ q6'%U;Ŝ(aHf킡`t"һ ywCh*ua)LZl&ni aR]|;vN>^:530 38s$u],v{'I(F^k7dX_/;ɚ$úX?L70m'S3OW?( pԮj' bg\ &*OI]^''ڻyI.X_j9iT.ZMTnǠKak Ph=ыAE9L[Lw4X]7 L+)HcV{3hwۼMdlfUI {3`WN^ʒZa4ikO8:|ԍ˲ u<"K"AL`^Jm9*(\"BՒ#1&ޒl>;1 2]eig J[cD| 12@Ig1H,&I¤w#xUTlWgh\Wƌee\Duƛ(J)=+ B^+C\$0\c_X$vID_,4}GTIwQ,4HH!1U L##%x,RD[3'QIlBuy ȠViǬQFYJypY/5^#rgEΎa| izJpw[tGs W(Ppu9SVI IJmWI7o7 "U% |䵸25C S?nᬸ#$LW<9 GsjZɼT?OWί-5p肘yX^[v@Vt;r6bqG+Ck$;Gb|H7]Ðaguef_g9dԸvL'ݷ ={sx3oƝyv]vU#.bY2vT>?.u};|`\1ThS<ϺaY?Oop|_.? Ο`$H&CP@O$kL:Svu jJYS1n?-߆3n{ ]><"%I~(Oulgѱ ]4lE`I1OK_b? Ý0CVXj .ϛTtAzh<#"(9lVJHTh혞8$tHBR!vV"BMvNa?q`SC̗WqG/{J JcF)@6(SX8S+RώY~e5^Yĩ |! 
aֆbpBԘa-UTs Ja^i@@ 2*jRzU[,} *be#g+ *:]k,{-stҬ [ͺ!X^-8:CsU/R}qܘr.@g̱b>r$P9HQ[DH(򅖜N)IfxIv 5wY.4ℰO|됊&rr'ƾEqN\*-̜oop)ǣӚ@OסΦ뾳a+9]ttX?u5$[vαk2R:ͦ^yAҌO+st>SRbKu|p2O6+%L 6*u0EA3XR[eP(ʄ3[:gʄ4 $Ҟ, YDDZ+$h/yY.V+9o|#ܗh_Ȱ AA١7w%"Í&`J *$9+W]h٥C>P($PT8$1`lՖSLc} )hDU^(C0!e!@Z30{-#chnDdFR x!"DAjM=aB2b5R!DҘ#+d캡՚"rhO/ŁFdN䚥C緳l*e^>é}Ujj|VQ/0`q/PRjDD.Tq}= *ϣ`Ǝgb,7f\/&A<5 PjM`}U]܏o '\y`pV7Lnش%cRQr$0 EŇsVr|ghxs0ដ7N(w;4켶gp(O9ʳWr{: }eky}vN;kepG GwpKIUcki'Kv#,D֎]/m%Zeȍ59u$f{ [9?.&msJj]ԭ3V٭Kj>ݲumݴOz~hTR07>^qxnbꖎn@ʑh-zC5MwVNSk ޳<lnvB6ɵxc5@Fv=5|H/*'&Z Bۨ N3-q%<ѠJ"O"1qc3'Y%hLHPPVKaR/1",DDꥦAH0<)c"ҙ-q1WLNBnoz /nS1=_B1̸B*2D,;Il) :m9Lal"ymNcF1ZleF ( #*$;ASɼ}Y`'=veO[vsؚ7F3KVZ<{ qI$1ld;nu5T 6sSzŔLwFb[\U7-2%ZоoN(AS%>xf)OB+gj{u_1J"--]NjB7뗶b>۱eg 49>D>|C5`Շ]q> ^,Tr{AVxW٢q2̶1ZGv3qSb:7LjeK!r3vqy\ SUpTgHkSjyp"]v|[ -tͪOtaBt=" 5/jXGU }ʻm~%HE7-x%HY6G쑯97kk@j"`⚭x ֓j#'kha٫r۟Dmjllu.#2EBM ,={,Z:d _9j dUbΩ1 4c[5&Ɉ nm'yj)3.0M5Ukw]7"Ar2"Yl1*ɅjK]$0N ^\s eg틗O;y9\g ޭU`Jy&z()'\b\X_ӨS[ _,[[NSmk4Ywl6>\/uPXIs s9yq\piȢG<01v>ݹKֵs؝5hT}+m 6!\=ת\a_bgGww n 9\1p>h%mlW11 O1vv~-!4ضÓ*>?xțX.HA Nӳw.FB1#iAтKEi5.l12wvUƛ.ꃡzG= QWTS͖j Tk- sLd8M䰮p3s"=x..'bj^tztN&@Mf{GLzޏ7[]vۺgDeLH nĒˋ`G;=rM|ݑاN/r{35EvR%{b)bt(`z5U!b;G쮰Ĕē9XZ%8W7}iw_*LI^KM5T)Vx'Z55+r#cugSw>r,S6Ԁ~U7X㉷]|q&D ^s]|Kt|pztq|z{(хL.t]Gp$~|o]-j﫪ƭ$ TKtb#|źV;{C7&2}_QZɴA$JUvތ.FIǨޔMK&)5QsVdTCml X@m'ގjN6 st)%Zb u8j{}2t2ɣ2ngb=eivyZ?'UIj**?4zM,ƹlJN&(K/2h*I{jr !> B|Hu%] Cn59llRq8#<2,x$^/_8 eϊ7fasu}Oߝ꺥퓿a~?dFV@WW}hX'W0ZZ@0/I.cZbZ9ElJXWruFI!jA;b;;^HںјqGrtXۂӴ㱨m碶;1;/2 EX0|TR"Nyo{2I w}!f-#B3(Z V]mFk.iqF\j(솑ĹAŋۂ㱈&#!qą\pbC$FƄJ]} l]MBQ@=i݇;=4,ظy&+=DK5gJYr -H-"v)Gή3MdڰxX6uNӒ"ME/ iLl֐#s#{-^{^mɽEYr\ur.qq[pv<y2 l%ŦbNۥw7x )Hj$]GG UWُg#?h4rk>+ 7{?$,e/Ƴ^u?>ĄU9v' Sz j$MAjDe S*uT_dE0ɢ"\-=^}#~UX_MNX(^U%qw00q}|O=E`ks hGrI{/8?u&>4) 5*CKح ywLXW_sŲ(o02µsPo9oI/:g h48ݧ>Kߴ<5tܳ>K0b C]M8(½ @tl:lm2~ ӳ;1]N+QʺUąԐid[>ȭx8zzD"z(Az^kqbZD~$%"os3&Zuϵ=@c2v炅6cvA07T^+iIM5DG-݇]NlkBg̨1= #h㲳t,2{D! k)v3ǀ1Lj}GJJ-'pϬ#H~O޴レM)u #lXKC*EC#ջr-y8nnRM+Ys{ 3 DXmIad@u%;c5EuؾwVL% n-E(Faa00M\568pp`g3DYX? 
zD( "a+q&E,0J8#r PSgvB][%VcSYHv^WZ*bSk 9Dp V7RDAEn^F cx ;qQ+czj- &o&23S^Q=K^b^jdu8'cD)K1E w ]^ĂW4(TշQ.. Aɂ$@-X2x :pf0u:v%s"A4S&U h<"I.' @/B` * <( >2 9YXKa-й e6TgfJKk SZ\Ԩv |YY@x[v!+s%&}_vȎh2)^BER |sB%wqe1Z 4b (ukT\]wt""8Ή(,^q#Wz/n1[u%BdA2SoUV(yô~̑}'D  cxIN{#oTols/kř[ٽg8<&~@*)2,1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@O ꛉ!1 aA\-=RZCL=1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@O d@xp@x*`@x*N ފGWqi:mڛ~CUf(LI nL4'Y"fyYL%ﺘF9ru^}|t^6pWvR$}YxS_2TqCeˋӺ^/\ į[\d|Em?q/F_&i- [lD$3[l xKy(bdeVYpN $붑eG02΋^W]2Iʜ]"?C>HI<os=4xqG!s7m5┉5lLduB)e7dm,e)ʪTEgzCPy_g}řuJ<ݸM[x0N1i\8f  zUOmSy4ʴ«Qjj;qWϷ<7,SvR>U"蠠K謬BSWkeZf9E5>72}1^{42&5<%W! X;KWge&[i:y-_=>TM\}Ro=wz뀈[ Wxw~ 8؅{^sqeUV_3ѳpyI/WTϷezrz6oeU˪ݲrB:{nM 貵hCgip|_t84 q \η_ց(+j;E#١YNRp}KE]ɓ0=C)'Ż*\EŸϣ?Wg2`&=6~#f4kΕߟNr@٫ɯWշH{_xԮE>shRho@TQ)2(sQ٦/%빏m?d=̷h  qlvcdVOńBJLZ~/![`eyeiDiP_]fW!SDK,Dkr}cGAJJ"3KO=!иk{VѶ;Avڱ ӻvi!%]kby"+߱?euS/)"of?\D7;6xEuFҺ~\+?"~̊K_V?%NKD8.qvE#J^ҫ\)QyŜ[u }1V֕-/8 uLY9)+j2-?F͎zr\F/ۛSfn8wo֗Mjr/K_Ҳa~Ϗ~;t?_=u/?6sV[|t*UlΙ "(y\ D§,4̓/Bn AIQp;AGzEgZl^oDR8}xzrue#iܱ;tOOwd/ w{}%R?ervQLJEj[̩<,{YyyFisI/Iw& xLk%Ƭv0 ҍbӝAY00M+ .Yy(Iwˢm.g:i̓@~-^n$xULw6{)1)l>xv@0GV홟nWrY!hh ^攜 wN^9>:zl2ʙCĬ/I~ʸܲO17HIhf0鯗e2X)x=gn([+TǛN^y`0k##YiFu|~{ThxxV;ߍQW[cVB2DqSJa.]|)"w+96.s]zFmmm.@ĚVQdT%R!vꓺ3eUB' U\UkxߕvYB}(V/_rQkE3^Ԋapa#P<4l9TMq1ΒD&Bh1;+ \Y.tJnyuT wn`wQ>t>9Nr>NN'ZKs.\IҗttZΕd3Yȑ9.,l|LFe ײ-xzUxв!t@$9ͨ@"?KW&7}:f7e/]8ZX<в'}K>t.&hX4A>i3Zv.ay>O7 z@I6IgtV}f3-{AKsI=s\,9-a޺$ޞٲnRy6qG9շ""f&plBNbPl L}mZ6#~Uȍg2ȶ. .xthxF VT*P QIR2v%u.LMTBs.Qk˱ \ z7֖mv@lNLSǖes0y۫E<'d:ɩ'AξU֖M0olZ6q$1gm + -۫^S'|*:p9.JgS+9Q 3Y?7Gj: c\oͰ3Y-xKwqu&7wɉ9GYIY |6審ZC=YGs=[:RMF/E[../vʗ0!obX}^p&"O擙G?]/!Ӕz4h^]B rUDB/r ,/ݤ%?/F.Z.JDnfz 5/G].L ra{Y!`oQIT$]Λ ۾7CK1W7K CUU6aJ5op0BI@E `6 JD2//tib`35wYvɡq`1ɝ$Ns[||Փ栞H'?RW4܉l#u 7Y%Ɖ]PHaRH,ʈP{k5OriW XK&fodqx due's0ufpp ȗ7wu"n~ l!ٴxq Pf0ݜr.*"m!)!HOcHr^t|lx"j8uSFRQΌPe<6ON1ʎUʝP)&`yu֖#1| EYYcN$arq I(! 
KzN <2۶Eva&^N%$IF:9 Lq$Җ^PtO&sIDQ>Z6[ˮ(#}5LFrɮa29+izTή|{XȈ22x\J_`4F5;N#ꊉ(M99QHk裞}?HgF2vOfSW]r 3"2S >nB2zÛ{eHS*T(Q׉5S˿aaO0d@3>;%r$Y#h98 [)±P 1]wwn4a/־0r9P=N&p`ػQdY`P6+l~Ʌ;/bZKA/HeÊpp2Dn0|oYϞOaZL<ӼS-]okM_Q[@? {H:X V ?('Qcx~N l$Qo_WYLwʚiMwmۻ ԢCA( -\MU^LG7zfG7H"[]rN'X0 _[ 1M1W"nd ex/w׹K7'(:1*+s-U QCXORtS rqxFu{r)6.Z\`*;[>Gw>X Qks?W9~MIj6 <.sM _?nz3\xYB)%+0LB0)$]e΄?/ڏ>D+NbF:c0_U9uI4/8+)"my6K2/ƴۚYY"RNJLSfN"HYZdLrD5jk_jD|.}9rUDS>74͖/vgr8;Ȥ%?ŵNo4H+L%0SB) {wc@,l[YxQ(hpʬngg5"^cAKrZZ;Ǎ[wTy)!a 2NӴ,Ek|={jp͗lR۾jnA{NĖ9/ӽD}6ݖwh}:R4nXdXLb>]36Ngs?&Q4ky+t^ZYSL mNozO}jm`dڎ^ôTFhpt,wFVTlʜn*C=47i:5*ۿ;~d? 32Sgj+c;>$,f.|s?9ó/JHFGh\6үo=bYۺD[W+;>uT25&U._[lRE \&R&-q_,-7~& 4P `ʼntgPn0xK!+,hV &DD @C׶U!:Ȉ$Lncp6ͅIkh0lLhͬ0JU{[鳚8ُq9l] FY:pւ6-|r[ƃ4//ne`ON*xx@kކX#z~ݥ 6,I|]v^ן6}[ޛ'-ˮgSp[6wkn_wlj)yעxv¡n)8<#t`sLrcU;IQi FW7. pںga#l*˜GX[O#O'la$Ȳ&[ps_xz(+63mō@8`PoOTh"_.e?ax\%T*кJUbr[%&k֑&?ۯ}W,S@ m( 7O_J8lM;,&9Q>f2>^:.PӉgkS̆zVȮn\HYQ%+AS cpxDRhq-ŇHˎ;/w}!â0}7nQ>Dr9h┮)ENFbޅT+<'oYAȥʍ"V]lt1 zJt)R?eo h FyT{`2k~36< hGowVvwSD;Qj=*Q פMB{p^g>bt7ŔUAE@Ldߕf, TQYUeÁօ [u_EaϸϣTI"sAB MgsW 2v?y}QHw.T=*.G%r+/zw᫨-2Z&d9)➎2@]j'85H$SW2綗]2z:U` S) \8̞M{Hbo@ڻ6F[ߴRC' C͘+HW =2tEQ ک. !Dr-h#LSerksNuD!F7Ѫ)!H^n^ bNR'6V?< fZBrn}Ekn4h´>bhJl}="ɾ`h&V^\f <)pvǞ +K1{-I kTրZ?QI4sj^X1jID6oʸnlL~PtVc|phFRvuj6XWw5|i"]͸9B%g+jvc(H?޼],Դ 2C\{,w`})cZ^9n Bja$Z6wS"8Cl IP.P罨de+vXNQ&GP8{ IąI,l1f;ׁ<|xFvSahn}XJ_)Y kP),>a晒M 會(Z!D;J!w6Iq(ARYaYL&*ȤI ie6˼l|^|=:s;ѪF] r=9Jc;a!7zl!4J(N|z؞aȚUII I\)E֧ 6S'|\oJv\lyiOX<#,"=t?)Q~k\y}FB4}JhbΉ\0 7\5_FpBe-pܹ4 &lExȎd1 (a+VbL߿ T/B{X-o?:;i˿3zZ39m4gQ~=Q:؛AճmZQgd͇̆~mu? 
&cKX 'w?waZo}=3M;p<1&L#""{M;"i2nNuY^x|{vT…2ї:GcrƐٌ^AcNQiFHbgEJhyAr>$2Foy#/ ,CsI+9L).6LCˀ Qe pWO$HBDA5A1{~:J\be1JM!.dI@ԤCp>{T)VZC@Hh0cJ4F1|,}Pq8Ȏ]$JʿdE,L2L8]"#ۊ3;It;0xp@H`!eOU&"B /T+5l>te5qz$1C7XV 1V2z(٫$DP8Ѳ2"8gPBguG>`}Ѻ>z+vHlS`K̈ŽL͋GCx,9(%*I\琦 #Yb5!5:S$p_a#z&ϽZCuFB(J ^S !R4m\+?M%ܧ: `wɧ)R_BkiY̌KI @CX @Fsđ O Fb)pLp 6E)=X/P9x74u<"E{C4^4QzʡM&l ۷[Cq&:k X M3e1 F +c0\LHq˜E &H `&n}a3͢T3.aԴTūR׊>'806fe[>lQ7HB2نz1{NG( 2\hb82u|߹wsŐ.C-@ xgYt]htBRtQ̻"rGLA n;oQԧ[Q1&Td'`t:DIW I <xTBc9.aì úFNގ1Z 5 |׬j Y@'!R%D%rX\7WZN<ͯo +#v=z0j٬'hqw' GܒQ~8t<:%4_4~kC\e>Cjaw\! c+k1-߃qޓkR6μbiMquY7 X!= pQEp l~57ԉ0ԀB1ުAz RҵBQ>^Uw@9cV*1w hۥ:o!u]YKک%B Cԅ ;>z.¼Cox3 ;GKL~ m ɝ[QO_C |ّ= (Ts_+@r3_?~6gb [Vj'Z<, 125,d_|v{Ͼ(,u7篃dugE۴Tm>e6e2`-L`;ǖc0qdF`k܅ju~:O9ޓnhp=n PYII,\U)B!x@;-\{`>ӼbC=TJ`19c^6XT} OcKmP%J,3~JϏ^FA8ꌗ~\ťO30)h#cCH4IدK GY5427D1 frB7gAu*UO- MkuyLXu 9tx]D/e[&mf1!@W"=Fm}Ϋ؞h.4 N@-R3Iq4Q$(^ը ryLHMFՕ7e~p;)eWcvYHt1y|ωw~⊟1$RxZi|@!0K{PzIp<{)M. yN2 y ?ld\9Vv@B\437zO$ Ӓ۽󗅘7VB o\uYl~Ǥt`IQ+F4KRz~IΕPI tm5޸/ap)ִ8&5V:bqU7 e*Xd&rCYg'Vz?SKn8 5 f:)IB8=$RZaDsRr}G6_!XU z Jy>yl+&RgfκeQ"A(0][kowm]?dGax\ wJ[=r/zRq}Wċ^|:mkլӨ>@eY#_ZNjr2qlL2f (QB^l2T<*-) -Bfqcl/iĮgQ;[|?Qe̋0׌3rLܑ#Z2q[d1ӸNsHi&& z[;*:NkoWW/P$$Fw(njT|3,\$s5܃ʧ{`j7U)wEZDC%_1Qm%L*9X^ʈ-c(x2,HF92QC "q,~vVMD]r}dڱ ~G:ӟ B>19i"Hq- ˰bnw8]&)tr1ƘJ4A^gPI`GHd-TCw4Q|cbT8S b ef !PtΛ5[Rй?T)|(tn}h8c$[,FY}hE/z5<@VVHs>3k|UU]J-z|IxvT=/yރ hw۞LUz֩.e!=K0ȝl퓞xoG6fjȖHRjD)#(&{%yI2ke܋bECT&ME*Ie t!<+x#;W- FP!Fce2)u!Y 1Xm,';-^k9jϪ{,je,s&7t5ַLuatGv-fPZ /:5=j3;24ipj%l@}4=>}mS_CcXK}R7\ _x&/2jB>[N*Q9 nqU*$48&P :P83y;~1ӬFSa>_FR>כFn-!BQO[{ Kr!=Z =iR_,'AW{  EMn2mA!d4:I XCѲvraSQzh)1̲@F{x%~ε?p=jh\a>;^$MS`**HK,@adBD^9pQ\MFw}w2Z[d?}άx(ߞ]{Ҿm uTA%;^Ccо9}Nrd {x4ЀzzY-IM9}ض>ۥP-ļlKmfȕN4qX3^k4~ |MF*ڇV]} g-}.PڱVh\wڥn_א+J_VE0Zd?Tkݎmm܌x"QMY ns(qkӸ9C)Hx-n਍baoBDXw3FXW/ z[ƧƗq?SH^l1j%L{FjGB,Yi}Z>V|{˚RҔKp~/>#&]SD$oF~ɟSs@LA>Q ]:I_o &"zH=H54.xSQHVA{˩ 0q4SR`кm4){ԟ`џo*//yxi h4pg䬯xxUI^z \Zݽ*@kh\b\ǂ 2XB ћn1(5NLOLo1ux80u*X xp% P1KPE^䶢_J+m9׼Ag81~)*]"5 }09)+?f %ZDC d86+:$ }l#aaut+[-]7|h`Bo_ bҟ4(SecU,Q^VJ`-VMW*H 
`IJ*4أ*Hpf}5E\UQ>ͻ4o>T+uޕ!ëLf(> ߽[ u\Ҟϟ4Kz@ɍkAS%: xΐ^ZXsÄNƞ!%piRj[#In cmM@:O- $5(ĬޗS^p[ _ \[U ;'CeX5$RUPRnJ*>ƲC1(T0=ZEɩŁaI7CSn,חhwdtde(."jלnۧG ƥCj)([JAq&ga:EZXZ{:\mtyVi ƝwF&9f g![ .]Swjg+u.S*vz.Nǔcx^5(feh9{|3J1II.U#&rP/ nAΒLfcpPkJϖ˯<6fUɤE g;v)> W .u9t갦hN^5( ϥmh#Fm eqj2me$4;K*f]r}dwo@N6i}L9#ի[pKт"yV{/5/ʖ? ,0RM3t)Ə3)d:E6ȫ _b\ڹB sNp@uk)|`&7yk^:W:o9M>sC>#J*C4 *6gFl^>i%,~̗eǩfiZg0"S^&qa9NF={|U1sz>ad.3 $!bwkw #DN#xi lyʕFʸIj&1XyI2ke΋bET&ME*Tex4={#cw&@-^]-C*Fk( M.HHurid@f| X) +tjϪ{Y,j$y0s54.a!6C`dɪbf @ įaiT%/ѧhgk+K6O&؎^,`U0hUՌG:VFH{ng=a@^L  <<)l4]n/>LoL8OuBmW}m1 'Wb$P[ife^ۺh`Zukn#1'}Mװcw!8k"4JD`(DY TB54\FPz,xUmy<& }h'g*rqK6p%6ՔgBwHquA@pܵЄMՓÞzvaCf-;\}< kce`g>k͛om}1Z@F%~OTsd5)4Qu3O3A(Qk;lտ)M(+"xhB8RqX LLB'o_~!EaG ^4#OD ]OsU':ϟiy* Q\<_'sqw+1cfF"7/82ma3NpطM,Yfp)FZY>Gv[4툎:%:8hN,o-ܗ\h!ܔJuNR>Tvx.s熗3$Rlv5rŦא{j0j.Fw5h glNd sJ[sÄN@4.xV. (pzvz8R2{[OK _C  ܖz4I$ P{Y?'BJάHpθ- )X!aEb,;X"(W䜳nC[ /ZC@m1ʷ=`+ y׵M'sC0 ZONL% (EPt* ϡR sNVk,S]G<܍?x2E空: CN1\k*yΪ/S h4]SgSOu?{WH1bh   tâ̤oK%\-9%eLg| d3HF01j1IGYY5\TrS-]yGv/)c܇3t}qs4os_᯾>< ^}a8b{sUIFV rp*iYV^GdewI}D`JJrዐIt7EC&ճߏ%%CNa=#znp뼷r+ޟ-aƉI},Ր@3R9xꄰ$Iv66(0Wk9^sx9_BQc Kqtuz-Nʛ P7d䕰C߫|:T1L… Y6%–,Tl76ȀT,Q;)$V&]d0z: NeGyeT{-3ۛBS? SIMPBkwWh hSb@?ωY$: \ ;sC`2Fj^hMꅱ%Un;k~Iqf8C}Tc{홞)5Ŗ5%%.-=~? W Rޤ,(EP} G0FWhSTs>xug8NjѧйōzxB BdC <-"CN}ԏ$2 !Df0ь)3֛Kx%IzvPq6rdLqX\3nӮr|GgJ\_0IJs%-pkq: m\Xd xf8.qMR! 8.-Kdx3wH)η zkݡ 39i0GRV'Ƚ0SGqoZήFL#LS:OK!A>#RgR ^"y 庰W$VEpςB? 
F䩘E~i[X/AP)4[*mLRhP:o0ecۘ+k0 s!2fƅxo Y?NBgb-.3T3NW2">OȌ:e逢S0vuNTJ.zZʙk׭c >Aϼr,L+ DRiTK/[m 5(2;}j>][Ѓ?X5-j6Fn i@^zJhH&5׶hJT er1> Ĭ2*Qffɨ;G,C{ֶ6mH&ޝTx@QE}d4!Q|}#d 6{ [tgj6}ԥzZ_p,S8:ŕN*Jmex?[j#Iq\@U$.sd"s)\c6My, u$^@?`\7 sK "hKE-diTI&Z+vH*3mjJ3}K_3 R2􉐧.M3m3&.zQk +&jyZ-8{tFէ֪ƕkIAmqJ9j~,2\HAff_V6yqYr#ԘP6Z n=&HZoƭff_x.T(pPkp.e<ξp~\f_F?Uqy/~yV+} >4cۤg{YjO--mO'=NW^*'[6\Z{P,[ ?^9i'c 7uwλ*]f,#zzdv?īKg8so㦵ɢ_r$0ۊƭpQV'ZYp@D{J ṶW6cXRJJuusWۘ@k3{GQ-dPcK4iA}ɢVr 57VA2P>"d3EW̅iy@jC R}!O~סk}|֡H?D$zA+G0*L"^Ǯ "Te#A }e@i;|,Y} > fr6|+RRY KqKU)27-&#:MA|S] b 6:^Z#5q(TcOl9TtJ@q%*$]v+*|ssYr#x )P,XKN.8(;$c%N\U ۿʢ8^R&A^l|g5@ Ɏw~˯{Āp4 $8UCmx8gM'.3eFV.խE%x>.8׿.b:Xq~»Ob5O~>_.g'\.q)YI);6b|4Q..aTKW;g ^ͤϣ&Z^t}V4~~?x="_["H,gFd)]ACK,iҪ쒑rwT}vQi tC 9 d'c=T3xۏḒ99 %R#.hAL Z{B&AO,(ol xìߟ@Q'S,FgTMTFWδmy,:<9!gbuT1HM_2NaT{^L@6߮h0^_a@=k֤lzxSύ;b~๽*kU: @$U%qDXYyf ґzZhCӏ-Te]@ȵ= s7y.o 7pz|Kpp3ذ{Jđؓa\(>C:YXcWՔw^vHeGU%Rbi<3PIZh|-j.Ʋ=аΙBP3zFe<#g:JRuq`dErh3is8@ >ؠg֐5Rz ho՗@mGYS![jEJ!XCs5CZXc$Y0ǀI;)7+sۄI(ESeܣ c_gѠ$`&|0ig>S?:szX,ᚢ! A>M~$`!ySO_Lw(>GnA .ʹLU%a3Te&~Px_]o9W -pAR< e}ƎıI6ߏjjwSǸf'N;I?RI*.{Z+]1yLg.eb҉"dݖk/q-%]0(~m~Ϸ >\m:C 6f_9)V Be# LU57q:&mE?wԲ{/%vCw[ǐ6e8V?q*(c\Xtx3zJJ>"=LI2Wȷ2݁jৼ䔗\gߜ$?6 a-*E̊Up"xYmZ3Ɔ!ZO8;Թ;!$6_xid"KS" YiŦsYи.efUeGzZ n"*+A> {g}Sf"4#Ys%u?fJ&BPUm*cJR`1jUE*6и% ԣV(0"/˙=mk)~@}='[$\^aikc'*+J\ׂR)׺SmtT}b}7QԃZm"v-<*t>Zkn~ PiT2KrZ_SIIjσ6NF-SAT9o {s}frŐG.r˪gmsrPG 5Ahtnj:}%pbևҕY9H SFEF!CD>O"@,E`'+*n"z]cqI05'bv[ևlس !5z~)jVqjdClE_WDLDP*|I$e4lx4"N ֺ*tНч":@VFQGJ@4jSf zpc;}wl7xm{aCCźG(&];*rXJ$^-,ѐV\m/zg=t]3E9]G›/"wۗ"JS5V86S5rz1W$ o$qr7$O?,Wibe~? G@[8zNql~O֥ " :>CG.@K#i atE/R+mL+r0V$gx;J ~pso]l=YOwGu ̏EcmQPs2?EڔF q5ZXsj9Vv s[ׇ92yZ@ecn>5z驶[`AV@ H :d pVg&-h,Eb*d;M,EsT>/}c;ڻ 4I=O oQ˕ ^btR*k%/hϏ$M>{Bk:[у4 >4H 蓦믶UVhϷ"#gU9fI]./ة]̵ۓBz7וX%6w BsJu jY-b4/g.!WtjYH%FSBa42υWcQQg5 W EpHi&➼=ykvc͌>d`hc̩B.@hTfK8By `K1@abz~{c>/{ k^_/?^VzgCk-WN "T2F? 
moW?)|%G'B] UTc5F~Gob@SŃUYl53Ks0[h³Kp]A8(ʗ٣t^Z%VUB$*((QZX)@u A,,B~CxidwhHg'>^hԏVX bt wr0wuIh})lo$s-Z Mz abXc߬󸴾-i&OU&>Vgq%GoX&lYEuSe&l?|߾Qb;JO)ewQ~><#vJFn1;;`Q(M3 ]!C)t]vDҒv8PMo^uog@)Ah6&+⑜i&r 6d!ޞ}BCpK:cJ8x#!6W~:9 F-;4K)jtV)gT_nt_Amxd4A;4Sؽy/U'R gi)[6sm Yk̜EMAz'hҡ;@;hEԬrv{o=c00y :eZ0KOʽBmͷ"^y'e{{{{wXI+@)אB*pUPcuF~T[[ol[Wa͍^.Jo0 m_Z2i%}Kp(&_;~, mO:N "sŐ5NivJ oJO;<%X/܀"7,Fck`W!ߪvtOl6}F >-Feah3a;f1WTf[ ZyՄQ'־'h]sR^cGjkm^kr1} 3Nuڱ8S[6sb%5sFZH#&hƾDoP2QڱaWmHic&h7A{L1[ G܋U!`:I6Tj>&kk1:Y,%Mpɶ\J& @Z[d%چvle^W}V!ZՇhU[чC$x[f)jw"9(;tFcmm2+jQ2b2QLSQ-ņj);%kљ.5gY3A݆AoF?e.s؆u @y Q*oj4> n|"dZ,dZ;Wr8墌euNRGt9L|f6wTt5DJ!!U۔ 0nAoHƐl hkva'SYC&=Rb,UfsЖ[?Ai]l Tc=SzNSi@ zp)}(Kr*TyZk!bk&`+L.ca]ՒySߣ4] 0*֣Gǿ_Z *QR l\U= zQט -ձީپ;9whP%8qwX\ur)k"$JL2}Z]eC(CB"X &8VZ SvѲV2&);)R̕ۋV3ED-.Sv|}Ӛ ^,ߛnuŻw{йl!b&m*B+E"@6"*@E`X$T_W#x?!W,oYkAS&vo\c'wy@ejcRh !JbA%UUjP@Xc] NTNmݲۂi3d](]hٚ!xPufܫSk+X;%be0fE*E"3V9iJOhY%R(W=&k0ٖ7`ioL8Xe;~pmt'ul RUbxk#s ,X3ʚɚJ,Fr^oLQ7.c2U8W-JKm.k<Tl^(ۤ7kK0%؋<3U@(yUkv!ؖ9+YV¦V: !zŐȦlReEMkdX`+0 h}]3ͦ;]b9ܹ+pp0v%d&gNOfe1nVw;/+0)mS3u\=p cfդKlIk sҩv-ðj9F2C .ImYR/tG-ሸ]fK5?AtLၟX9\!iZ5s@dV^O[w=!X^ׄ@ӜWYةd\hG=M\VO 9dyWu)KZM!m̛^؛sBvۘEh_C^1m:Q4cBv>e;6fn(ڝȔV1Jc-Wd!o8GOc"Z}΂ٻ8$+u^"2" .OB^eY$/M.2[4-7@Q$**2ȸnEZt= &|zc@jʉ3AUdvX$Rv9b#t nU!WR$۩؟\zˉ4UN|v+yTnIӥ0/~:i&6L 2K 36Y0oކZj>/38:"^#B_^舸ʣJp7<@!rpc:z"v7BKB q7P)ʹ;cy9`בGo]LGwsgGAKl֓?-'J>[.Ry7RϛںWgsmq 9w]~O :8wj'=箙 A ;W!D_Qg6_Xk3'?~ zvGVk#ɍDhgknd$Jjӹ~8V}^~i3HU*yDÌ%Y*bEi׎nOy%ĩy1)mAGufxSyrcau^]?z3}eHZSMoa9bMDS3ޮ-tP#Fu^?]NGV?Gl6BI T..BemA-]sm nfk=حQ̅j@Sb~Tu{ &˙9fqƿ/@J{mxMߜj+C9#\VK#Nί泩ez]|:%~ް"沜8D[.C>V s}o}0~P&:t6 4C!jSgl2*u~~ \&p@U֎c1G'i\\\pkjt]1ZQtYatT# 詼^r l9{gͅsX6^\OV o` LS^2F]>TZE,jtՊaW Q+-hzNSY-g}Iz,Q5$s$s]'™J.7)~OM2k +аVH sG^??^dKjQRÌf0%5V&PLI<Ё}c;~%+n 'T6Œ𩚔lbR&*n(^ag.=Xή%r [qsK]}>gtWfnF8G_#nxNOgxW!ؐ[N `r(b\'LAc.+yN£r2nU.\Q:)XJ9M%җx{d3 |lO-(]:kS`yV׫ Zun6&QR*Ua&TA&zX0%[ [ueFv0:?#b{o肑f,f83bdUyWUk%)mSsmڂsMR1 *Zj4K5^":^Ҽ>,C1h޾@CNIsӱ)&cF5^tZu's UGAIҶy$Q+.bѳd{#%+QٞFa 後5WZє[$juTd}"7BH0j bH 5֢BVKo\9Ghyw[BǶ%z%DIkQ}$ OHr5  
_RǞدNzvwMP(>a|0ǜXQn$V32H13ދDAZ[5@('(fm`~bRRU$;@1 @mmkn\rfs:S+r-s#aF9b0%cEҫ={"ls7v#ldan'I>i.i7G ob%@wgmvL\Rm;nũj;]A8mRE#)dݍ@CIBXQt8i N^VDС8nWR:X5Dr= ϕ61z+tcv=+ ZG[n'BFRS0Ijwa& ǫ;gFg\k+v'$Z%sLr;9pm0re\p*r껷 =s\θ,g\v7VX.'.P>m~YdP!+RDqGvv|(*5qEЯyn{p&( MVaԜVmL;+P}i\rUBP\(u`fJg@ApY* hmFm3lۓJ\g#Yb!6ag#wݕڌft6+YJ=IYgq%R%fʺ \ fRVMu|17{RNEm@r߽F#rޚ^ fZZRZv8j(zkV3 u ңW[A?W$p9,38a'C߽_sF[w9~ѧAwu+J>n2jFE䶟u+4Ny?HJ|7"j㯗~|?ۥ7.B+{wwo.޽,^l.:/_ _/,\yXz6&UkzȫZc|y[9 UzRc A WABnZsȣgvCUX."~y9/owflk=Rȳ/!IZu3P I=!R&YugHӀ_߮Ȭͺޒ~c q$2 7_qbVYCF&r}q}y2thP鎏xo%.v*.bWlݚ] 8Qi^TolTI+k,SEqУ^TP1b,4AM(ɏ1(m,R"Nfps 3̘qeƌ+3f\W3Bm ' _YubO^2vJz:bf2ĔV3<]w7eD [3rEQ(TBW*̦?p;*$S'!ž*Z %o<9diWdMjX1Fkئj`c- `"v d~+8.$Mkma}^+kwrEs +,"{9D !Ɲcn($IQ6$M92zl5_`QSW%dSjW"1)Ћ/ު\ihR԰dm/Ͳ_&"sƨgZ zCa骡>5eMM׋v ߓ؎_ߜ[JjJ#3qx5 '۠wLnz$!{ښ'ǘ5b%u;Ƭ$$T[g.['$v=KWT*"EDLN"gTz}Sv|:+wpV}fd']~V ;HrvĿi þhH,tV]y:ZJ!wj'K:wVHrV9PjV)R]=UMGrSuY'?Nbo8T=OA_?\n$f{Y X^~=;pȕ)XZͨ%"tkD ʎ%2~- _6Yπ_>zfj),8F_7}>x^{ghmvʼaD*Zx0 [F!NS\>1DܮW8bh@/h@0-Q 3* 0ư@oN1V<|բ9,t:Ҿj;ՠՏ}m9zTܖ 6X J$;Ck#`{PA*'YnZ3L X`idym*B-4 `Axx*nu"*kkHs?Ρs%a H*&k2e7Cӻ>zaFo9T2r5s@LN DޱM8 T@iX&!nkJh]j5ޢqf/~ < 9;]$z+##8e_9fM|;z|0ط愴BR4['yAp4\]nKmC 7_߯W\_.\.4#\E.x IBix)f_og*c8ϋoujcy{}7s 4;NԞ9#'c˗Y\_7B*#$ 5 $*j9nWns`'sxHjWkl іz˹RiBq<bT(Iu'ጐ )'fkaj,qhYV:@*gԕDe1P p"K]¬VJulK S7WsAØz  µ&6xޗY,|Yd2 DVˮS|hfVs/(Cya_ "&Zt@tڪ-.-^3HTCD`a Lntw5ߕWBQ)|w LoT&.G$6C0EyW+ Ma'R-D"%榨YvjmSTgB ^JN%])k!! 
`Cegz34}*oIٞϕ.Sͧ7X7'sw|& q4bove>S&9VvYA$ҳ/:AIkP,A'X 42r`Qs8oB$"͂1hMfE3e{ζfTZ]颾N6px|!<&+(m &mk]0v^1ӧ_־^ x!,!K鹘mFSD#Yjц\bguL.5XZ u*~Lw~#蔇A譏ԝyNŨ^iq$э&\?Yʟ%7"Iϐ r/9])^,ؖ]W"_lѺGW-^y_-nHWizdá7.U@PHxdW[%'*Y mb:z˨ ܥpZIfIV3~paIo'A9YBT7FYd[#롢ǭ$w('+OCW>tR&#GoQY9j٠rOA\m Pw9$R(FQ D@ΤW`3}eɊu5#F=+22)]?ihK> yR\`XXij/#"٣\<\ܜ/ ejIV,Ps $2/:msZ -(cPjkGlVTܛO ⨅YZ+QQ>G=XtϹYt|AtWE>$I00⃞-> ׭{uz-Qp'֭In_3(v "|n)W@PniJ{TM۾|QXù,ڱRL_K*c!:X :m%(z 4)#ARY\%BS*Sp?ԙPn)^RʪTs!J,Ro,-i<-3:ح{iމa"˩%-s1t`uNl}2.TbUVh<*)8LG`jMMJ0#W<=)Vy/`Q؞$,dI-fحV7"*VepD%zkH Zc0MPS {ofzuefǾu繒DH!8 9r6>XTAW?k24l71AυZӺ9dq$l9 &&KJ,9؊1cZ#CܪN3yuOGo/;/S]JQ?~_0STpf'U{rNC!d}ш"VN*Wɷu!SlDvAB3LmcHmU?шaMgSD7{'튈7M =_zBDtCGЩp>,Sq2Te@1VWTbU )-nqј([c9$+s)y&/i;'$/F |fHbţvZRˋNsgniF J7R|,MR1tʘp$u܃ =@k{X 'WQ~hC_16uVN)B΁Ss,@фU9ɬ1Zl[l69kc]\!(`eA=[I@3U'ל <wDHZ\ Z[=n{A"n7rYamo(lo Mayܰ f$"ma邕lG~ZRk z[|n-{: _qUҬCJ00Bo zu—ʥaLaA*~6-vVNm[E/Upj),:ˬQԣl0\"d6hξU҂S}Gq>9CPq?۟ޢ]gӟe+skhM7)4$pNh %|S7k9v[5ʜ8rub*Vv)+Zsaw+ u'*]>wFV5*5**RY|Hjm9ڲ/h%8u5"V2&D-[q3dxjB<&7LMۈ͊T {;@D5SbjB >:( BGT+Tc*"c(o$۽4 @n\IaNUI5 "ǘ|EAɖWzkjjR)Bb9r00 ]х\7s)_" DtwF s!X*OR /Rƺs@,]^=x4˫ݢXte/as gيy6e}G,kb/$|OO'lETH]FTטȮ7kiuEYQ?v dZ2-/ΆqiYC S5հueQ>LհKD*$% υaϢDN}Ht+%!Q7-<'ID؀ N@tsx%nĈp57@)m%|OфAh x2U_ ͮ6kc!c!"Do$|lX;;\>qKc3HE( zClx8 .7q4'V92)ii6.'e^K/ףwWgY@wK^-ศNpEL\BŻl.;x<}9&`E{x%`"|1v`ߘR}F@2> @* U%Nca&;@6e'cr*<⏨1!Xg7HUO՟]0&X'xt+/Uwт { n󕆇wSJL]Q-]]LO2)&^_l{P;?PQm-F *8zf ^K&_UdYm=3V!_Q3DE>(S9hI3CNʳ boD~ ߺН53B ?q.jDv/ ^.'`S41aϑ:mUk^^]l2Uda`ʊ5ۦr\8`L0=NV#&=5k;FɅX{ނ$9`B[ ؼ/KePѝT[[^ﺻԽT*υ o>MI@7ϓ/ϳ:YM @7[ y6B00"jI.ߓ:-&yzy{`k[v.y>6Hi`$az-ƾe`w''ј0bTȄQok R` /kւC&:aևVk`bǟ~,~8B^Q;:h)S>o;?ͭ/fK]sʒ9%}vEϯzOQTP=~%^Kj%vVϏC=,C,>많d>ғewVAb?I}7?7|PnoјW贞+*76Ͽ?~yp@f4UOڟ?]w?^ല.w<:}WԮNT b $EB-2:CSU^)ɽ ,^ ^PݟZfsU̳ 5مӻT$>Z6Sъx.Bs<~|$~y--Y O^Rʁ3"7z*8O5{+xG_$ܸ=HH}nbǬ|q*=˜N#MqE`#c9]څ06VOS˥اƩгPSUgnTEX+}\ ?emr8vA xGcy6ZB*58Tm6;j뫮lɚ9МQ'c06=UߢrtoyHmdT:֠tKGcy#]V馸Z-e0 h{j?rNhw<uSnDQ!SR/K*2vjq-vga`2E+|ڟZ\σﴶlxb/Jcגgnr:=^H+i֡ZU 1*.kҾl$s@JygJ O꒑Ęt!"JdYQ u)IFh0%4d;#`%Oo.I^S#tceRW8ѱq^4 \"ŦDoTMWU 6N6dDkpH|!PAC55KMh$[rtpSE^B ,m_&[Y 
%HJgMqa%װj c4S2+BR4&ZGJP6FynRCBGsQ'L  8Vbd^AYbbl9i,7|VK7,ɞ41/ rؑs|X>[RjmYjt}ͳJ`=f`bͶ^F4ՀQxs*GlGh s}uBig89*Ȣ * QmIDyҸr9֥B}p"bܜ->)cGdmԢՂ9mՙ+xn KVֹk;( %x[kI{ Ό2+$iyUQ: 5]g2O^kKɾioؿI~5žW'C +, ,"lx#&߰N"Ʃěi <6xl7Zo+y:2qoHE^~jYJY{~Im87vPn7wש 7ux2A\7s)_^fXh/$K5s M4K r%;|H9KIP~+Z5Tَ٦a܌IQgVzcqH zٛ_zoE:Rc=ioGNu`+`|XB2cJԊ;)I6&(^>wt3ɜse\i?Uޭ67wS<|f/E%F\v]a2~юuG|vv@X~{*4}l'Lմd9]o*C3U[71b)}R29eSDci=ѹDN[5˧O6g)7v`jg^]$b9Xqfu冦NdyQ;e05.pJN9WI3тf$44.W@ ؆s.Tf r80%"hEJ!g%E*^FA)i M^Ɔ#CAr჌+ b:ZEVX" K!XR0N#.1Cɒ\\ĽE\FFP~m` b,w ]Ry"m\;b,AH :f2`Lf-<HMZ"FGc4  !q@ e^ jVatA(-]"vPBOԋp`zlu/7!o`FJ0(*Cb>2!5c]AaP"^gXcD |F[lmFjIbN8%Qj +˞s.rںh Fs ^?rdHI bSiY,R`T`GldpP3TQk"L= ISD4wƍ`k:A>?|{GH+&!PDʍn&w/fQ w<@TxlcV ,f,Y J@%"C#{Jj[SN_R{/A;(S!-8MDGsdVz'unLmo:KuwKv,t7$I;m]f4&+S E2 +2z$➭(ux,e<*뉇2HE1:F=IdD'n'[Ւޮd]Be~ _7n5V%sQ y>(+czeL`Pܬ&JZ}l xI WUI7Z= jdB^OH#Z}Ul `>W.Rh${: riDBp (_KxN r8G ";ƒ :lݕu%D)#Xϻg7gJW)ӌR`!yAW'bS\UcS;n=Avf9/֙E_mlKM}T@'ꈺ@? @1~TΩ)n4GxPΗ@URS 0D)p5|o݀ecTu]jjFr^ :t~ny!kh4W~ޯ?}~~d/ݎzwPyB)Hg_U/EzYi氟-X7yrۯ:z N(]9_F#lǁUAR?`}pyu S"!E]cϭ!02ո(\EI]V j ;Br^c )՝RZj^!L) n|ix4|Ԧ2K35#Q5Xtn P,)[&y(hЌw t@79ъºw:c5]JO⤿:8GN}b+̈hC] %K83RbA\*]드^І'c1! İΎYdDMYaδʞΊy9V&܍0کǩn/* `ǜ(xԂ ՞@u9#hh;@|>8IJ85ZsM;u7;7t1׫pβRlk_`(?;R**C[qnqݸQjFt 1ƍn;*YY-݆/;I vƍp<țPcVzRr;rf|f@]hSZHD"of;YD^Y)=2?J r\i!yZEuZv0锦8F$`2tL:LbI*kmVes;wШONKiT:}QqP?Y 2&; L//^0N|N2A$sH[c+KB[9  r.L{9@%y4-S YrCljzu`m;ꗀ_rH|4rM,RCt} .5`W>AV(T3#l-wiW? 
[binary data removed: gzip-compressed log file `kubelet.log.gz` from the `zuul-output` tar archive — contents are not recoverable as text]
xM0=yK%( B*lx˂h F"ݬwy##i *D *1AlFUXE< F{{Q2-dPn, 46G (o5Luf:{wڛb].{}}>~8Wqp c̠DV3ːI =eNpK=2e*Nmg8f&AGW\ Թ G 79Q$FPxJdiVΕ|sJ/޾+u*)G4U) '҆G $x͡%mivL] W ..$hwPhEUEMpmh-*/ iwa#^ՁC(Dd!1lrNV ,U1%Z"zwСeOaHuWiVonSՕ6+p0#uv e µQV^a%3vv11/ %.S0p0(* ETSܚV_61H >4O`ABhଶxpr@LDh:My?HK;\f&z<#n. _C,ҜY`h8=MT8ֶA4z5]G7r~{~%A+w+EGxմ#yàuqUŹ6du5tă6BJn`sͬh%Tfesa[fAicFEp}iWg2'TLcܵK&XOTFh%\yXi.֨Ư}LHv2h 9 urpؘ,I1qk62ެvK<֢;Le@(Y`|ZS斅UJ6*) } YX.[UC<7WPІ,d?Q "R-%$b^ Ҁ30 )-6orI|x#/+mʍP Aiōw+B+KUMͦ7j3N&>`2{,UkhR61_MowU !xxЌx)?xW|׿}~'{Gzo+s/j{;ow l TgfE\u|SYC/7%ƚw/!d%~X./]l~p a>ݬCn,_j&)/퇛ǰquU̷K3!Q%!]s*TC):>5S;גUi>q){/7!pjuD/BZ)yÓp S8on6P&w 8ch,k|i{WWw?4XpDNW˻r4t"/_ ,)Vpć>Mopy=^P4}Մ$6 VۘnI!?ٛfM<pPC ]Z_J V/U~37Q_$)Q][sȕ+,d'P}75Uk&N)h؊)QCR3[4@!A@ċ }ƹd(r%G`s{z>7&:¹߮Tk4q[ͭ**yV旣|?wh]:-._LG/r~>3E>^4ho8A܄ddcP'`3/ e%!̲}1hkǏ\%\Jm_o$q# =}n 9{&\~ﯮ re Oم;6byΦ`GDs.4!їN_kP$^MpIlyspm:Mߞ;:3J,|pGVxtgẇ*O<}fI?PLqd+/h<|_rh~^DʢM/./ֈQlʙ>EjؗZk^B6k|뚁xlfu RĮ~ Z/Ĉ*_}w'! A4=bjmCNXBLj~!ݾ_,b,^%[b6E:@=ׇl/?򾗭lzI0ĊEL&NL♒<ڳwHxq ͜>۹[A`am33(LO:SQড়v0;w=p!Wp1F!xƳ׻~M@roMl]") >xN>2ƣO7w@Obܤy  q,t]+-<zYw#ևMKOw=//6p!4NJ%bkmq:@L>ptQ( +da~ZQթA} az?&ke+P` 7]}?#“`1i"ؖ8.hq,Tab;8CNgݹY~pVצ*u9VG*b΢81IDIp a"r$˭]g {Z)1C-j@[khAI ֒Un@SZiF8eO o5t?# 4߰"[?%Fח~n.(nB#|6pԛ 2fΡ }69&ZV@&J!ݱ(ָr<ɯoM:N1 ^?~ooqTH9Th|IbyCyt:ނweY2s?Yf"~s+FМ|3u`7j%>p׷o8rt_,s7!U-j*VZQPƞrs wMxԿtM7>RQVJm#GȔ7Bܮ/:fQL@k]Ki#w׍ 2B7rm#|ZFx`ҢڢLGIOe-c1xD8J0,6őK5fibeў=.F6z:G18ƙSg7+F LԪH-J\!i"QE"7 |lbU:1>k=1`tRv~r)ř={QMac9ǃR,WEwwp6ҚmXل!؇ p溳 uf pFXD{렧Ѭk HT24QhԘtJAfNsK%5 ;QfM"1xaxl>-D17*>7;3R)YwoyxqW+p%O|3NgoF)ML윤 Iݑ0&9^F=M0Ƀ$5fT:&<_n0JPO+Oҕ /)IO d2N RB:vp*9}sq7h㼈?On}6J|a3bo"^xN;\n",14"uADk8! 
!Y3)9Ub)U\d'R$I?ޠL\ZFoFu#3 w|-m7Ikoc,ԔX@&mTĀ8mfhTV6h1=2i aƠ3>9VAKN$S5X(Q2*QT%HgqHVq͝'t7ÙY=b;8il܇1CcRc6v2#_5%4GVHIBԌ9&TP6{s1VbI rlє[^q;2,Je}iVD[NĒUGc-˚5T$|= 0V]8 pXv.#@qҝysEN}'tNG*O~'tNrK~5)0Vo WZL7LM<'hT[?ȦO.w [nbi"lXm($Mӄѝ7;u-r_rSV0X5/73?~l/SpDc <>=ƌtkǑ3Eʾ6,WSdJKwړz(ԕ+TCmJyFjP8Qy9B~vjKZܜw7hƇg~3 LTC^B{!jU_؅锩Cr yO/#}eT}$q( {y{XMDV!0  /ܸ Cm7ުT ރq">f7~ D6IP~G7=Id=6ל5=k BW0ָay x!D}0 ih\VG>m}m5zOK{sdjTЁ~~ڼkWwj]mu6Ϥ g62Pi((L' $QXT ϯFXh]h4. ~3$>)k0¼v 03D,R-\S2S Bl G1}6&K0cLR*-ݨ/lˤlCK&aS`5Q3V:$K"yPvDz@-v4D$G*tg=**P&XuU"U]B"2%Є0{xjΎ6F VQ4bĠH"pbNR < -,8P_9npD/F0~ Tګ1XHBTKb nTnx8eKՇ@ށ fz ?iow7y\vlzcêw}b oňÉmϐ բQVD.՟IV:؜קJ+"uK7ORW$dݫ]k"Zt`ojhWDu9K9 Hr"]jճz%eSZ>~UP}# wīH!E_Hucę)fݝkv4¨ 8^q4E0DG_s)8} ^K#R(8B1,\® !nu%'4h3iUK&KVE[eESS}2oWDO5`_5:p>=4|[D[8!4-]q/bؓ6%ab^Nצ66IF&x)'֠mVPj'ɲ}N}If!v3D Q!iN 3mJV;F101_9tX0:Uu7p9t3'~Gb 1Xe:{2 EB%ŋ쇀WLkl=kTFYト(tꢅWwT:5̠S"ϱn:Us01Md 6A5~CS4\Sܸ=J/B _L6Eɡ+Q,h=IUe/^/\d8j%!v=ͧ ۘZ(z˸p$%ӉmS" T,Y> ZhNO*I= DkN8} '{1Xػ'hu̦kxM~|ٽG$q{Q O$ŽIlu Cyo|=z9jj59cjO:W&)^jBuV_H$hJNHRwA^$xqƈwkR>`V2~ Pg7rKv<8:$wv#8 oqVݫ8X?LjHى<\?ѭ34z0Xwn-&TPa5U ١0Xس-tʃ D`ĊuJ =?CrA .Cօ+0mv[]UTSQfYb$XaejRI8S*1gݝhVWB$~LnBNa] |wVAPWE=:"P=E~cS!]IĔT=HaUK (|Ḧb]peκP뮄_#3 mdJ =vsF4weJI%Kx2Jɏ'[V_4Z;(J/Fv6{{17LS;z;pnnڹog6No>4ƹm[+ϝDdϽ:EpA-z) tTt 1OCqdLSD-J̍5wٹkF흥Y\Dx[w8~ՅJ {Eoݷ Wo_ο_ئMKrRauf:R^ |. 
;y筅m?23g0 eF@L:bcMmhr}bXC;$10xv&) t| | 1H =iη G*߮( #=(@<-XtB$20MK@,=w39@i[|}/]RY0/]Ǵ)-w(PXmێ l '<3D '[.Hx?-bh44W&nhwk"7ӹގm2*L.<OrCS#@:hAY,ըZi_$0=zX6E9%PVG/DaTT#=LV=}ϣASyA;;eN꜋ӋY^[~?O+ 6 $Dxՙl?~WJ8Q66vC~:qgғ;?<<_,eɿ,A7ϭ'n?# Em`ATUk{?9THt2>`>:qq ΋?6[5$es"چ<.T> c̭_Dd*"A5ײq_2E'ԣvPUuO˛rIzn(zll>z8gęS] xF:`ߑYi*ZK\ tB<=r ^sUO9 & :d]&R '2E(n57%84g%E6c8臢kG9Iz+Iq( W޽v]o ̃x;3zPaڋ u{ \(B/=0ŽAoWF Psh{prɠD` N_ 5yٶa34MY&GiZ4O&8Pf~et{g^fpf3Ns^  1pQM>./co3߾k| ˁS]!F9|+8rRޏTU1'%Tjz9h7tCu麂y}\8>qtjVHľJ 4BCp;r~r;:xxAc{;dsSp.>]G+H| O;cPЗzꐓ-='w5Bg>~Uc/?iy⦅,|tð{|wkQ5,)jW@M*m.4 [HWUS Rh%]U!,їEӖv݇ϥIp-SA{)\\`fwۡ0UKaKoc6a}N_o8 95 ,1^@=bzܲ(7nc:"U]x]3jZiOhÅBN{xA,\Yۿa-H2sW<ݒxstavO.3VTyQYn gVm~ɼ Mț`9 }7Dc:B؎E$7ZZ0J1aG lJ(m^ȱX%}"1(p+ӽx< K#]7pxBu~_ZBzXrc#$rŚYu:f\?ϜP4++PaCk ܿ8[i~lpp‚jtm4+muqm7m_Efra0Auf&V0VaʹSǬx'=Gl2քe:.t睃JhJ|CO# zx4>U٫?vnlmʁJ%G^=8)fuui-2l/H.q(2mMw=@gXFKHfޝZ\XnӺh^rwG}}v6=VharUfVq= 9V]eؑٝ#d;EKX X`"} a!0vts2-R #*b:,=ӱ+t<"SJy.cn b&ҳ0-1j]{Oe.vM4*g F[/7A a p['_Y7|PcLكض:s#97yg"L1 mۛ&o37rtO;sss'65 kZZlSqg B6ڐpTWocx/ Ug,ʺ%n՝و(@HmslGmBsiu`q1}FqeQb= ?ܤcw!]5ԩYq,=)$q*Ee)~z.ժm[RڎC$8tobi"H0i$!%# vr;BriR`p0vܡ+A62bolݯw,J^z>t.֗ EܓVKfV6Ey [=܇{EJլ_ %Q:VY֧Z- 0A@4Hs^i~"E&F)[#ac↞=is` e&E!E! |"QjRKs,9ck.Xd_a4xEܗ}@vlwI$Ԁrc$$..嶵=VDq[Ľ{i;0wSjj8>L{OTKE(rHugF{b <@3ջQ7Z|:L%q2f˶ nptYi 249J}y2I%2 PǥziU9%BeYKP5ݟy&h48B@G`\b+nW2TLw00lN?H 6M6WAv+p76 tY4xyg:|hXsUJr >_7a VFzJ'd E8踦U>c5^M^XRd6^aQ+Gđv oO|x^{<"p8A/«ʀD5Nr g?¢qqV|f0ۃ%`6, e`!CС/kI{he'/G [ӝ+ޘ+6 E61S r%XyDr&z00}&=(|^L[M;EެlzNKR4.ERRx8CQaV]ϡ}@tHCF1ڌåԪlbz.UVVVFNR¥,G KѸ+v)0pXj_+A9 kWCygC yj߇$ EFLLÈi+6ί7~EW4~!wGrK20Fl_ի`W'i8AU4$ g]& X+=5OUW=]]MFh ~k+j"7"5&: -Y,׉m!~ 5 >}0濿Wl2q݃FcVx/_̭6BI9#/mN&^8:_࿼ 9+ٔsQ093E t8(od;2 :$ۺ=ںe[VﴰqP"-!z8NL%ϳ?[{fz==oIl˖̪ < ga@$4F^XՅJl!Ia4]S0yæbte! 
*e6Z,idC4ECݳE9*(!*k?N>ef1 3}u|un=t9%/l~t&6ìv[* ;nC!11?EW9S4@(&tI<$ #^0H40,,ƭP_sbDǨ-.9YN;@O6@~z;rIK>}hfc g?CBi K-P}|,D wOR<%̉Q|⼿<v NNg/~_L1o?JCkƴZ~l1j]̾{˫/׿L}|dѕ4 TR;%$4ͤ߫|.=c艬0acֺ?}-cGer>ݦw:|'b^Nt;85#vr^^7rbr(֏n ig@L"ĄT4x]ʂB=3:eyȢtT`S^tlnmW{߭|:ˏ$ƿ3 Qsݣt()tAse3XrtK$B1BԄd@,)|(9'iT.EiK6+ 6)s1: ˘FL1Tn:{`iҹ42Bix')YchU , Ÿ"`Қf7Bz tABF<Nvj7FSi/6D Ac\9+>-Ub(xPQ%J ϢHMIJ\ɠN*iURanbQWm[L)&H%fK33T0BN:'g+HFHZ緅h̖ҧwْ<[I$B2ΰ^M"+k0*}QcTnz:Į}oo]Y!4^evĞQQ4L{DgG2>b6IYr'-4XM6I)C&42#@Bm,7PnƓ*甂}3#cqd3!A[N#ObRU-qU[rݕ5䬩%gM-9k%g(Ӹ"ȹZikQq4Llf( Q)ȮZA(-$2s:lbǔt %0$]=^gRb{^F^ym1O壉x4w1"i}4SS-I f4FFȎH_rU:Vkw_/pYxayXכѵ ZgǼC;(zp̢ekWg&oxup=g3jq4DhLmɎ&I1f5C儒{U1D ޙUc4j|Lţё絕nVT+h4m+EIFMIdq)rJOg#)(}gr,viK2 Jz+Ck's_ #r5ĉS{JX8쀀(k!Gy'81aqIM} 3g_cV_a8BVk̓7Y\ec]{g|Ž~[gv..o^畒WWU,O" U~SPdNΜ3Z.gUKQ/@u.usr4c'>"'(FrQ2 A&Y4 'kQJY_ULx5;1%0X. ":ZظE,ĈXaM?S.pj8kޏQS Y.$l-!bTM?$S R1۰7 8-IЍCs)щYs6zP 9$NyZdee6 J rNDcKBi<æh2wuPr9) O1oqGFL@4l֎aڔfՊ ӽ5C^ }2պӳaUړ0|ͷw|!k)Jx^\R`ś _Ĭm=H,Rs5rvVv:Z˳/~' 6p|qMK'kC30[)`&& wkpǪ=#0 F@tW 5tnנ(Q}YftLk`t]Fh+v3 Q=Frc`d#G˕4߾2)@5geVj@+jh`edULA" VvoJm+# jelӧivnQ=֦JCza. 5{Ob՝e:Xv)|sY~U iATƤٮBQeIGzy\SJKpLfI>H|sAB:UeKQc0ӳVϦ8ʱ^◵0Du):[s޼V(YU>yqLH^`OAZE`A)Ԭa t#usCeZ8zS*n"a'`o܎WLLhhkPɔ_`'&gXGίQźC,0pȦ$8ˏԛ 1\.:NWNJl_^cXӶ>f6BD~satξa8Ӯxln L10k\rsgdOͷ >M6z\l;I`m OEOXK<8*%&1I]B#zTSY"G:sޱ `i=wX\vH7-gY.:Jͯ;7%ꉕ5]s^m AZԅ5@C:fc5uM4LBu5꙼1֒tfg{Ϊw L;H 9Uİ $at=j5 d^Aj:w^!3b֖c?GT_d3-Pu+YM F*gVfL/e?s;'|Vi "NUiz=*K!@VҲn|)?g_փ7,u:NY~vm>g;$^|0Bdm(굧OxI˞8Hs$Hۥtإu  F jxXCi簆f&8IBl:$8hvwvYGSUmndj;e;|F( =q2ym_eY^վ]5L~:^^?3nB{G)1{+d4nLz̋oJ1ӸVi8Jx)Qjo‡zGS:KgpxA?l^wSNuzhf3^QNp|}{ܾY4c.;!9!7!L%D,L!.meiGe?\tvt<ZN陵m~p:UTo' k?ǭCC uEx[h̉vޞDVU"nszy넜~9ğ(׵< |<8:][o#7+_ŀc'6dryI :։o+ɓLSlv˒,JZcAƶZWVŪ׌yi%9^X Z~:ۓ9hpt8A1mW/7^&& 3G5u@d0Ţ|}pP2*n S6ׄ=kAФÛdYx8(f)|-m. 
\c/I>hܑ3%58j'}/ o"O^CR-=k𧜱rUQ&m&Ca.>q}[0ryjyDz hVɧ><݄@g&u7V;zo0o<،ݚխX>4>F ğ3G|4 +6#tj~6BR_6}pV353ݿWk]CNV;{F }Faւ6:jN^!몏ﻪget8ۼxIf-S+JZ+߅2Dn7FvA%W<m5{jJQ$x}R'G#qe͇YgP,// E= _IP ({c;:rq~{Y#sL |Rӣ V ]4z8^iV-( '0&Hi.D1C0S5Hʵ0}8y )*@Yg^ [a]zA,z7nAJ"Xjd?=56F߂@@*HXYPl4 * &"aD`VSA˨CcLm`ڃjov*\O lG%G_Ϟʆ6T7޿TTapo!.M}^3>!D{h<ww_CK3|9rs3껻kx>`7n~^#Ղd*Mh:Z+33)yq)80~|f *GMh.bA9 9p.Gd쁣mK9e[;l0цĵDc,XɥwΦ'.*Ζ)DLЎ1iSeLHnt4}iysݻ@DCޙk`Up0޽ll;}]<ַ_[`m6pUn:tKsCpt&8a:T kQs4 RG9!b8hMIJ YHC=n(&49mx"R~ھ2en#hY__,֥Wt.t;hL$blIBVh``/lpH(`*\`.NVD;` Z!"{YWJyW.%ֶd36uJ[ _E1ڹ ' 03@kp#<CA(ry0" nDY{2VL{[^G+(|s! @lf3̪^  *ǘxm(я$0=78XRS.T!Bkl`J3_hfU)5=;|êO_ VST_|d6;݂:ήe@Ut+⭛^On&_&ofWà1鍐Ϧ.]P:Ɣ4r՟ |O:MOEM@'VYQ09ӟyu((JO|(\!Lj I v~Xnk VAuۣ; ImuV:lրp-)v֭xp{rEOq178ev2%s4$SI\ YP-cQh#HU[xh]cphU+՜.16qg4OSL y"Z"S6y1;SA蔎u:pm֭x֭ y":/SAHhAd,:wKg(5zn} F^JcIp Z -G;Ae4Gjf 2PXJ1q8R.MB$oШi_K%yDH/=be !J=b]N@bXHKҒ8~]VHK)zYf>fHV Ѣ>(:&3"7:IAҗ >B%hB`z1`\ He0iq^T?'=ѦenG3zs9ofCIYL.²g?;7<Μ<>Uz̮<`.'~0adf6$sx#ΪlA.$cMnr{ *ˁ z;SȖ(w$M3)aRj#Qnh \cgg ƓkX45]w+ [rh95 twZ;uqp\˟OQ}-: WSZ>lWw?  I6{ 7 ~i"znɃݑ \붷/VrtH7P|3TǟrGvd5 J58}곮R;'󝍂yh&>ep*/(KbzNϐnlT$i l=؊fMY>Bܕ?{{5' e$c|5ue KFCNDz:}ͨ?v܏uVI[q<}m$:p/jʡ pz:㞖lq?>旅ފ1>Wbirb޴+le.O}]2V(B}|3|Zn[R9o~$eiЊ1U %j*]6[M~]^U}Tj(B4.\1+$sZV=ݜU ZV Tê 6$y"/pǸncYz7>cKSߔ*iA+MNC9,P &7HCZs1JhC!_ Aju4퐿UDRl]=2`R $Z_5yd[R5 JZS3~"183[oD}s¨EنM]Yh|S'x/ }!h-m?c|[y,}ch]_iBad5J9 nq*g5jK[MƟ׮,3fX;EL%:üE>Q"wZF8 NZ«F=DI pJpFkmwOOr%tZJ[X.~1H1Hz `gopb<`+)ņz!$;: klI9',NF y-{?q:靧R]U!ҁNzv '%ϒmVkri-hRXX yvS氻tU*bK:6>+ۓԂ3UgЁLT)e4wqWaL' c9&B(al@Ψl ڎf>Q{ WnBnz +GZIAuI%vb&mI "1O!` ]2}wukaT><ݴxg vO 3y zdS}b\>>ތ'@#<""a`zb񑢅W;?<@(?sQcF!3V rH$)Ody /My%` th1x\q*τ骿d*nnL=إ9{iEgChE7SaQË9^uKv/1愶6YZuoE)sq}kJ*eS, E^59?.7Eyެ~=Xld?ѢD{Dƚ80uyGZ+Wi{![]?4Jg#U.܌ț X~^6 tO~MslY$5 j"yJ,t/6;.=خB:a!6GCHkFjesGp6*<DWw1'D~js𛾌Żf 셄˛_ܱm9-yDy60B \ [ώx{ k{߻ŃUsS2}u;6~州*Z~SnoθΤ5n . 7z&wS$ĺWe׈}8%tG4񭙤v/k)}f,JPX6Hʦ^Ȁ$g΁eBO;!R $CviUX <hф0yޝs#ȥhX+fH9}3r>x4^FϚ :O.EÃcy 9<D˞w7IBtx7x̠fˉւ. 
ߋ b|BO3^P䉉/qo-E֫o]r[CJy/RG.hIwZQ*DfHO[OeҒm'k,"=uZ*N|ϕ5 \:@5rAP7% 5 ˍV;q&:X̩-'y&`&i1\p"B;\B*ͯxK4cʵ0e0yfkJuf̹Y"Hh*3` # 蜊k _O(/!)){K.k-[\*_rZ"B8b4DSM M`siA\&SG)#1`K=aM|e싢.}ym%R.w8U-vw$V[\)]7 wB=y=i(=B כtX9v=lQ_k^No nZӃփ^إ^]5lCǛ1(~>“ k& ^w?98 fULrۜ?odNzyYmf]{K>ql1[Ȁ푇r?7| +?\X*ɣ9/$~2Iܮ ]!/]aǸ]h-zIiT dsיK%ux #uagkɗYe B{; I.4kD! k M onWZ㨔Ͱ?_񃻻rɭgNoR+]V:Hڙq3IϾݹ܌!Z>( 7uӓmEpt-[}ܢ{QDk#D]AkW,E=NԕI$EaI=R*^uj@WʌQv0LXA/^1[o&)^ZG_hj|ﵾ\kxzC@ ?U#5Q\$d.bX-abXe0p@fnJ!hZgr0*2#~gHP5V$!{N4fx hF^H9nZg(JDȨ.#-!J^|j_!&("":pwNd7^4Jrϩͧ=׾Is#hL;mJ G /"5X)l%0##jTAOSF@uB _`[p%;a0:tA,$Pg ņL㒂'ȕPY $<5I 00Z_Fpr/]Y=mf%b{9uxH9/uAQc*n.600LqBS5ئkOL89 R-T N[UtN-e*Q6X:pLδ0T ,1*5YŒ'HsJW}i|Jy%%yݑhjw"T v~̺m Sqd i !oOY&d >.sl$sIݫ:Մ/%x&CF' 20P3kL ÏFUK+Zqrm x%}LStnٌRx .Va!N;TH*az2eDq5G{jB={^P_#e|X$kGN_9F~ 14*rxT1εM3 RZX"rFMFRG2'i}e IL -b/#͛$dwksERI1F}t6_7WW<̿οʟMf./q2罀dD.8kkJ=bB@Y5;BԝaKK(ݭF*q7{)E9O&* ]jSX}=s:H0ѕI#Z,͖Gq8:~Yr vXV 1I#f;T16>aM1"G(/l|6^?=06΂ߕxԃ%2F%1l|> _!wki]pg"m((>f,,|909A 7.6'0a<d9PJs*Rh"jc=_fmɭigFMK~cV RoAf9fi ;V2kaeヲ[ا$.erZ㈙"hfo} `)Ӥ1j-U 0N}dMXE) h96HJ+ 4kݹfB͘FS9gQ,"c)ˋէbu?v>.K\¢vp6+#l h-Ma|>cE*K#X"7NU%keQIIdQ(*SKڧ8A:O߮` p5 %_h{0*bjJJ 9;PVZ[Օ@ iMbƆJcg`B1BVĞ?JFAKTԤ$"%Vւԭ#d-,E$IX&ם:!a Lh(WROdo'7@fB) JTBFq_(>FثJ (G (zoL8U qE|doKN5/qtmw8*(uԣU-%"4Nk3I#!b3%wgx8"i" P87iopPg3t{BHnK[l>^2sL{KA"zHVus;ئR4A [lcȤo9+X+m͘3gn[10$?UUg)] -l^6tʔur<|֫wj~Qܼy4ۺ k(kbڷx4rc5ͫ~\M-uPD o%al ?z}4R5W=Y/|wߞTNz0:~ _|}>"^9U} qSf0oe*"?ulX|nQݗOɲݰep@0lgJ/̴ y8&#l'BxXݦW]jE A219m[F!11ȔAN~ۥZ0^ޜNin_:Lnμl Z'+ޞfط}|Bf>H?^ٽNi\ uG.V?rb9T hEfGܩp=|;QwtKa=CM͆{& U4EOM:-!:FV)owKhCCq=ZPPI`QU!^k۳տ/y𬾘?roWEz3koe%ތ2y5ǿ6wX\__D&!gWXk ,fzijGa3MCDNDM bXF$C%e }DV ̏f2ա8LlBekK#tHQ`+Lc g25)7@EYMC7lt HlrsiMh _[ɣ!>o;::-G*_~$X5EUR&HۉOR*9jע.!#غFvfo>V7t_r *n305, n XYTE .@!CtV9˻|<8&׾8 pPI$'q$g5qU:S.*Shy_<(#j"\ @R8 F1Og@NO̗=y?bEܯlڙsba+ bi_a3R*3OA+&}ݘ )u zF%=X@ KVΦTV+κ&#_tD}Xz$M>ki}i7T󋛢vGL2F_|j6Ѭਕ$T[ȺQ;eO+!R9 AN+wE]Ƀ>ʒih{S?.t&NޖQsEAKqJS+cIQkO1 GUK9>GrP &Yb2E3eߠ Q{8]$/h / Me*+O4{X.,):щ>D$cuA|TӘjID”$}{K\{烤!?{>W}5Y;*GSΕӗp"HV=c|2o+w+&eF I *%V-mdmQ*%+T{T3{H(VJ><_nx6M]䔘 1| 
N'ñ2Fʿ>ф er,gevZ!Rq.sRY lF8 IeaYi+RiuE}{m^?FJZ6H_wZ!Z2g!RRf܀@Җa !8+-uVe=%|mj`QPV(K- yQˮpd9.Ht;W]8fTf>N]+Z>IQ V sJhȩZU EfԨKFj] u-j+QFI TC?4|1gv$m3AS{~i M=89g}|ϣ)uCs)iwR8o*-4S@QJr"ʪtF_-o-k1ɘlD锄8 D$Jlq03KMa0Zr3&rxTfzmpAV(!GtApvхMFpj&cw(ә n$K %R(l-ǾV!g49'!Vէw ?B̮~}; V؛l%J,j|0%(o {(Wnxo/떥g7o+޽g8ka;<Ͼ{ "TΗqs6lnQV띢)H4ΈT wݵNkV/j/kO(G;12g(J EjʼnMHTӚ _R^K KܪDt~5`,o&7EMF0N\,/Ke1Q5W$,1fw@7mqoZ@oKN 'M2ʁ$sZ%ѲPC{*Fq/s|P _YpK9(Ir'՝x{qޟ`h2 Nvk DY'.E"Htޓ{$tAv,3Txλ|gTBx,/W6v~fNDB[ BM0(@Gz= 4d9uy%EB0e.rvR/3֮pAQ,jNx 9XnaXXȭ=N8% KK ȼGIbry!aM4f\գVk {%$!ϤʥN}Z+m_)?zh抒,v|l犭K MAhY;$"Lum>W-v:h Y9eq|zV_<#U#a=c|0&'SvY6R8OHAggN_BN425GT3eF{{GNSdňmw$#sp` MjUUal \2!.zƸ__%a:?} х 'll/*'+G n>}.*%Q0 'H<:eH{IqV~gd-awS1**DomgbJkiΒ }P}\$i@-v_][JrҦC=#[Ǣt\'I!gá=7:tyV>^ң8k`ZK(f' ŴR]gVSArc]-җ /iJvfz8ݧ!)!e(gJYꬦTl╯qDHP֧pㆣ o &ulxcAL@/esF;J ijR^OUP債ˈg(Y4xu972鱝\׌ٟgkyJhޝz#l&Jn;6+Yx]BU]szUP@Q 8קRo|)_OΫ6ERi!nKX߃ҦJŠ|fr 4AQdNCX+w݇o>Oq4]k@x?ޟe]rt~|6'ϳPA%Hp#ق&u}:xpq2%i'^VHU=.RUTZcn pΣ Z=<ƕxFJ,!h').׺/i>3ȟg1*Y#IN4lp>WwhtArt`z+~Nи?,#)_~~Ri0,|48oO 3w'EƓ*?]_GoOSZ 9~N,Ownx7a4ucKTk&÷7oFS(4EfQqC5Kz˓9Nt{xymGA\2yHm\}(zJS.HWjh&[4:LCU֜5\o䚧14E1œ"nL G=%$;H\xSpR-IcAZI`kn[» U-A&A:a6"@ߧ->z)(hꨎpZM_t,rh?M79H؞mB@-&=B AO6 8M J/ FkII)+fw;k7nnYnoA9U]mpTF#Du,e (x%_ j4("ј}nQ*%&I[xem`VDP2 Ǻ*on"S c+g#bnz =0+(OlFHPlgBuhH &붏NA !h'X@*R G*ghPRP`ՌVVoHF"ulr$@5eZq\PԺTR%H`:|{`=jǃ ̳?{^h@ VJB֧ښ{0}\*р{JPCO4zIad-7x1|.&DpĆR|Ӈ8ڍFe:%LsH{K87A<  姩4kp}w9߭; wزe0FZ!tMCs}{Y߁ֹAû8esO>`x%3U6{Q%9x#͖m(fLӮiز%ݚSp#=mRM_,;! Yv;nì68|? 
}4sSl^̽Nм-N0g-\V9!7S^p-ַEQi)bhchGgj y43@1]9Ss5i\ mn!9^|!a QW,ɣƳk09Iy;YIn^ 8c9*6-cpNVR*4osmrNg"V90*D nNMIhz4PYۀV&+1vo7Y4Ui=XgO)gTop}\:sEc+WV~U_^=lS6ֽ8?N0?ԝ Jۓ7нYt&M?˟|%m2'oN7LWRQUzQ " W,R*eƀS;u gϰU5cMo ?)QW*o=?F+\w?]T a  vz>4Ԉޖ_hxA'NjxpyD w?|; /G7cF>8UZp)DTl":E@u.E&8w)1}O ڏr|{3u)Y<ɛyps|{=A᝔=A+rjg7En [Llnө_^zſ \M%r[u@)x,}!Jz:S8pŠ-O/v\L$gՄԏDܴR׽р k_j7ifrWD*sv͜=Sz*>x&w[ݵl4lr><S S`9c>,-QK"|ZЮ s@ey蜯WkhjLoJD~W"9EMKsQA+ےp w2]&a^5 4 U5L9 wZ8&a^݂Ye[Rtm)@Gml[.뇱 jz!l[fFJQ^mN|l N v #u~Փ>8ơ;= N!GHvv p&{ڮnn0v*Ecx6dHa@oxCWtI0DnjRR aDJ˴7Q%:(i:5tzԱLRvPE(O~{i֨ϮtŸ8T)ͥ!9<)$Z[B2!aDν{f<L5>̯o,taI_-L?>_f9#f54N'4Hi'^H.RT@Z@lm,Qz2dFp!UKB[%Vp<@s>la<.+mM;:.@8q_Mg}W!04|g֔H3L.JZ?cVf)jx4 !=- ,>eɔkl6]d9eӝ{&(UYr*DLͤbQ#wKEቹ n6]s8WT~ۥT*{ٚݗ)Od[dgfȴDIRdq&(k菝@bplqpЮ-:XnUZfP/c3j=?/\0MU.u|/@Q*JJ(|!` 0a[VBzsHf /*(-F`LRnCSa;Yak{toy<- d BHԲFC  \QP=L ÈpD&Kd%xJY??~]MGM ڛ 'iX>UӷkNI)o_a~2>=j~v pқO? TbߟSPï)"T0 *ۓۿ Υ]MfM;C뀟()vǓ!5@^M7cUx%GRT㭔[5(TkFDDȅU*0Uq`p KRtSFsokam-d1f_pE ".v**k3R 8'U+%ۘW`{kL a.]#LIؐ6.[9>"ۘE.?v%iw5B ]t.v{R: *=< Lpl;i!)c a5>Lឈ+AZQ\CL(X)ށh6,\P3V)VN!lt {%l(*zQ(̲3wP`>Qضvq89'3ҹ]c}~csᔣ6wR5 fA 06&SAs)>kaVE[L)in,La>ANԃeY {) L нxn<"!`dpp!!9wK{c 0~Iؓ`!#]h'p^=+ tMa``vC+Z3`4T g0gpzͩYRAA6VT`hC(pJ0L`'2,gx+U S*ȕ&BrSʖs41E%fͻ{Xܛŗ PLߥ޻=@:%%Z余g$]n~%{V.x^C MeEeٛEqU̫2nȨN]dW\AUu.: vM2cl*/: v0.J7>(&IۊhoYFܠ)7ڧxT+ޠPћRƝ{Rs!QnxVYRzg3s#Jw'33xn}޸~ӧ=@ HUK1vG}X1]s3~J&~򘇜૛@clJ I$I=jdr5:ǵ}%|4]MAI~zS $z8&T5 9J,a6cr?5q _:خTiL+rur@  v1QTI/Zn`>BI gyb>|pym0QJlk۠gCb<}~~e m-31xF5*[:[m-l$ѩ!S&@<8LP~R`Unyyy w;\Qtpj ><崅-(~_sVρsQY$\MPnpbԄMjQgqU"M %)RYkR"NE%N;fĝr+$MRm& )dVhf2f)4 mŹY7!nb*s*|ȍ 6R0L!+).(!>6Ѻ0huqj˪#;& S1xMVM.rAD$Z DDm ngr2rr!$p:bl?nۇ>4-zU}܄`J:PÏ? 
K+$_HpsXh O7b挿|3<o^䷪&?܁< W>i9{Zir9{ws5NIn>y3]~߫wwgM^?:F ZJgE??nLqIl]'}5bUjiW ;ƚ(vkE(Pzh8.*+F_:EtDq[I5נ{"=%ySB4 #Gz(N9<*wN)A&ЁA=$˞eP,:V#" PA: pi<"gDOzN!0t&bĄJ[lζ!4r o~yc/for#sq阮š`0H[ed9,O uFZ!:;t%o4Ee=Q+;R-.:rUW>.gv*ԻSl#e@?3Q!䃴"jxGuz>ɨ7ɨqvdԨcQdJFQM矠]?h܍چQrtM81.Uk ̬zsVVqk0rd ꊋgXhYH`O1 ۆ=`!G~ 켤H̳ե8bdөR̰Ɨä?ML(BS; N`ѵA@oZ(SWP?Tv_zULO?yMOMeݕLfL9BY`2)96W7f~3o6 !i71#l}䕦![D ABuJ 4BRzЩXx5"ALEMVTϮ8-uRҲ03I:$CZI50-9':è3؆ ԅ&ckPbhJx{ΞFs57y TeY"N[- 4/$ԁO-&.`f, Yky*?61WABܭ7WjDm *սJI.NOa?ޭSnw"[#(UJrM;} ROp#d0ёL!$e~fICXFC5sUSy,<`:"a3 oyNkAۀPvJS<.fڦ u6d#VmҖ-n\[7x XV|[Aм `,ZV67D~5]ϧ\jq:a[J*O(`_*h},Ud~+DeYF*(d-lK1Z Ż`P('16pwf:rcwrV++%'C.әCBp*:<֋ԫa j`IF53݃Kٯ6P5m[LVL I V8ҪM*7nU _Llƾ (9z+ZWrvoZl|!ՙfL)flV1-AZ1o:Y60/h۞K ҶP-ߢS?|@Av_'S}φtT-q(SЀ}9fH,Ҩ#yoF2Aљy v/c'f𯫉oŝ>d˜T̓˸kvNF~yz]%ê Õ>]7/}Sjt!uێ+kJJC᷂&RB6cA~}mB4Z%:I*@ rF&, \|J)H3k r"KGFCL&{Ukk} ?8V}+#އf+B[nsl?~jrnV.g:iU9{|]46AV] +7B׻ V)BY6$r9s (%ݪ'U#NZY%)m\1\a!5IX1<U ZT!2reҎ '&QAǻvjIyt>'` w7oI# ,PooȊT!(6e4Op:!ƌh'R711bIb(+'Dq;()sn^Xmsa tlj'5ͤ(%gD% G3X"lj-uR7i: m!P-y֬9fmxt_Kilo!i ~uI<@(ո]B;5(Ux+lޕ Ѓ2{n( 'GRsqo qA@}V}ZHMQˊ](eGP~eiʋXktf;> U (.3lu$q@cܓop2c: ?[y܂/NvzC Er=vj՟AGǏȣҸ1<<4ƙU4H Y †FP\G%FļGsۓP=YO7x]Iߢ_ɟFdעL޽4AhϊHDGu(D.ؽa:[\Q;whk׿%D2y=(=~ICGFaG!F+R`Q:D[/=u!/y/dvw%hj^uZBzBWnި߹ #IT?{Ʊ ~' ͞qKr$v6jE=+E"鞯yapB! [)A)$g4 bHSR{/n*+}k36']VC6͢ y}N5)F_TU>T)9;,9{&9a]x:jtD|dm s6z_$Pjݖ'eFWAWqiY~*֢2c8i:yeFnAo0V ӡԺy"ZR{?$qo $ȴ,ѮFxZ [a {tε=;^>W4kRvLn9Jrd-i;7/)IufGzN[pٟy/>w5֘c a/Ka5mBXO`)(ggIjPtyxZ un-ثm)ڍA&p5rK+<]E Ia'X] prcDh..V.6+X P-e)g-\%# 2 4ytSEn;Jʰl­DΆXPD),*@:g]? 
)yGc[|h,Vi(!5:ܓYÖ'69j>PhT_Hz./zHBzԀrr FPaRz,Rg‡hE2ՌiQt.%}񻯝bx ^% R@q+\tsjL?v^ZYc)śimqyAO0% o; qfOGrS\2;LnG 1b;PL|jht6hzJ~^.}wga=*e&g\V iV"U"wVZ:J&Q>`aV˘aܒz4J d}mJaeS^xf(`jN~AK5DG dYV"V>(g"H/$ \v5CZ1ђrҐ/E "6.ϧu*>Y|sT-Â_ A.H0kɿ'2P,DA.`J0AYa{ɒ󌒀fw9By,| lpWBA%ꮁYn "ȔN@ d 3$ڕU%p9k֪}ZkLj :9$DV𘢖?WdE:l{>|#7/>Ȧom9A:(ہKh;LlI϶äہ9"}r"U&NJoWH<:6 =r8qqG|LOJ.ժ+'n_^ T {}dRѝ.뗱y@٣FHD%I3EYnlz#2AHs"tjT) YhP>t"w'\"IsHT rE3Us_b썽CNc$>D=Ԥ,z<-Վ:O)?vfg/T502[/R&b_q4څw*0TDV_FB:fi&m杴FʨM>,fg͍t1ި58g-`L Pr0gTdkhG rRb($ t61PZڡ]$53IH>ށ#θ mxE#$hz(7(?lנH %&pmw-u-lzh 5Lx@U@e ?F09TfZJ&nzsx_tѿE )P4s9g^s+8;WI|tr#6^K慀ghb\(N NTRf0K޾|\p;{0'G(-^?7OǠNㇻS r7Bt+s;rry{N?M&0>-pD aч$t6Exn}Gjs&u'7;l ^ B`6S'e{-ox[Ua:X"D\C\">}"faAC0Dje'CY!ͳ8vG/<ƭ-k݊L+,弹,j_ujJ׹5;Y ,09rJhLhf^R& 0EtY%S0J=h@4Bl? h 88gV[l1f:c`.f#lH=X}p!:1K-fzץ%.Ӕ=(E *_.AZhl@Y2Лe b-!/t6i-wI@aDnTDA.M~c&\:n0ONGЏꦷMUSS(:>cHEX/RzQXB8‘#!Do|$dVH"R)k As8Ja7#iW*,o,Ȟ۴Att+F݊ulԀLZuJ)uڒLU-qChd#o`ެw"E ZzψA7WytP S:*7m<\Sif R,R;JNzˁ'$2sL'$1~!gA3[[Nu d |T >JUZP`Eg\:Y/NV,hJI+SXbUN)L8j$54XSqSeIaupJ!SdAPEnB+<q :7Ȃ ."U40%9&x@AjH J(d1x%>)^d(P)$j-7 ; vvAسw'7U-[1 v?U;\&C;TUϸ#;-Ɋ7FҖ$R_8+F6NE7ݼzV2|u=:#nhrhR_ƧOt}f~a?ԝcB:t0).eͦae>mE$ߤIw^J䝛?\suއiskCEkTPOCnu1asTni-Rw&jhtkCEw)F4eh+F} SHMV9ۙ8__뛏Q87UΥgOҞ,>Iop<7(Ӆ/ 3`Fȶ|xyE@2(\&}ET>D\JF79+YD8 Al0<:*{\mP= p-z xsmWmnoNjl\ ˝LXY9lќE 9oLc)(#``k~ALwdJc/g}߀/SڃC_bCQǥ14DkO6%:o7Jq]B݉`J0ZWi!h5o7-K){*X&䴕D2+HGVfc8Rń4%Y,sqeUOqnߛg |fY ojZX?ѫ}74+Vr!dj-' 2n)v?{ƍ}]x,/ub'[ Q#KFvKp.%FysC*2òԇV_8̠@8èVE]YQ,*=x Ĝ=ؙԈE־z;3W/bgjbmQV> ٙ$J9zZ`;@D$Q{uڣ˜웴_F{~֒uD3j½-=3igo1kΗ=o[aӲ'l<=ji>0IW$u$1QX3gc{*E*.ʮjB"R_w*Xˊ3z_\Bm:ҾAݏ&O"pͪl5:eM]O;[Qjh)pyz+jmF̝]6M?L369 ڋW~˾]|c yii›_-ͬl Eh[fHTeom٫dP{x)yv_˱NbJ[Q)+ϬkVf@41ʸ|cjӘjnɔRv&P󠉉 Ŏ,YHn8VinZ/guqJpըgR`*\0|eiRvUb _+%eLMkʯ-Q./(T`O,X:gcRB`Ț| Hp)$>b΃ aG0=`윟($X<6?)bBrP !bq9?[VIzS5$waKW(kIQBҝHӆ%ejURtO`dA.jڿWkΛZ@NәPLN.wahO6Ƹρ26ue슗 ѐS [lJr1e 7xK,o?Zfh [4EmxMl$ Ÿ6*spBBh!8^i?1HAV@4ê]h.ȊpI/.z F5=іѳG}}D5dO"׶p,$$a(U%lf?M&#Z~/7!^5hNurgϵfsL!3$x} Uw$@$1{b=0+6D2%ioQDz(G"g;TaHfqʜ®$T:k6'Ю kzAƃA]P?Z 6cTǩ/W\<\#@8u@QYt\Z8g;093CQ֔t{%S{?t 6m@w4P5y?L3؈v.E[ 
h^ǵp(?.=29GKiʥoY"f);;#s85 !F8 "02HYrc2 H$f2b>"Űc}W@W!/AHb!18H yL)eRB,0CĈ"5QPKD j0 ;gpL1-Vp}*$]RɥY%SB*J-NH Y.P'o5 (0De稜e^r}IzR|wFx)ʴ`Ǥ*x:ܬ):>nt[Edt.A66wx3oSK * 9S1p$SH€{Iq5i.k+hvt$pֈfuᐈ797hXĕwpXƊ!G,ٽqPp#Vv=Jĝ>a=/x}YZ" q [qyuuRQ Wb5Ovbw;b`~n, D`a/h怺>i`F`_5}8Ï=K+ ?>.CH ̥m▲3!1r>"RHa(0"2wm IiLHcA%oC* !Y@e)ɥ[ 82!>{/,jLϓqm8uFb_IGǯNƣ1Hu^u~N~0h% pЏ:DF 1pOg`nH}G&iй<$}Ish"M3n#T52=>M4WJb=jfg=*|&9*Lh WR:d;.(4f[Y lT;n5rzmt+Pͅp-Sv[\<Fr>mW,'p 0LSdr! ? 0 %a"٘ԇ hMR K@<ȻxC$pvI%;6'\T[+N)$%S =!.!^J huZh<75̩IݪQYG@6bsFQ9yg>җQ"VdM pۭ{gDŸЋ) ]J6Fxn ]@NcT b[SiyPsv>ByT{QTxЋXS 7xl-\k o5#&8>kC͆w.A ҠZ-}Ƌ9mX :881iC+cDvӚbB0v&;њ ?GDGɂ B[~eIْ/(p XUߣ ](bmyJZn)n>݋nΝkш $D&O 39L4%#*NytPw %/4G/-ŖgrAy)M]Hy3JD/-iNLd${ =ǩ CɽdxL'%v3.a%9ءv-P8 )#|,\ bt4;D;ˊHR?YO04(fEzsxLBj6e(:R8Py8*_uŤRpq)|TH G$H@!01Cҗ"RJDA} V)A'/\k\*7GBbbK1,p؍Oew쥾ZR;ZcwiCM4 aFOt=e;tIVtFXtJ/C0^D|,##)'>1SQyL#24#"ʡ }I,(#c .ȥNO M:WA)]fz-G /Ѫ~r,Rzvԍ8o7 |xz1K녝C2N{o=!QлvEfo͠t>?Ӵ1λ\7W!HO#.uyN:ҼU8=Y+LD7˰nYZP͂=T wNEɔt ӏ%4D3,jtLw +~8-f-6Ω!۱}uuQU2IrHw'd^˙~2k7[앍Wʷï'iy]-O-] %~h"ؠ,hu/VS>S }ӿu1=fG;i(p9t=§⟇ZT(#Uolۥ?^7¼z{ʨW{oF}YF pv8fo`)l:_f_%zPVx01 SO; Ѻ;L.qٟ7yF[u:KO鴙N毛t~S.2Myy/ͼ\ J5_直y`dZrַ V /CD`)Np( LI :E!T3aJrْ͞ӒYṃaIh`DpTR8UdU`bd\v&!pgʫf' D+W8p^r-)qDJAx[- RsA@H`Ij_;D.V9YosvГYN n&Q p: 7 )=.)yゑBN:,U#m` Te)q )tH!w2!:),#VpٖxxcUX_<@kq5(}fH|ͯ1H@3%4l;LV"S0ӛI}MEˁ8࿳?~a ׹pp!)1?WL/` j?/m~^a#@ʶM]@p n ݼ"p@.4Zh*t+(p& g9+JuÊ 0sVI Q 4o )*QoCmF7-V#hNy6iUH8gYW5;U?猐W=jtZHPdQy&BlYPƐ5͑ItP>8a.?T:ZG=TI"]IK)Xŧz, S0\h1Ndʨ,Y !>#:,TI}iDO渒x.pOVZ4:iOro.gr 3yOдVuh8i^iwHN Xΰ֒y܉(\0( !-`4 XeE@fi/TC(1amBrX 铪&mZH;TTNfZqgXD;C-*% )-NQ"HRA,M\2i]`4 vij- AP(jsvEF8(GVrZ&1tUe&rAYJ)jdG]Z@vp Vef3 酛鷢\ܧ-)K&]^%Zޠ,,Za1bDo3б9j@EVk(y/1S}p۷tnn+?F=jqC+Qb`'}(Ο1`275nE7ո}7!&ɠ rd,ލ=_18Q_} l.7e#+s/sg]m-?Xl9{m<5C Xit5= lJ*Ӝ[7T\tp"dxR*e8CŨ9!5$eY,Ӿ[^G{LC9tw7nI!eٔm *nBuPp_:!MGN248HŁ`Êhl:B1ث(Cjա-oglJMk:A:H.!Уkx78>:B1LMtz\ء hS"u|}eue8î y zu^3!ی.m G9R3XArdZrbBU2UΩ2ihuູ 9`B[ZX/.(YZS{]gz;FwZZ͈Dxn꾭i?>4,Wn1ZUh}z5K@i;L !u@^~V;&MR4LlXXg8@:ĘreS0k'Ha1Q xW뼩":S8 +0{W5os4ųoqh#}LMW*2fTw S 
[ghZ"pp}%tUF|iwRSQ;]V?3=}|ϹԾ`s7b繄uyv{7e-gk <zjΕZnfhsɫ^ޡ}0MⴖΕ/%XVVknigϹgqRUVe1SrVx?'[W[|B L\ѐ\E[V_U֭;ukAv@O9X}[/[h+m=W\F3i?]j/b%zԌ25& G 7L!otfbEL! Šh))VX|77+ښz!Hy@N5HC$13T IЄHX=\!Iew|TQ!*dc3keێ 3Z=&!6a; D˂|L!3XCyqq9`?S+)wH|LZZUKµQLj-֮Lxs6Bek}Jо VOKL\Rx ɷ1B0Y1'+4nݮ!`^|:v`ڗ,D] W*8KVIeQ/A*9S A#JYliї ;%0@{"}Þ?08NϧC\yНE7E/zgF5+RCqCmhs$,?5t+Ob 5ۤ!+J٪S·2^N/gWb&&Q&KV&ő2bHi}f@*Аu;lÏ8Ƒ8E)vnp?ԻLO!|> T> 1l8etw/%[%]f1j*LSjl>4ջfNu )p&9%z\pH:njeܘJ3f.a3p/se4_D(|c[gzń IOp$T-:2c@m6S򩄆|eB)usL zSs˩)LCI /xe! Iԁbc5 ],q_PvY*u{VR PUJޠTEX4%(!$YCKMhHiV2&(%u6eܖiFm3wӟA MR8CDŧLS;.9b&ixM0paps;Ң0#Ӡ/.-pwwӇ)~wo -]5m4GJMgGWp:LEMhWOrZ2*aUkOVF%>#C`4Z+Ēɨ:NsUt[ĩ8Ih#eʎm溿UG V}8C)Ԧe ںPm nMHSkE3&ڻd mI5D@hl"G;$m焊Sx(Zc(7 W9-)KlPoj#AzCu(#k{;?O|NV0r] 8 z[O'y5_uv]WqZ\9OVդB_<< <կ֏pv8g  s9FtyeϦXbJPMYZE/o[\K<;_~ *x}@:ll5φS9ѰM!)u,-Z3O^[x@WGf·fo.@2!8 0L_XH$ ;_ڈڮzjQҌFm6uV@Mb ɣ"B Hv:T^= R#֐4I2ΦC5 Jg' =P+n'΢G[̉I\E(=xYK6JG桮uo3CZ S| TC魤)N؇C*YP9D2MjwQfL,ػO.'x3 ~µfjo_3TQ XVD9G1y.s3B^~\ۇ kzooJ&ِ FH"g7(-gδO~dž\T 6I#EAl,N~ev(O7"{8A=r^5z5ZvZO:HlGBzG Rh+)!vyKV#ʼn71b Zp-R ]%ۨ$\a Qcb=ܸuc^NJPnBlfёI٣Tziy|ƙXttɝ$}yrg e6p}ݒVuѻ%mW?m-9 e''ڦ8󂡤hSŐTBZi#KkgC%|&'(ί~"1i)6QQF=* nErpU64?#<WNdU[GdFWn6qF_ٺHfӵ/MB miG|:b1'E 7ET='U}p69Er\ ~Pl枩}fp(½!I;ܲg~Fbeo3]]a1.Anv'ޱ8]~E')K")!XbUUܢ<f<'}H!Q\ؗT>`QK~zeNs[+7jCq2^r?jḴL)=YE$}`{o:# CPdɣ^[9J=Q\'<+O:8zTo\(XdrQ K]bITvnEb-Sh4f cEY1!HSylqa:u?,j,hvvDm"ۘ.Kp ų7{:}UVw ΅rW#Z넎 6GUאKqN2)j[Kn;3.+O(HUDHIjǼUoi ݊=KA1v\_eڼ\}٨>(vSqˉdB2ŤI{2h1YіatI!zY&XW2tj?z>skrV $^|'i ׯ5v{7!YƋwϻD EX82ӔMR뻗Dz6xmb UzBďm\^8{ʷxfi]ӬmܗWzEKPJІm %yDH kK ^hɚSWύ_0-7jPw( T?V%rj%r;HYpuWkm>T}CzW\n]"[Cd?̶!0/6}Z 2u򮕨d og;r-SՑhQ,K|ʇU@+y=w~8h$@o⎶.Ea=ڊ6'mR4l4>Jm)n?Wؿ(uU5TkM 9*%#Lq`Au~FM'-,JbY`D"0e|Mv֋$g3Kzi aҬZ& F#@i}< CŴ(ORIG^4 c4גTN۞nW&gqZ%lhasื,dv{'.eMJ_疥v1bfMKj$SjV$KT^"cɏylv' @ru3͝-LɎ5 C>d*-qap,K;26;dӇ_ma}( .iP®ˋR^WV7mHV 89U0di2K9͝$L\6~{rjS+5nT#j$I8ZkXO$`vk W?Wb9d;-s}I# ̾g+&!`ѹr$ }NL J&$}%ƃdz~T=劳y+ 4i qT-6}|3+M0r1jP2DV> c>AmZ˓2z2^Acf|{Y ݭlc:ﳛzi©Τš@w`\4G^`v VZɡaS̿`Zw \uzzNy2@vT8Xð[-DcDϻg.@feЮPXPBi@~wU$gwĀa`OxlSdco0q3%jwb8)>ԛYଧ 9.|=ǙQPXG$ 
nWF@p;lU.g嗱:'(F~z ,Gzn|}xI1Ħ*q#0ek3cj>Hyr YZFi?t4Et0d4g}g;fvL MM#PP FM5)E0{CQ,>Tp| K .x86F^] JtP^j<|+ =y`5`GOiU޿ub+^@eڵ?4vxs0:]N.QNG#XG=i7yhq>'XRo9(sN߽eA}ˤ"z< Sc~b]5Bؙ V)3&&(/nw=eq[ 474`2ZA*_sO ZײݷA[!)|iy]ag?(8Ժ,dv8qBAOό 位`G_7j´CL);o4_fҨ|T8-z:d:Uɦ!'g|DpnkSN[Sα⦨В CB@1 +Pm< >X]JF}B9lE;j+&["$0N8hxohfc|aLEŃ'V5[ze8@|X].oW.b٣E#ZeGMp\0rB9ua8%+.WvwُQ "fhs͋ iIsS Jskz7҂4]̲ nHaaω azG ɸg#xHnXh%!(WE7 󣶿 ~ݶi$5 M_lH.ɾΩ],Ǿs`LުBm(\͓1eN)q?yv;3o xӤ1vWf;X9ٳo@5t(٠)z􈖛!˧y9aM24KLQ*T ҉qΖCi1<9 Ot'Ct~5'&eecC#p)LBJ5iYPt72 `-Κcf v03 ^0r{\0CKfM֖WpaLpW\I[MJvWgsm8SuMEj0`0AJ%~渂 7^|k"<#I]k.K'\$ID[g9כOW^Hnyy㯅ULj_ ]dA**0X`\KP9 C]DP2x%t4U2a ȤT(R1 EɄv}s7uyqLʑ u{vWgoWnqy#),ޯF5VcmOgeV<\Z7/o珯!k.uEx*u)/__y郩@LP(%t ӂ8by_O_LWS䥆Rj,,(4DUIV,SRP1 !J;SVX݃u1]ۍ׆wF n:LJ^ePKb]N9 IHޫ%m:5GbVLJb(u^2uMɂ̠_l:R`w2F5U1u*TCJtqp:9Ӷ6F5/Ndo@/$4$/DKtpXb'慐9RBVVDBg(yGa|C{ё&Іl/B%FG=ˤ.{+Yl#{}IK\ Hzk,Ƣꭱ֨7&L"XMޡO @0x ކ L,C\bkcb]]୬m6s; j7B Se4#hբV-hբV-Ѫ1)y,eD!\rmTFxriA@Ǩ FS /{2[.J.~Bбq@h|Q/EU4ν%yJF٤tJ^*D2riB)PZHF6-1:C -?XY\p~R&]FeCb B$ NRUEsR6{ `b ilp#d1Zku!aܑqRnL+Q=0sיs}eЮPXU$_|Ya D(=R lI"wWHyPj/|jAÐ8ĥ =Syt'GߘCk'MD@jIێw΄3fqꖘo~o)_?}V~RG/{&2ǿo^?#~$MX駋+: ӫdDǍ[I,vhBx) vc$ MJf4ܒ,}FӒnu.3 Q.*U/Ƀ5E.S>`w'/Ϳs;GɄghlK݋wNoQZh,G)Qtj8}D0@ :PITRX|ۓS60 8Vf[7=5c,uངi.S(oK(oK/0ru%3O1rٔ pSGL!W2d19~x[w $h6Բ5l]>+GJ<%փӎFУN uуV,K1 U9Hٔ 0n-mtKzV@0b`emb%np2-܉!rk{'5I .)#ʻ6,ڹ~o }k*-+aS͍j]?X1`imVj(:s.J{ kV$Eܔ(,SJT vrJ0VM*%Grw=3X.ՍVDkNhCOPC_D֒-*y#cđfFLHaJFɖ <r$&u"cgwUL"B0‰b4EhAwKCQ9h41F,|@qt$#@^̭I DZL!>O>N1``Cǥ^z}^aϺ[4=b 1+F*Tnls%=Di5EI +o Ȉ^3:Vth L'$ *HE,zsaP8GR\)c "W>\dq \RF≞TOGLF3"Ɗ nfNA_lN PB`4\6ZѨ Rl.HQ䠓 JNri2wH̄4>i4U#HCsStafW*a/JB}*2~Ϯ Yjb<_JR+U:]t?6.u\KBhAˎ6M+h>V$8$P(ÍcI+aXZ:I5ӆ/0E dKy{,֐MTs4 srMoT'HKS,xve;QJA|9٥Gz~۪VXBrz!\x(>)KWU .LڔQ3eJau8Bc^]E7#[x+&%Mke+AMU M9L"_tx r5uLNJ}6sIhF4wU""4$=VVMj I܁֥nC]OTd Mv:b@ G\h3Z A@(0^z8VLji=ߦHbVk=֮ZZ(u"ҕVZX5Pa{زrRarBH Yca(tZ 3C xd,Jp#28ΨQ}#%S s^T_TFUR 56ٵ&2*"FykjRvD0Gk|u~/;*hCԡnFv:yۥ/$4 Orp{w BZ9lfjk]ibQOTiꚜTCK9){N ,*Wi%HS0E2/=627 _?ݸ<yS:?YlH6ʹݕ0OiO;&\2ȓyÅ# 
Jgۻ\%+”KW7*srC}*#zɿzy`v7}&֑igL!FΡ1̵0Ŭ׾6U`RɌe*F^G<%ŞqzG|ĆР'>p龘w{3V'?yH}+%WOjk聴d2LN^KF:Bg\+RگO|G%Uҽ; Ru޺l=:܅BTRw <<|-z46XgboHi͐p.ۼߟJQ>M!n]BO0o$^#+D>Ƃ FNK* V#wVm>'5[3҆J UDڂ'rɭ qOݐΩ\ѠS<\kƏ-f-{R'g`{Iϭ|{)}j'U'iC|9oėg*,r1Hs]f T-buuVoD\H!@ D99;ܡQ>9XrүMÖz c=1vnp%Kwd/5~c (-v23A"odaci HRvi;:;7 8=;[HNz!jZkېl{řҭȩÔrR`QxǘCeeE|%tA.&[y|RFmE;`V^e:@@z Km.#1]?`F&:)D}$`A -BC H0E,_ś+"ZjybDKyˡ8>k+y %roM7jY4G 90|FX_#aH2MG1*܋NɑˮPx-]ͪsO}j>}6z޽YP:hp32jޜSbhwﯖUc*"6rGϼ#/f=|K&ONi-Vk+? Sݛ,n0ۂ4Tgp (ѐd|TeG:i|{?"q~ )׏wb|8-;"cIww..ۋL`>6~uMoOF]OySw>w3kg@kϗ^C*cN(܃˽gEw _v&#!>{wPMKs2/U?޻32dZANy{Cx6Ͷ͙|Z.&8!7 ׸oH`fZw<ûM&(_ ~%G١GoEz}Fz[?C 58jȭއp$ǍÉׇF)i۶0uΝfwvu\+b^*9{~[٥{[ 8Cw*Vkj󲚌o5ؔrߠkCk4Y)-jA&2XPwʝmO Fe2&J8sPpt3`D8( Q0v- ,Df(vK$ ہέz׿jry` C32$C*x*cx`V_ =(C509 &L’e$HҲ"U)(מqRA XB/AN}C`{$ݼd!m$RIٻ'8=CR3C%Ʈ%"`lwu E!%t>nu Ա>5cs=O}~g`?Rdx  h0y//ވ /«a6\-< G"ut}f 'ƅ 3f~ Ogp:~\A~T΋\Fݳie}~L&cp Fߏ~$ǟBG!ғ/0+r^% y"D0"vk\3U !\DȔ@{n_M5ŷw9AwE. ynPny@nT"$"Trǯ7Qx{ێ~x|삑lLoz? G?NDDdf˹"=?x?C.`s%ӷBH.Pb02wTXYcr}KaPw ]qbW>BXK`73s7R_w Mx50m8G`Jza`O$HXfx{-#x3@|4dby@ m`q%{vP6$`mN]#T MHKT|^1~lu .R`WV@|cT?XNOt+͏:'&߮r@TJAZ\*K]5Xe \#٥/]m7ӑ"pR;HHwHç8+z[-g;1AcRr鼊?/qRNȖ"muWxKZ:Wa$ rPtʱDCRP-ʕ)J0*Ce0YΜp]埅Iao$OR(/-4(g&7+9ZuCUUq 2#mB<f3Ƿ >0H/l\*Iw$Ԝ0&Sj\OY3F8x ƒȧIam3 d& fr\Jˏ?B^ Wty͉j@5\E?HB>B%sZ)T`2qnk=MN=g%'JȕމS Gz܏^17c({=Y;Y2}X]GV9*K,ВtĂHe%{l#jB!%; $EWK >!3k|o&i`F5MhEqo0~s󆳟 }JqZl4wEˠU-A;iK^JchR 7#dwKȚpx~KFE)dRp;FJI5)WE^4  xH0M,DC` HD5(']]ޚfF3K9iu$1Wi ^ .%!9e>=,o\@ <8~1(n/h j%~ѳsI"r˪R&d>)5T]">Rי[/,!Sri.ۣlDsoq,G:kG"KyA{:DZ|2 Ŗ}\kaZwuָXn:b54_gvNyz/ӻ$N묇&7.kNau=^/|qߦV5!׀ŞK]~C&"9F)f]s PO%*-s3.P%??8>eD8Iys [CuHG9y`deK)jHplTH.w,"zf,qB{.  
0A)c .7zG/Zxs_Uũq5c K>tMn)iU'e0@c>>SFʎUEr%͵Ҫ7=Kwנ"{ :_[#%R jnoT>1z1\vJ<?jmma +Y{YuI0 9nw<ό,5x1*vU49~;0 lӪNL)tZuBfI|ؘXw s3* {n;b' +M^5Ċ4ȓ`I?K< P 4MU?/tl8L ou5w紐NK U:fgT9]/;G]W`o$XFy^E7X ifֱboJM.VB|gDt\ޙqew.]~_̤ n}íW:o&.4jyCS@9nȟ։K9A-VGk)}~pfnc$ e @hJQ呏_W_(-۰+M妎<k"ӯe,z Xg\'㗰NE}kX%%dO#v(s{"I%9?m6glN&:,$&GAF Eq91Hߖl2L:zu 9X8S FHCEX%B:/Xā J ,vMWBT%̣3:lv޴2R/̙$UaI<שּׂ9y!t9p7ީh'if+QV`R]C->G>^o25ZI%i10§po+q3 [u?ı]zumECv}.Y¬f8[2sJ8G}*'Zt(hv*.Qy HBnЭt}Qȹ'^_9-x 8k{q[%J)="UCL |>:m_PXސ|QFLr#x^{B/nǴ/_h~XSxD[jqjE uι.ZnłϏc`#I`Q8mXdre;nELc5NᠸU8 4C[N?RD=[3?P ;,+ΑEIݿ?ShPpM4{9p3kg՟dcUcy,W1媜Dzx 21ȂIcJƁ cXD++BPT-]qWv>`j=A(FsfY1`vs{-%1Z cJdGbx11WUǫU1RN*F:#ؠ"#T9nL`V77|o1f ޾-_xآ;>6_Wb||,^)sMX"4GI;q+#unf,Mj$M_x5\y_a,*%#lڅԢHػƍ$v'ߏ|@.~`ʒcɳ_5%[D$dǢ_UףYI,dji۫w[A=5L]\{!]X>VWΧp5Y< +>V)#],5썦| fWo=L^$oшL뫋yp)_,q!ǫ h%rdn`5,7VvlqN6{ɒԄn]i2ژW~xVB! a"ߊx4(`@Q~/ h2yV./[}-#l F{:K|٭aEǀcrmjGG՚ݔ^`b`i"%zsFv{cnwY'W1͢Q-,/~ ra󽱎w[ڰss5+y;@N_=JG$.G2(OU-XOu%$$C"f;VQ?:T tA̒Q|$J\ne9ϔĺEIZ22J\8zp԰;.σ t./> ̢ճ5XJ)ۣ_>W-Yׯ}iy/64ZAD*?|+N :!Qt}- >#<)ӗB_'@g߾\mtOER bBjoOoa5F237~ EDҽQAIF˸,6{^iaUK"H_+Ω;^_|_Vޔ*L&6n rr} & aXzGm.0|eX@;n&jƄuo8.mR.]i7ſTWS[%Db :P05uE@JǪ"9Eهj$$0 2m$ȴ@sp<Fl"Zh` fBw6jv|AVjݴKwwK&I&18. 
HTU-:y'wSj]~Xi5B~B-VQ?'[pb,۫Ofvgo6IGӵ{o6nx<s$R#E(  q圈r!p#R) Uzj7oDDZPduXo~4+Xs`_ Y X)lK)Y.5Jˁ*{eAŐbTXR~Y5VP%5v]6t 5 f$Ðk%eJ؛eؐ-LU7MT(ÛLչjOR˵wV†!0F52^7"tc L0I+삧dY z؁kBLP c1r1߂XƁQxzB3+whTSiu;Qχj1*RT8K1X,c(X/Ɛ$vT34ZLEGy⋪0L@8`4,1z`Rjׁ1A21Jhx@$Ve~4^r96i.wstX8\uT}NAƼMFמjjཡ+$.wjsUY%*9Rۖ.hqǶVWDqlcM)_J!ԊSG~w@.xoezdc^& ;=u0uc Z'>,[N;6Ǿ)1w}6*`o:"E$AyB8``kdFiМ\A@6$ Lxύyi/48 *qq G:TM9)Ib7Z2XHXuV/5^W`35} QBReV/䚀2*fV 0Au/h:~Z31:n᭺):A@VS5f\v??ב ;NUp*%/>cB`vգM PH JmߌȚPkF䜜 ]`mCݠ,LK})r|KA4aÇ:l0B==]r1c*Szb[DcؖngfN֠FG1f%c}3Ε;fh fDS=|xV;T-jA-\| ]ui-Gƭ(W έ- G(?MGqJ񀝭Z{A;[dt٪G)DŽj{wޑZaĿ=Y=sK!&hB6 iBր#ߪg3ن]i9?={Y58yA{.+aq!BtY,FbUFFbjűTsd"H-g$uc^JG p:z5J˖57 zz㕐 ,mnAQ)}Tྛ2aY'r7YyͧbޮpS.{WÛ;؆`e9TϙS)G1LO՟ tI6k7 bܛuUʹwf6YE Wpŧe)$gSos@>SUEVi%~(P(>M:/t$fU&,lxfjՔAq28(wa4[ @$&!&p~IqT]woó$l MDYĵu[tZI4 j'1C:pE <n ҥ2(g2`.u^9"FtȠs{]ƺG:ξs  ]R93:"Ecz!ĥݛ 岭8 nHymPXdOO:W??uM)-W7-ўCg*˹*4Ni?^"jfhA-{TTkTWj,|{xӚ37uMwuȈQ$S;(:c+":2CKmunVT5Q^f ~gՂ3FEĀ"2٬/+5-2k_ѽccuVz-O-Do[ 2>mt_S+ڀqMzvJC/Bp_>تֶ[x zz8=XQdZ1~z TE_=Xk Z8e߾tB/_]8կ ,&⯁> #9_ ޵~NCwQ|uΎMhsb' Vn=2؝dEi|*.\#fq=,iKg+J ǁROK a_;[ᦎZ(ΗKGd.? d,$bu]G +ŃOٝ]vl*\|y-mi3^Ѳ\|4Y:dX*E;yW<ֱ ЪUw^b굗0B*N;iؐ0ا]HiQG2"G%p H\4(,A8J !yD ƴ iFD8PH.7荐4bRd/N()  Ȧ| 45 fvm dHL.U>dF1-@~&xX!q֯t$0c4hIӭMdJ̀'PC~_3Ν3N]| 9cJEǗo# ㇗@=@B̧A@~7g拍q}*dN8J'cӻ?aJyXo`<\yR`~ ?8)fngbL=!Dy~ ]ުzAi"#$!Ŧ)ϊb3\gm("bfCo.?,[k3)c=F3DC^2/&O"}x#-͆;D#-[u|g;Q@O,&޿AVHu^r^m!ezŦ*z+Q@y>mB+z\V5t8_uPSLNK u@׺aE],}.{dûDdg6ī+)Rȥ$WCDJT8* .F9j),1Rw@0 6 SK .#QRZr5?oHK!F)&Alb}6#Id{- 'qp[>KhiNOY? nFai? 
?q_^%G/Q" #=q՟|$9_ g~W[0t֋UrW|N~+Vl>X$az֦BG(roBqM)*?{7E9!xT bL'M#@{n'"[6K .ӚclQRl-Pҷ@2-2z} u:%C˪#Kd rKK^Bn;XatUU_V ELBuAd{> G m) OӤW:}JfR^fH5H $VZtNr'=}k,!TƄҙgya0B_ +Uܸ5P>;$.D|AO&jq%*v9&6&rq}ĕCbtTq0鐵/YG,iqemǞSo|C|/W2GͫW"cP5Hy.c[ sA* /ӼYM; Q*BMh i?͍H cDV2q56s5rS3a*P ,dz`҂)8d1*Fv5 )c(4sǝ2 aBhRA2GDZ&(k"j^J@; n I@yzu% S_$J5 *پ·0m&kۨXd؀{52<>0GeOSIAA8jIIy QЖ LRDvXS7*HbIЧ(6z, 5 s,e&.ةjF:}_2'+Qbn_FơQI;ه9rXzgRs.(Trp^FT<z2߫M_9{g\:aw9^3mt_ ̈; :2Of be29~pQWy3w7ϗ_dswk7bay?7_[ޢ;4F!\?0ЃV8PL(aFa"GV[f%@4B!p0v IJ8hCx1AT~uT; ܐ6cOs!=oyd~s+/|yȇ|yȇr©Wyn\{_{s~Qi-XhguƷ|ieW5P#Eȓɡ2A "NA|iPu5Dn9iWn 2QO@P)n6uFOjlÜ@# "*Qڶn& Bޒ@'aW@;>.z9DkCBr85.*O>Map,@Mfϫûl. MON7oȼ|~E-vxV cZ֡Xѱ]M&NSs!LDJDtF)DX%׬rCJᢙ.8B,t2{t33cNI.SFeR$ "}@P~PZML+JnP`9IJD8$ښ :u_ɳƣ" 0C )XYM;QQMsV*ި?E&LbY_cnPގ/5`ٰ Sw`TrΗZb"@=^">oђ@jQ"Dj>Pʗ-٦M²gEWl쳩 ?Lfa/m>#d9i&A Çu"ɯ[&>c-yx󉛯~!:A~>a;x9V;KrV|N~]5}6pzεM)5=EI-Y7n)6wJ!xT bL'M)"tk]7һa!߸b FjnĘN3x%N"yHֆ|&bSg?eR11g4nhyrzneؑޭ M4ɦ$*n!򖣹h ,~e5^j(5Jo6F6)?]YoI+_f砜!@ K3Xlϴ1n<4̬,Iy}#W񒪘Ya6%_DƑ~/NsT8'$ [+$/ U{wޫ=7 ~]L%[xBWGCY/*"}H֗£JnPfJe66Vf`+=(mBIh(SI{9ZX$8>*mp+J&.S+=ܳD 3V.x[<:~pT Rlja o17*(< ®KǨA:EX컶ŨL:)h? K"9v(Ahwʉs(hqPp$;8mҨD{~pݏwe6urvU"t2A(YL^y{dn; rmpFi7M)viocr`g_8U uǹ?氓;mVsk. 
!dٚDhv()ɹ=]t3J7Fiu4cїQ_o.WLg~,n*m`folaE:UKZ&Y:E@q#ABƺذՔ\72NHhV=`*2G :촉1r $ 6*35-ܶ#(DF"Shn̊>Pgw#lv|7ENH2^-VO[u[[r`ҒkFQ1kՍꠓeHAq`鷰̾s$DWɣ48^Nq(˩Eb PC[ݻ3ʳu# hJ-reCP "kʋ@\򆝕Zq>eFȌXMPb!+) ]dc򗈄sI,ydǫ !C5*s9d9CpmV ]> &חQ_oVzVoTW+4*h5bbhԸ/MRƊOJ;ݥVԼMDs՗Zw Vh\#jTsvÕ&-5 HB/͍w!8A,ka}$ouL.:/J %<_׶˒22},w6fYv]Խ\q@`l# 2E#_!6~?#o 9nl=3cdo!5HAR69ӜwUWW$w$B2ݒJtdzgHT ~qU4|U̴UQ;LK!:awZc*zU%J]iTFLb\HJ-.}0h9b;[-dH)KA(4V ÎD06p"!$bbB8!N7EGv0թ# r*"#̲ _p3*$h--'rb:btSaYWK+-WuF*@$Zߍ: 0|*KwVGfK˛(g[#:lt^G: u(3xNLa)hF%4ChߝCn$6V&';aSB8ɧ7B2sE7,N 2LܶQSێ5mb0VTjva:B?#%nגp Rq|BDe42F0&53'D14rEʧҠ~\ǟD f_Dڋ{p3dkzVG`''#|߭ 0Pc:?p6ͦN O&?URsksB34Gql8{>/?2kGe55zUǢw;>#` dkNfiV9i `(,`osU|FUhyqUɑuˣe΋@V5/t* \8zט߉U ޲@H^X|6qW UT*B L ō5erA$Ww&• mUmBj/uV Bqk묦ViTF1A*HJab$vy`@ F}٢<^)<[C.j|' L`_2ggggErOǽascHfL2.٨9pGQ$iD2&–N-h}8%-./Cb+Kz]=~ye&T3r)#qK9 ʺXx"O̬?)u3e~oL&()Vx]^Q!usitDG h,sG 1mRh,%RDnIbhjQ# ~}$5Mљՙhq.I>E"JAg|t+eJeRļ~bca)$flD ӱMo]O#h`AO7O@'Ӝ$/g>hD.$h!7鳼~#0ly}/ ?3C`Z_v'd+a_}`oƥu^]P{1Wu3&T;h3â_oF*%ss牚*dgBv~l8-(ѝ&?>Hh; bsZO$ |(GzTTǣ R;*|4)%i#iBG=U\𨟂p/|eďzM$efd 1w/U&fR}bɠT)C8SЂmZ[8R+|O})GIvisjH6;0WЙ!/Ug6O ȃ؝αfwq2lgD Zk)NtGG(Iƣ/O &]I|T\?i "v>$cvr8AZ`gV[oJ({\3%r$QHZ5R*N11\i1yU9<+$e\)*kck2%o칥AA]l 2M 08#.]zh[%1V2h|FiG{/Ơwxw^3vϼ=jvHK4!b(Ɗ"jN$X&\!јRMhyju^BFT9y1)똖jZP䬑y8&٬fɯ!g? is K>~Zt~2Jp6J4鷷&E`ҠF0rܞF'x)P$g7o O$,H)^CLɋAs.:6\|&UI# x=Xtfyb1T n>npAłu|he~V s|'pMtWG6^ihr-RRmP ҠղRݣ> j.HTT]>oZ9X|&\R 2X1ƉU(!dD2qJ O(cXX n+KW@fjbQBBw8u/*lk~[5G*MM 5笐ٯ̈́3P@7ܧ~`fB2 \Gsc8\lb@cq@)3Ye؏48\b^ ٦9 4J $M^ۤS oLd&BU~5N6,hhf\i 74 ܑ*K'"ouء07 Le^{nY WLe$Q:qOOdQ auk#7qSz5Ǩ%tQhvЬXgXܬ/+[/ nWJL8uX=>_sZVQ! 
H0]k1.+R`b F՚xQ OD"5ߨ5iEgj`w0E|$!ucsPj iF PM)^BVR{ &!ݍ"E$V!FjoE\(f%KA0#e%61'$vK L3!iLm[S2VȱftNO\#fvkɌHGPݜ\QAB)C~v2l^-~5P+[CvE4ڒca\QD}LUUhqTa*?*h㨮_#32blP,|X6zT\FCĬnߊHq;*HMGp;qV Fݠ"D5TN a;7e-K3Ŝ&ej  U-Px~X]WZsi5kU6%U m<̵ɗZȀڝCAހZ.66x\'mmS 1>KO]t8[ ګyWee2/hޮx2xCѩ@S]ht԰f^T,wY:nւ ?[lCsS wNj7wAA}GGKb[ ڭ YOQ~= cv֩2mQДɝi[ OnqcF7gY=+<;=[g6j ǣϹ[~٨dQLϏA^tg2GW'';Da%4G0-4/X:0BM"BHF \i\ͮ-Dh4~ FB^pH0(F80|0C}"]o$WKC  9ZVR4|W왑8/lc;k3jUbU+B8*hlj rFIP2N@xz[١C{mNO Dl-H#)nQMYlm+Jh!' \* $ۮeAPY1mc[C&oĈs,PՖR[`X3-}ZIL " B k 5<7 È-پs \F i= LA: da-GSGfiFK\0 O3 q;*k<*8ROUptW ӣKq?=Up$H2j8Y1Ha&HwrӘG  Q㑚{j!<XJcPs8tLN7Dwl'ϡl Lxȃ䶤DlwZ T7cݩV b5^M= UkK{[1wǵ}+bj RsygT鬪i0t*` ؇|*S;M% FurߑbݦPѬ[~`uCCp)ܵn<ȉ#Q9H*!7{e=hNs+@*mxeqWϕqČ37y 86.?曤%.:ͧ͛牽!>T>y܍Ю_7F?o|މxzl>>%.MX=ߡ cth {D\Q N[Dn"y ;"$8L- ̔B2=b R9KKNJKTI 磧Oz5^F$=:P :!7w HI0t͞^Qf'lD`}ŧc}Ub̸Z* \wsw~w7W+_/z(e9h@,rnyeuc(4}$nj&F(jܡhʫqjޫP5WTw7sk6VJ+K8/-h`J.q++¸cRV|w?wɯ І7Ar_t>[","fnpc"pYg6Kp)6oX\&nV { PH4)׋f&HE,%h<y\ehU6ΐRԸ>絢 jY5R*Kf`jfV67HFPR2-D bV7e w%#VR2ߢѲV2|a`6H6B*'~x5&@QHGPRK9XȫQɚJNqeCp[D7DJV&եqlTRohTÌ݇ LZ I:=&P@0[BhI@Y[3d&PExڃJ@Yǃ=i=DLGhoP31[5rd@Ș'Τ9p*(/d3|fo͇A>.#WH>>J'sL ]/.r}sX8?>ϼQ9s0\`d0$ă|T+`GLa@>|4ѥ^@>$~pqd/s<)&?eP3"gsۘe EH :,1GIPA=qs8E" @|q/y8X1yw?:Si.ϣ~W6\vvZ9]ѩ/VBѡ@>WD1I+8SYm ^.Px2ߑ5%>"PT Ճ|*SNc݀SBg 6Iy_(Sǃ0D0ACp)Nܵn<к Furߑbݦ̫{;ݜ;.`Q?⩽ئ'nY"9"l)94K 'jJtĮXx${f )T_ -7l WZPO((_zA总j%RX eh1?]S<ܔ>`Mٛ v^}?p]G]~}6U75[TwY }fu&i)=V݊'J "%9 |K'ƕftٗZ*C z$Ԯ:H5$nCPaQ!FpH7ӫbt2 TԔ2?؊BSIزJ.:Xma|dcӜd(nlZI@)J*Z%X1QWJnU]26%sZ0Ck'ᔐY`TL\vn둈d(0{ OEp4.0Zn^vgcLZ>83c놖dR7$Q9xL0uT$1y 98{.iL P+Fb5 O!,\Ŗ{=ɅpǷBTh_wd?-D(_:vGe$7$R?Xs5"ZE ďiABk΂R +(ӕp!EJ4]?ǹuXcJ Qࢇ6VSKZKA5?'Qi$BKT[ŧim?r_^,盏=">`.?kSJaf5[_V l<.8 8ҳ$w- qTA?zoL>#WvD{=k?P&k~M &{魓ڟٚ{m9<-(YfrRй)ӵr5F"~ۃ9C4K'V4&_bMٌ$W?\AzH \Ja&$T3ɘbik> DӰxfTb IF D:mq qu-U儛+.0ύs|^; \2j zNja7u #g Wf4:-70pI6d#$ :xE2oF2bt2& p_ihxZYˆ*)KIJh]+ZRtIQG  aTPōbacQJ-cHSԖזdK!|ACmLMLJ 0Xi:Ui9F.;HD2Qh=:[?.ZNØ`ۍKӷ, / ;0O0L* #4%a$IT\{ƨ%% #9FQ?w%A0g39o9(Q?Ԯf4 =6;G53Z!e$> 0=a87Zعht9/sjm~|F{0Dd]ba5)Mer@=$ABGi0KP)(X)OMX{3]i[qK(N& 
AUZ9"@>mYe%_qQԢSdU_ej_{QGrWs3 vjnʻ.2dFC7:+QWrl4dKG&Qx;  vp⽖e>[tY{Gex.~?mӷεz|$yV/{5qQY-F oK&r홂$HΥm*񿒯/mSX %ڌ7Q|+"Ob A E٪B>hiȆU_$]W-~Z-[1/VOFWkuݟ]B2.7mW0=MP]pmdpM[ܾ׸\-WwmZm~yo0#T]W3Gh`9O 72f>\_X~^32xB}uSF R[7Tk{Su0cU<-Exˡ8fld{3=ۑU>G6x'7U Z#uH7Њ 2^S "-qK!#uRX;Q,"xD] kܡ(gj;!ܡW?58㗱eQv4zѥУ$j/AEGSD1(Ch$2'~@pb̛[վ|lv*ur \Hlb#FXp'2ȑٓi0x6ok޾R5:yWZNJ, ;Lb$ ݩd7^Tsz,N]b-BD1uA{w[xJogussn~n%up?>T.< VtO#x=7͢ }lH(St_m7e]spB[Ani?V^߳ \s{Ґ/\Et*@R8mo׺Z\ĨN;Rt 9C[Hև|*S=º Furߑbݦ~Hy-?A?lPK+$k7>{\R[]|: }6{bDZϥA*? ;TҺF{|.sq0F8:]"l2x#(1ow$1dT a˦xq8Zբgi6w6ߟ Y ~mNhrF 'e)jk*g,UZ򕝭o @5g1Ko߫QmOm5rlq5%7Y\N'K1zI.m7 h+6JBo74iQX( \c0f3#;`y آYqq!P0'[ lN#}/g0>ܠ;UA~&8k1" A'a[J$ҕQU1 MOCyJf£\E^({4B$,20A3PkugE9=@A(Ca%ۦZRj t O>Jݎ{RʬssғR4eVcKrjlCPVzVJq 5Wl`N++j+瓶RYX%Wx$znkiwUm6 TV|G{UjTu*t;$~n9 ;?p^N`D@O7]H*PXYOW9٩=Zi 0J>.Hpş{yk4YGt$Ds`lY7tctӌCȏzn`[mZQlݐ>$˵魞]}}.ConHj6w9xkrSrO{|[>mՇ*g"~G7EfdT.9F7 l\~]Ͽ-aINz/ЛuWJVژ޷1nwܻ ?۠Up TUmL1j-:X7u&re~5d?]/jBU@g[dB!$3&pm+Peo+z"j8x"il$!`"XL"ؚM'uQvB^a$&}"f1>WGp.if`q(<T_nSo>a(RCNbfHafqyNi@OW:_ť |]:^4t|E(4L/ttLSph֜h b!rVH*|geOG8VO$X,85:y*TDbҊp&$d̰W5u1$+ZU& -3Nqy YP!K\R! &m GMu,;)/FIi'u\Y)׵*JrjT VNzjg+=i+8eUK8v5(qV s>i/Vlh`ʬtV JѴڼJm?7"YiC`pKOJT۞/GaBYiCT5+1 LW'JlB^YF5X6}.>]6Ւxxz2m{χiJ9[G5l ֍h``|`NJdwS?`DVK+IVȸ5 fV{,f-VeX ` $L ;먟TnzFC&&yn`ʛ:>Ϸ}m3Jz!xx V`!]5bB,H%^#~hlǷ G%דՍfh*f f.lX\Q( C/D)#kH:Ƣt0! ]0Df#ʪO:xu:6s]K+ף/§[ XpŁ|%]3ŗѿncyLZUj?X%nk|:{]\3%>.ahC.kBk@£ Ad8W9[IOjv͗%?7E;Cy(&^'I%b{BLsL(ˢŸy=xAXv6r!qQ",`į@D֝Q@B$f(DidDnX\^#,65²hp)rE(NHY#,'],]QsINBla9!H5Mo*I蜳k&CX#X\#,zL@eII"tiLtNr*tIəg%3DXBe=nA/;XH)TN3Y^VdOR"rhI n $ERb{}CBXr~I]rMuvI]RH!,T sntV* oȄ9 +}GT>-NJpǧfǗRPg+=e+5Ya.G`a:Ggi[)2+[ޜVځjJR` <,#CWQ3k8yYZ :7o+k-rnx󻍓gܦF}ci. 
HWO.zAoZЛhƒ`"f6Mgb5Q@ont$:'qk>69׏?:)EbAJHUxX9@{ͮt^O9먗㇆ޜuK7BvӌC;Ͼ@!513Y[- l$[ĭ"A(ChL9eC(h !c#-bq5[M46nm"ɹ)"&X1:7>qǔt]:W=5ٝE NӔ6ksێDłۯWv`Puml/yOHo!WF߿'R)\Z"cjnu'Ruҡڵf C 0Lx ;K7!o?D`1oݰծ5ˣԢ J uU0cFvIJ0'bΞ=`ծ%3vLAa"@`\ڙ\2 .:xN*eb, !GǠR'frv2$v3nFtIwcok[uȻWn?*?ji_O{Mlf;ˢBfZ = ޱ{6Vfy:;F_["jzkZW˫x\Xo>fF&Hb@XU v.pY^[̽f"pB_yYf/6rBTD]`^wtyr[=x h;yч@>Eyt C=%O҃[5P&{.d9 tBnI: &#>w^NbZ>hHEYmkh)C"F>XZP8 0]2On_EWh1%*ۼ+ݳw`Cj$eʠ`5msA8sT d:MNbJפ]"Ҙk3W *ۘncP}]g+G͛啽~t)a_ULj;5mI:"Ƒ1/ +F<$(=:ӒpGGEi H 0~I 𽣛,\& {"5 1dIE\H$i @$<14D\a= %u}ccenI-k–ԫؼ[K͞s!Vw漘:/H^I>ﴊ$%sdV.hI}I9h>`ڐB] lxwMjAV4a1 hU]]V]4FIz:ω>įg?1叿OCDhq8( !đL)욜P*z%PCvJO9+WuW~^x?s7*+nWw${U}xy7 _-ƌ@ IR6wbc{P\^4m{rNhÉLxPv2 {h7; +ڙ.͎R'C:p/s|"'->ҩK؋n[@J)ji4s>{`3(; U*پQ=,9"x|nnI'y@mћs0Y((*H+aQ>F% 2]b c77"NEs:qh3| WE*UWa9cC4I_ %'BUGG$+)~g6$_?8a? 엺8(~Iz98w^XX?~tHYvYm0J)L BqQBpak{pYn"\{6/d,q4"Cp2|f,&cٻHr#W zY`U# yŀcF;I*y|}ɪuId2Zݭ ~ Ɉ/LQ'&r݋Sl1Lm>14ÂJ>F+ d[ G&D4y 1x2A/}H7qcfU3ʩUS+@pjQf  wf(9}Xt\._*-b h !hqbDP^#hxPJ eKq-?*I D;.%i U+kW+[@|JM}P8arK&@O,pybQ<} S$.*귕tda]A6_߮|j$mcZOm-!Wr8&nSTś?>quWUܴZ_~B\^D}ٔ]q9sf'֋m_>!njaLzj¼sYi6 4hߖ/'uuy+fg5s1jvu;PHhN?s]68ߠo^OND @ эPCq@`sDNW:|J8 8 P "sYzN4 ]ZTWZ\~1yJ-`zxbm]{+{bN;|Z<.FOUVb8P1ys*Fz'9sdr>fl8* 8Yj:y6,TsTܔ" pӟiRunyi|)IO%8L_(M"n5q(b@#v9*D'|O:U$R8‘PwF! .+ÐF)ZRkl":K͉R-"wyI}kKK-S#ɳ_Vk߮[=.g69|"KmxxȦˁVKw7nv{Jk6Wc ^Mxnb?F4rwŐh:uީ{|f߲^[7Ϻ!Q6;>|ac=SJ1ݏ~R妞' MʦZtR{7EX+YkBVA~ĻM}Ev C{zLև|&ZcSiĈлVĄޭөFwpl%E ꙛ2[hgB*PЉ&3i[@,H|<ckK(3h,3)n< RI5r>^XNjl@4 7jW {BIhPH㬊˃!iѧ3j{aO#2L QR<4wkZϧ7 @-_TY#oGC%!6ʆ:H_7o∊ @޿#s L3(8|g)T[]cJ9֍cpBe暔ɮNT 54ԣZ?Lr6UM9}w39Q.x\_ UU*;ڵ2r=3E&hV$ev'VAZe=#3 ~*9P |BjKhVE\;lBӣk=Zh4>ݪR-W c~dmHZhHR]4J)~}ݗZ3I!  o$NSSOM- >LBBqV;s]l?Mj11ox cSFngޭ MʦMϢ<oMi?^QunQhg W\wZ@VKBr[P`QMq`.qex3(49*wK@Z X{h$KZ XepD%-ZΫԮ֕0T J Φn݆\&ݬ/XgsxI)}/\4swjxlq ~\S.f݈[d![5bP+wwS; (NZK~:De0G:z/M#7&gxʔ} VlR.%|ꨛ`&M'݀Zwh . ?)nJRM~S,c"Pi931P84 B;Ť@4cPNf%E|x&|1@Ap #T5͔sT(ʴ MHE #sE,EX =]p 8^B' ~_ k7 _OtA)FJЯ5/RE0~JDOP)15Kbu! Lu&]9k2]λʅDd{N}1)ESg-҂#x9/$. 
tB(Wʿۚ<׸z8o#F/EehبF(b}Rfˮ f^ng׷%$Y'գ጗` WR rxBP p(_y42 stXkܑ$B_Oyé49xPB>m~X4g: v܌(ٗ_Bx ?ΝqA{4?5+>UOv|js9utqIF=qE?{\?kדV/_( K%J)JV~zGvW?İ)]P=.l=waܹ_ՀZ_\`i=MGBj^LS4iA94@ jq#P\}V?j5s1jvuA30)]t,Kԁ:DP .%[^Q`pX=a#I3-' R>h4{,4nBN7ȡx,WPŐ4h^K^LRt\v/3hAyzAYfn%2s HʠudI>Š{Xl A[=qziPtYeЪ_ 8آ4eОXd ,{A5(5̭iUƣsŒVAzmfBfBazn(EGHy4Eä/jfRplB Ѥmn_<H-DAhOG.îq.}Ij7pYOr}H@STmY*:٨*Gӡ[\Z} Rb#~| OU A@Noe=7|zFeH:&_ Ai(e?8tҍvn,dLX~ l0=+ 8']#ҦsNH<tCS1^oNeLBmݻ,d_52K5ScKu!=jU ?, p!fbqG 0<42tEƉh&Q-@0ڡh FsLT<ƒWa w?ܷ! Xh)vw5=G XC8Qr@eNTjpTCVZI1WX&6pW4u@xAr)9_+yۚ]xé?y/}9{iNו%]|veVƢ)z&O 6զV >׋&:SpTS[x+?aӚЋI?2S(PaIun{!S[<RV{ED@yU0]7=~+ץ: sr>+Ωmzyb[ϔLm뙲>UjGV S aǔ* alҼ>4o|\^J4&\KFR"934F7}ٚj[F 2R12YB5i!Rt;}*ؚe]j^%gzRS0S~unBw% cbL+֢E*_6ks:KͨD z d/gL1YGMɰi+? =ݨ{mf[nƿۻx?}e ? 9ezb \Ͳtrxcl*_7!]Yo$7+_d-#À_vW ,;⁐-=ݲ=ߗ*Y%<*JNÀ[Vvf\_xyz~|;ܔMd!scSFOx XԘon˻N%ټJ6MtMI:wS\Sn#ePG~Wv]Uڢn[ljs ,wnmN?VwX uȴԠA%dMhA?R DW]H3K5wwT嘮 Jt&@TZZEK?9he73 OUJ~+02@)CxF-SeyR:2% VN8@wI<LH tQrޙ5󰾞{s5aрh`J!v bhd ؐ5Xie<5ܮ4;52k BCs!".:#L:>֢A! C, <@%"k{ x&kV$#ʶuz)MKއh9xTZ@5K,8TR9P(T%rZϽ+T7"G>1 N$ hj ߐB~i"$ 1rTYGpLԚZ>\ KU l@AF"׬KO䚍):l>0LQ\VfM< PޜCj2?{t~}4/-UGXKٗ w߲d֮.1EUkqq;PWtl|j/FBUGDhYl'H2 p3vsvbukm[kU=\ڹ!cNU 6SN9b K8Ij"^B1.nsi=+,K*=EOzLćfN?u `*IB6FC )'V4٫ǀ8\Tg˞T%*ڠBZL\ʞ*˞z YxeOtHC@Q_AY҂>}ש1[A_[X1m2:ݪUmHh޴LRPGA"~M'yMm˔v~+)B1:Sxt36C%OLeŔY 9sIǜЮ׍%k.W_s?\}xS6hV7?e}|^yMidOŗ+l?(o7b/Uc;:w9߳vC۞vжt^m짫$~z|#Q}ԫ5,o()*ʟ, $dO(hت_z1K 0G {Tka_ÆʚX}v ᗭϡCeF%yҋRuV v< +}}"XE[uV*=,T:+]Sx\p+ʈO%vTT+^j.J2:Y *L>ՒruVZf'Txn+q큱6E7mӿpOLY[_WIJ4Uҫ\?y޶O(!]#{NAlaGTilǩ)&F0s[cǿ' Gi,O`DgON4 g#> $,⑋LSNOtʧWOa͵&HA4W7 LH<(:s~C/Ro*5mKy~:UlW)[ڳw2&N؉?ppER`򺡈)ӄ6FLl|NʇX#bQQm_^6׭&ͧD6~޻߸$J 4W&S ԶIJ:)uIGKNMl:Ӏ,L=T+&V&M~ 7hA)ulL%h)?7x%ǂ KL.Vig^/(z.V{KѽmuOiZ\ȺKc0??vn۹)nvoolx))Fd1b ւB?~>UZզ&v'ގWiW$(?}.qlt?f6`#`/FfnTWxU]<&M77%ܔxsӍ75iZ!ضQ6E]^ۄ©`lCekrFɱ߳4qXIS't$ eUh7.̲sN"< #J!셜l^3kYZb dd*3cNJ1%q%[el;tW5ۺ9H++ sg}E={FI_owLj;txj}(w%q4L EBcAݧ`uC` A h R;pEP J_͇9+_ 5P#s8R86`QބƠM!ed7= Kl#5l* 6 "@ZCHSb(,zq>l$*85ݫT_S-_.+W8 
+E5Ւhl+[^NitClL">:+]S-5.#.JLTI/DS;P~~5:I5$o@P+]h|7Db줽5y6d^E0|`w?+D-m\'o#*үbL&oo㗉y[P +t޲VhгDX^02DuZ=e փ&+}ĉ^:AR9nҹ"uMzzdKОzƣ =rƬ!|hÅXyzPwE{zaC$h=4+Y=9ìC/A.`^BDGE{80БsOdCR`HD2DNHM;p-pOK0eDM2ܫdr-_P˔[\0mtDFPh(uR, Bc 畧]điްsfsl Xd-ٰW* V+,x5 |(iE)i%T&(L 6Hw1e@ xbbd`b14lʜ [$ae7< aҕT1s^"o fNL ͊AVogrݶdᅀSDN Pf#D,#)]uRӧ\綴l ֕/>}~k l[k}x7M6(e]ms7+St{ߠ:݇KMvj_ضKFSWߏz$='[v7 <?!kQHaW|nWŗqr{u.Rjg;p?2U+O !g\ѷ^(bW?s')8U}Yp@_ |2OܻO"9O\ }i j5k}EojUd4*gPe*rȠrȒ8$Cwt;9K!8lS:~-0=.f.rQ>_]<|``?w2ZģXaWw:b2 >`G^Il{i,Sh 8ԟk˽fahg3}6wto(w7o=f?[\tACУz{J@cg&H3),p<9E*vٞmN:mKOO9ް=|(+~9#-'1\F;8S<=1P>B\#0x~9.Sd̜Qѣ1A1fLNwE#Y aH x8t6=&>j`_ W!DPQlQT'C>S,.'ow>(yֶͨ;PYev-~PuG/7kZi aY66k}̃w۱?0{l[ OpK?rGt Pk:T>7l'Mt'L}H}UGML| U4I;dǺDx'n<QoX wwڔۺahNɾ3F8w=eBʃIFuAZѬ[:B>к !߸)2|)s'> ?3GPPx{QrlzwRWM5o߯Rl:UGOmƦT/oֿŚF}sKaj"'1Dӕ-FBqf9zUm&86Vuy 06(ށ/ w&&ule|wF,"N0%#:VZP4H>PR+Ɉe}qI 4o@(C,AvC&83/lS^:c'5vE/B"9XbF aTqD+dJPHhShzx[sŹWӗp-rQ)$gyQ;e5sT1%\7^rSU>#M*TFj|Q?/>rճlzFx!r?ĸkWt#\[ug[YG _^u|oL=F7zJB-9Jk ) ɨʥJ` Uq^sLjd(שJ#DcZid_eZ4ZAؑ!gSΡsJL mnaD=Z ڐ1H%hD @>x%訛%"!P~8q\:ѳ9yu)?4|)գu5Y (`w߇Ѻ"X\7u*70"b9 DtQV9A!o؋=s ugDP "(ɢ:#FS,/0_9YJ:P* ra3<ܢ/aȬRXB@ j0eyo(&SyMdW4B 4t`:[V~T;OM\`¤^lJ,qLebU-T691(sEgDN= [ k\;2ga}^.'a*N~6'b%LV֜m,eL;%ɵ$P=J!P0H'_ EFi_&YL&%nR*|D`dfTmDT·6+Ѓ\b) Jfg,-(*fZV\PV庬򂕊6UAnU##}hDеj^mF-T V1Ձ^ čL@}1-NAm'Xi!kOX'T#DkOGsiݥQ y B"|%|t_ 95̔s'i*֪R%`%0ŠBȵ*O"s̉~76{'5+NKD(XIn ]#i6ѯ<3~Fy_pa܆q+ QJ2}~pLTf?-96{hv|&7Çɛ@D9Ռma"gtmpLR䣡'px7ˉ9WCogb0zyEop(z!cvDI24G,1mU`bswYl![uZ}xk\d`Tvzb-KLLAL" pw"91>"V*vxF57GI^]LwzzR(jdQԿNKY:AT\@/@fNHǾ`t| qӄ*x9HDؾZ1cdX o~U|oCi&m5)]}>%¥™:TGB$<$%s. 
8нzAt>CaHTԂ!Iv֍Dqo*\Xsy1gdm|ㅐ7mx#0؃3leTdF -ۊh$*3d g_k T^ 2gu!A $L?1 ƽMf%1c,*DS95VVX1fsN9@18 V1)Vrݠ@҂ &;ԑ!qU,?NLLTHɌvw XNBJ[c{[ yfF`40[: ``$ av[N˝KYr2D-ϑU֝]f<˹$e!ap!\݇ @eM7ˍ`x5_yx='ch#)lFaZr 2'rѻՃOOaT[O?Wk:b!JW4O.d TMD͂ɍR}tF95q"@zRQј̐4'fܵ}6CI!/+Z(-dɮKdW 䐓Iv d~ :@XtCSJh&|9~O0l0j X1\eqRw"!ӟj$WC#Ngso8YdϷTɟ".ޭm44SclNwwWw?}/72òlsOw<@1>R OuŗSV#ޯNL249X3]EM;zHwyѾ(4@hNi8ye׺`Ry:߈n:Pٱ[:~3к !߸&c݈wzNhRy:߈n^t8uK nCh7]ҁs^u3u>p63Ļ)Nԁu;tYC>ԭP;+ o=vfzqfN> R I$~[,@$pֱSeH gJvE&6x?E$I(! M^ Jv>P/B? IJO?1h-t#.5ۿszF[#*4_>Y̲ds *ܰ”PL7^rS>CM@ GbyQ?i/$4;˦l}O'lວ:l `Xpf_ǞP&x7!Mb [2( QKeV2}?L,:UY8,?ZFԕȌ$"3)J*N .A<+Ek 9` |D&@N9t.r5CSܫP61=.6L=C Fa wjS@R)@5ii)!O(&mTpQ?j}3fFmo T?t4aP6f9y(Cef:-f%fL2Q.m`_?S:'yu%X pz×֡aUإ)r(k31y;m(/Kn0^) ܏StS50∥(Ba3EY -UXQ[}Yd V!(#گ:fŦ/7 w˥w5lB)2y"Vbodhܩ MY8 ;<>nZ*ٽ39vAû?9owi$<.-dǿVsfv;O@G^Wdψ/*˿]L|[6{,P0ɼm#wIBJA(аzH{#NGu^*#wv誀rHe̹91(9N 1O (N~Nðo)c+Vdsm’ȸ Jgyn \3# \fQ-)Rn)ďTkPggu-  8ysw>Gm5j<wIhsTݦZ9ZTfT#>GmMf-9'o)kmFEKh,o!q=3I6/$1ۑg6?dK[-f_-y 2c.V},ɪ+^CRեQz(R^Xӽ@)4R:G)42[DKRI-:Q" XƥvgHCi)5c(=\RV W>7L H#I6C=J+ҞեEѰdq~(AcHRǦ1B8|'<|<†hʇink 4ŝ{ ˁde# Ikt9jۜB3h$C!X PLzEH59U^yu+SwڦWbQ @Т.{Iڨ]fa+K>6DQH?ߟR`p;|Oױ Y,}d%N~I)a]ZCy߉g2|̩| LЙ QLǥ^4BqZoi3 @<Ke&;`I 9mLe#+㕏ɍ>wc0Xr 9UM9Pl VfZtʆИ +!BcsNTLZTYuH"TQ7z˥Z2\H CZs'𹢂apzEAB5l릉M)@E#8,GɹJJDPFS'  'jUXWx^XQD@"^0&Mnl4n1Y/ĴMiHl2?Lʿߝ,QwR>ܿ;y^6A"iӇs/^I`yOPA=H}&"+JɞD38R66D=!1 ɪg=LL uY]G3\?d$MIoc$}~K,O=~웵j~d n (:Ⱊ6R;+aqg RH%Ǫo;\h }BL=a ":-Y;ԂrԝZK-T٭w(ج 8ę@qoe$߶xzy=ݶĢ(t^ K-b3*Ow}8KϞQ{:CeoyrS}yyJ<ݹEvo Cn߷r*_\3uNN0y:*? #Nc/? 
A|ohJ22\H?_ڧ#"DjˬQ{t+\ C*klxlDU)&mBiegޭ;oHw!8D`JTאyͻi ޭRMۈˠ`Sn[!S:2ޯ_4egG1pDFj ɛXM4XVs:X*dNb)B]ɗ.Kia˅z,n)zK#%V0 J&eXs&I^LM")ѢJ6iHO(_Bb74֓$)],KfU)-VTwULq9i4r@XK3:XIiYN-S:cVpVUVo)Q| j L 6ܩB4}jXpgISܚH7 Q>1Lߔ&T@Wad}TXjWvIݙȨa#yKV" ;WB+90љ簽W{?֤֌EvUG$t|G|jᲫ: U= hL BkM~dWuA tw/6t0֝-3ѻo0GMWѻuA tws!G!ni[!ZT"c(S.4췴]痲RZؖxQ9S ї>K~TGу& <iK3J_lMjQ(=,SD(+`PguP+Qz(Mw_^%֤#J i \R{Ҵ-G59RL{RLK(STORN(lo>|x-򦘙6s,w@>@s.LÅv /K{&l6H J[5yΠ) qϟ_Si2g?ˋ2zNw,\(vF Llt<1I8$$-MғoT{32R92R*cD1Lg=*;Z *F);OA_oFWgk ,AI;tqʻ=Xtĺ{W.Jt\=-R5nm,_ꢟ|(enǢXJw0_tE&o㕴Խ:.Hau˽?}W|uLϟ,{CWo]_v*Ee_.DѠΏssktp\,Fǥ3;gPZ:; ^@ u%).Е[7f{+6jc#FëXbw?mu?ڦ'۠x"}! ,IJJKпN~|ڊo[\,Oɤzi1 ӏs_ͷyyyy^7h`f/Z QF!1*,#x ƠHYbK 䏰,'b*YN)Zhw_Ow;h:eP'zB8gmHW~ Q1cJd{pguWyKD)2)E% 9FҘ]ŹH] ܄a0L*  $9LJYuoJĩ*bdiѥwEqni+ *a`iH:çLsmxX3Nu3%)R-|\!w!}-<ܻNzH |^WjA."2i)H8ҞI9 1/f:/L&hS}0EK<e 2eZ3ʍ`0 Sѣ)ruGqMe֤9p؛ P Rd/P 2 Pd:P7Rrn)+k~;n4p $>7v[ݺ.]ƐZ1_ w}Z}T֚4W&洠eHVfELj3m%"fl1s`0 f 3Kk$lMTesNӳWd.=`a`7+ =ySpߡ1]^sP9xt1Ve/m̻I?v+I}L"IQ./?ljdkq bV3r S}bZR#|Ehr:0J$Fful @!T{@2f/m WMu6=wQ5XeQkz?v>_3nTre7$X5|&}WyE:V\ـ+ǰ*tpGhl=25ݟq&`HeHRJD{_6ۗ89K=RwzK- nQR'28~RDTKFLFblst`!"+ѤbęIvsKt]m83Q]:^o䃲ޮDWI1EWOl7v0 $tRݦ\EHԹz nCDWcJa>H`[  `ډ'ifFm}z&'ؼm}S^'~#-}l~AÜlQbrt!̲W?=w1^G BOluf>7 5"`|aQ868#F :A;sԂhq"*E Jn86~ 'TDrsd!oC&ӠcT`|,`2 $U:?/ۮ>sEG)aQ+F$ID)RK7] IA[ / e`,8R *RhY+K6e),t.\YPTUCU+ &H+h2ȑSGVK`r<ׁ7%# ;\"Ɍݍݣ28F "O9zH CfBIPB)FOs)Ƨk),a? -|=jD Bt*|j[dr%E9dI$́l7vm:p \N) n~\*óH1,ξ! 1 Vl6i_Bb2%` +m N(9.*P2s=/$O_h-ǧuFĆҢ0H[T d- *FX5`# 2U~Sl޻α&h P\vN5g7 Ј0ДnO. 
WEQDb'vɁ}߈RáZAg(?7G8QK N%8 B٫q\(Q{&k\&94\$&t2S'K=[>yE.: Ta3rc@ZaY!o ˂ (AKD3(8*}H@f(nP @BI9!%5 \oԛ܍=墴Yڬ9ycnYg.J7mF5>ݒvб3>9 'Fwv͋t4{ OP@*~ 1_1bü6G|~/a@G95ӨA%(tqmG ݚװտ?opm(Δv)03egY$MΘλh9Oޙ|۶m۟mLz> g`pn- (@Xug1SaWGZ-sTZw_FVZM]r/ci?u(<5xߝ>I`mccD2Ki벮DyN$: D ܚCk)Bp^`j)P %J- J9!8qw"Lj*ҥѰ3j@^;vլ[7=jٵln=AA_nqV}/ޯ~}{/vLJ+B/gks;~:cvnMᱮwJwM7ŗ#3UhfRoM[?Q^]l 0c͍?ڻt5m_B^jAުΉfHŰgSFyRK f]^!'f0ٌ%4!4=9YиE9:[}6=~G`t_ +"nRc UF [n v5b72A5l|Tr L/g~Q?poz/_x"QBFP^(6Sُtj$.gn -[~WH8PLzv>3M*T_~bV.Vw%K!v0 xo*4J \ -=9V~uv~cXx ^@/fXwK rtg}P2RNwaHkLdǞg14"]㒟Ԟ+\YT@a{|0Fa1U?`Rc z8U1&r(V|pL֨%|e9U8C?%>*&Yi=4\0m@ 'E۸W`(pEȞ^XKHSZ,LAPM2E1Z= yNSJ*_!"HqNbL>N;ETʧQ0Kem3~emE.)&V"rfVq- UaAvHM#.avFqF;0󅔻7YorjX>xċZ/8JuWn)lې%Hu4pTMY/`¼nZ.Y"qj)ux؝˭˥&Ia(dĭWI~iE[ڙه˥3P%*B|>H ZwWu6 mfĆ!-t;k3;~QϷAm{kQk%OIov* @uS0KD 7RHW a,)SAUQ2dAsJ$BQ`F%-TX" .JrQAqlNY>+[|ԏ7Oa$}ەTa[G>$4$%ϧ9W"D 1Sb]aTxDj2惟dn:豟o_bzc\$(e K2]"~$B{zvk~'/ܱk[Y\|;X/ޗ*WX `,Fs A|h#p<&V?du8ܨ:p6Mީ,Q1ȓ I oa1>(r&u郬EPD*+$Wa~\87sJ{"sD{}D jqqa`jK, B:7F s"KF1I.2ܡsdS`@ɱP(|bc!s%1 lHs JnRZP73JS2#r+)ρ JLny4$ Rzێ˒ߟˢL 8{lbW#`bI +)e(VyQBAHa%V[\Vpk!Vu̙ҭac^ /P"ʱ[D_9&G.x^9v1=DŽ*3@B>@s& zػ2?OCF<EtO1RkV6y u6]H'hrns$)a/E -G4r=Qq)(BM)'?G?nZORm:߈n]ΧmC4%WN|DjIFLvr!Ln6hb~74v^s,!oɢ-{ɦ5&b^2+v:^֍>`#-A2'X§v.)-NWaKZ,uS LqIBuD i1&SMM{gtV IL9=Rh:>Ut-#ۣ6LRG|Eg򍿯*I!8P jY)&lDI%Gkd0 &C.+[Z5Pil0P6Xɬ(J0 s F_:X* =ss.WbG b%B JQrR7gj{䶑_-VdUHȗp/b$$Jr<}3nGcCjuDzT*_ꩧD!1 Ɩw!6\jT脽ԟ5R4J=ȯ&Lh5;JZt~Lw+B- {͞'RZ?{Gݽ/5LdmZLA9w֍@SU|qln3wSJT+!$:ޯ#H)mcرk3Ow3TCʌ'+1oVQS+cbRdQfjE'KJ]PrX ~ +Ǐ$%JCGm=WjdbԡֺCZ ̑Vw 2A tJ[ڇ-y'׭5@b0g}am۶B.kƴLҔhL ɪ.mmJfp첑vݛܨ^Y:t-v-}1B_>ᓔlg:ƣ>G}/5XeaHC/70΂yMoˌds1Ālt߱-MP-D<'-qzʉw<:jӻ7hиASE=͝މ1*͢Dz!q/9YTia ^qf?)ڣG388 prrʆыׇ$J,bh19RB-kDSa% dSTmEEVP[FSża* JGi ӈi*94H'FLwte I96}{L#?Vv3f$eCk!-aY׆+r(KjMfkFo8S 􇛿߇-B+Ry&Qboljbxe"Pfń vEص›G6n}Ր7u'g/0ݾۯ-{oy( J?6*[7'=k<‹xY%N|vhob=OY ~6IZ9Ly:J/,DRu锡e&WWm͒$[cnuij ۠&ak)Or8lvŁQM,_ Fhت|BkK 9mj=W9O5NEc_@BG.I6a9Czdȕl/f4S[ ok&Qk Ԕb 2 ,mgTeZ];?\CuV6x)M4V6-cR^L,j: u:)UP ,B+Y7`dRNV%BQUcdbʯCJ4.k5aKJTX*DBjRS :B#Qcd*p-%A-wA-Iu݈"uH+$Nw>$Ԗws$xkF*gqJskVɖ-mK]k?Meӊɺnr5ڑW 
+lh{K;%C&D&hw{=N SRʒFmt1{a`Eh{gC' miw祡Ø'_ 'ߐ噂UBwۀYi«L8QǠH^>ƢɅڳyFyDÅl FVFfuieHy GlW~,"U=~$G 8QR r9.:GͤԼ`UgN9?fјꫣذeĭT2>C_fH }<ќ'~1fӽ /7ݏt6Deҽn^W~yh>>ұΐcϿ6O=Ss3yY>evtˇEF9< wRgb؍nM݆_`䍸\ˤAȟXTaf/ O~C__n^vXJnP˺fֵ`i+WZ)ե@Jj$BpMm[_: Y'*d#Con6+gmMpm(O?6C]wD樕^utQgtT1c:tf|ʬ$Z,Qr]-N++qGԲ_d)l,1lh*Xv6>@~[ rle(9+Xj*rP"*%])sJդT I]5R`]J~bUXlkQQYṕd^s1{#s1YǢp^r.E2Rr JEpKUUA'%H/Z VAiD+Dc ~AteE+hZ#c~"N p/ k+dΰx b'1+\ҤByg4kFQrӊ"scGnzurA:MF&p~Nψ_F_aJ)d0gj;,PjkFy;rv!~,Au *A WCPPbTБСo~LG~?Y/dd @u%&6w8NtQ,Eс=Ϗ%6//hHKY 8'ةTKФYbOY;Cs\ *WYb T ĦYːx|,]>%vV/,,I5r12ؖQY } U?=Y .>U9OG0u>"TcX-M,g$P@ddTٖϒZҹ$ %1^N&'R4ggڧK=O0Cɨ'Eӥf+>HZ#q^3H|TӂXF< w,2atӢ_ q薫FJt;f&b-/E|qF7+bFJt;)۔[ϯĈnsx}Jq(ט 6AlAjUYI};ګF^A R؛CRFK/K)<" >:4J:<TI,=$Pjd}KR~,+lc} XO"QK{;V !fvbx2IFOeLF%g{̢,Xݓ'5C>ybed M{߸&曀|b}X 8"Wi܃B}PPՖЏeQ}[GCrG0Eoƿ^O6+W)O^Bw[1DF鈿Oqt1Z:- ,jˢmFle1O\/iWA%)Flds-z$ېAt}G ZlV[Dw'j'濛񲄣DXI<~T]yO-??"@L)\iV| z.M:1^rIj"BʿUuI mUn UVmk*j:"+@/z>;*PYwY6=ڴHmEo_],jM 6VwctVem$.L1.3"-EźjϨz@/ۥ@jx?*Q'Qk15C4s:?@Y`%k\rjSR-Tt aɶQp$\U 9smU^ <Y/8>/spA.ؗ] cXr.?dKZ7lEh`V]*ȯ{D%ge>9z ˦(%klNIS QsniTJ*]XP N뫖 MUB@ !JSf詛am dnCNIp'i\UXgɩs\*fʼ HxaS"T6D8bfe(RʒJ*u(H83U͉uv؈l)jd}h:F-9( T7jE.J >jxXXɠ`fĸm֔!E[",')lIf>Y=?~%?ra?j!.m-hOO煏ޖ[5_ۻPW̗sD &^>oV_`M?WY=O{}xuDohg_g__84/%]T+ߞ_}OsPq13KCx9YB_k{<1pc!>-ÃOlx=I% W3/nޅdA5Zn;萣*<#E"bR}uTe+k7wT2f>كl=Α`y7 |j*0| ӫ38*@?T8avgHA!_okL>mNkn+B#t䦣Xsy1<`>#oM3.%[ϺRXGGrF#!2w&`}˚8a;ղ&NÇj_ZU'NZF+,O.dtR(Jɕٜc*2@u^im? 
cEJbj}A9S*"T(%!=y_{9mT FB±[Phmhq+zj3;٧ Mx;b:ُh}n&}ɾ hN)رn+y -!6w4̺['W$B8a׺0oLȃ I|Gu/ Ԅκ%Z4OuJqD8{@D3)Z W*JTw D}Β^2޾ QR[oQʺoQ}LS'p3v n=PoRO?N %Y-dW<3hxVRKhC^צ=z D2T=gٴ@hms~>I S$4|پ[߮DمdPhisZPHi4iVa k&%*Tɝ'D*K Myݴ#4ܥwyq`, ̕dnrZV#)A;KkÃX0TMf2;2m4?FlaT gqݰ4~mrE318mfΠCdt%Q!ĮAK̅ZsJ>s%<ؤH౾3 ]_.l:Iq)NtL*+5vBEw- cv N?.% 9dV.EQRɥ 9+iI$%|_K4Ǘ-lɲ'|@W/7vm- MRfx7hЛ !6+4!; GJi{V :GS$pV90)74(椪HÔbb9fTJشdvnd"w:F;"|^>FP7nFCZY{S 31 xz3K4táh$ʉwtĠ|B v=8&r+:yv)SJvSMjhhEü;g4'C, hh%% (hoda2m*#{dM@CF؋7)nęrJ ֠O3VŠ(c*4EU9!T.qڞlycRmOyuSl*^oQJ NEƉ{ox;hR|;4@OQ䢻Cn6R09b-BK!=s0JR,2ΌYYJ3`Uv0v?kK)UR7 v(UYDt  ',IJi [.Cj:۰pH{`EJeO8[75m?N\@8[\uQ+w@iHiS<@[N@L@Ihp;ALhN2O49䥱EUs[)2(q h,hY%!&bY6ﮂfTV@H)9 @B"-AD+ on)i52>ACyA5im;[=YO]Npr5w/_-me϶Ȕdʩ@$ Dm_x` ]+3.$[!8P8(tuynr1WL̉}l"=dm(_57o~`Xt{KSoî%<- wD}ggη[Լx~r㛤o_N3 NDB.d>L,5, T6S\pI;QpŐ/%:Z5HW S ;` s;s.q ?mAmGۉ{mK'ۥl/*eZ}=+hrn%Htvfڸ^r#kAR|F8>sF=Ճi='zboX\8XY,ϦVGSk7b))qK%jqϋIyk&qPn1h3S{٧*?_%1YFº}w7N ͖,/ R&@MI?e9tOǂfk:b)Vn^v}U@u ^@NedЎdpB{!zoCKXBۺT0Txkzjeמ4o,KEvM/W+'LX,jy/W[(p?E9_n3sۺo&/ _c]^~W͏ƞzȢ_Z,&9vxv_M_rfu!rM)F]&);ʃI}Gu2z0떎~ n]hȟ\EStJ59ܱn DuRcƌ"}o떞NhݺА?)To5Zj.`%Dr'uwofn陏R˿3_qG Cf2<{kכ!zf'%0\H^6$w'mNIBiTmP(G$3ʱW v$=:b HO9Ѥo Ra }u^)ע7_mI(Hн91`7 XI1W:ȶ(oWR 7G2^:if R*kiv+^6!h!ykz`v6(l1LaORM\wd% =J6\d潱|@B%-vZ2a[LAÇ/}>A\`]H괷euO訬ND%hݓDۻ! 4\z͗&yw+3ͅPPndo1ByMAHz .dS$k+쌣]$L(LY#'K_և(\TlNk#q%>݀FQI'x=Bouo!z1ͬOltdwYhZ%-G07 k#Xm BaA e~EfD A6PI "5V%apQcːa/mz )[ǃzD]YsG+l8fP6lx ˞yN !0R+OoV$ jt7`ɢD4ʣ2+koB1ij5GdbjeR=TїmKoPPäHV7˛۩}J["&BV٫hJ%t*cBRM6r f% 鹅uVLvJҺѮ+.}MXLMTMԀp$M^]j \հ>YNFXD%#h-G -"Gɬ0N.4+09USAiDQ<ڏGTNyoR!Q)э $8ղ X{pgdN HD''#cȤ{3ᨳF xbH!8!fJB+U !ڧ p,C0!7Nxš KK㠄^IgB(n LP(ˢ0* iG0yrՇ˞bvl"sgs2Zɞaz&栋N&2a uϥũ9ij1dj$SТ\k)6 iL$8u0f˨ +1 VC e9oX[F C%4䓀 TF ^G`ô(C뢘W<-O nL / ^yr2(Y\1|n ~av! 
%&r~V>}Ȍ3w?X~n=ܦ0MYيX;E3]=]]~4ȶE4D D /e6%h/xoZ?x68sq,gzVj^H1DG9 1jha A;8E"O@D V uq#Ƴ1%TM'ƁcZi$Vn<+lRj\V.x_qF"rr,&yu1~B/n LIj&TK@e|^8*(7w=w&LDD)P^AC-dJ }X8W l /T&/PR!x;qkc6ejA 1i1ٴ-md!sj(D_FHyMEGc"Dc/ń-a`ƈu&mZ>6%I0\j,8RFe $ &KT4)] !=墓t>C鹡V ݼ|ufܧK;1_ ZHR򛂳4]V E5sȓT UA5-6<=`ՓCvKkTYkqa*b|3֬\CTVw&r/./t~cBж+goKl@hW%Xb‰kDR# u扱;0PRz~C 2TuD.-O&z+oRj ]0 4.`!t0S 6WJ_rZ:mk1<" /gk{fTS7O.Q}!S'lud RKײbETXݬa뱂eEֲʊ^ gKgUYd^3D QG~Άc;.6 :ܿ"pu!tk' ?+ qesE2v5ގ@m=[xczd&9]?ln(޺}=AQOzBc):]>R"Nt+'Lt*U|FB("MtJ1OT0qHa]}A8H  2je3_5#aEYަ/ <ʕN#Y0m?dwݽNvjwop]LJ5aB 13LJO 轆ٱh" Eg77]Ykf0#0{st f嘃ǰZ<_%Պ!O7MmgTHFNzaeW|wZS| *O9לŋ}bGЈQdf((>DJJ5sgsEpz"ېZK&O%QyE%;v+C\r}IhxGͮ|2!v#2=; \2fbzyoM$J!7Viߟtm0,7Oh;2t"x>ktEB +LHjeaD$X2XlcYjim)P$zřBX9ᝬ,^lw JIsQ.)VX#A29LcS煷"xjIԅ@2J{] 717N W "'䭬Bյ9!^a r*"#h'O7 AF kd'0A jԸ,@Y\{MAl$=>Aj a(p)(굢#<7Aa4q$r/iz0FnAE-lAM @s\9bGަALc'b*#SUo;g} MIEi7[R"9:R',LdrU!O9N9|VH9f8YǴ&Qcp=܊` p]tFM[5ӊU >1֎f7=¤@ 0+u5 tսi4U ŻhT]#"5]L_(4T $H.ࣾF$c U Pפ>&ba<+&J湕݊_d&KAHWw>|WcLc$TO&&Rr,p8toM gZv:9"`=HSUz6!4_M!LѾPQ1" E)+ X~lVޜ׮&Pb$d+s CQ!.Ʈ)O" a:*cd~MEp42],ZbN})hRs #(t֟'^dqb=_プ9A\/Ov[lva>~6\<_'>^)s~Ԇ16Ceeu>1/9-@bňSy\X>qC#[oW{L*^?LB<ee4)wЬ `pa&7g2C97ޮұ]._YşuadBWHmn<׶91>Ȧb3N;+>.93,:AuU~JvATGПCͨNc +5\3w徜ye+%W*NO;se.gF. D}oSPΗhJ( w`[juv΋;;`Wyfav8}tۗ#hJe^65e[QXwҡ7>aLis1`|Z9%3xj|FY[%~c7]yB ^m￾{cG D4̆F_D姇aIhSJx˒T$+h?,CG3¿?K%GZHg31*؆H\4y!PD>Zra|g͸BN#򜰫 4Qʗ׸s7^wcWcEJ%1=33ǸC%?Ѹ:v4k1/=MD^߄4J]w?@t:p{ngǦ[T$P|7Rşr˙y PmVľ\,dNv'uHMW>N.Dmzok-Ub-MK9cq|o~k6 u˱_9R~.jdjZUPH1 #u*A/k{` &P=1һ+RZt~y3X({=NZ5$y+.JSH0݊p<5U(mw6-Wԃrz2nxY %Al"Rb'^]MҴ=p8.F"b$F0f^4ʅ+əΥ>ÕKM~1m4/7.C ZfTѧw26*^Qn2*\^6v>wԮ0j4trFf_.Ӛe頋z71,[k9WX( H%l'FrN@U) ʑ)p0J>HURrҎ]DWeYzɂl]k ӤOнK)Af)`yYqe< p R4AOi]g. q\@3}Z\zї/#0$K8hIdo><0-_-(O`Q`ԛ LGx, 4d:[0Cio.S\)Wfl,4N-O-1P|՟/CpoBe*!O8y)%*+nFd3e+U@MRQ\9$6M[&Ao055)}(0S`Qbbm>mZ;d½^מBŊű`'mڲqL~˶z,TPvvel/ZWtǖ] ىʖ==޵5q#V(_\'ImUq*ݗlpY(E6n?!))Rjhq[_7@C"?xD {Jsr. 
!i5N%;cJ:;B.A!y%5=^\Ʋ-!'"XM3;Z Eo9bs4': {BDV9%>dJ̙ab%ީ,ڸN%b>pi2B0&Ք#5NYO۴f'<5 LTdތYN;NeRZ93ڇ`c.YkQEO(Ng1Qt/BO%qeRyf?(~\H] 㑺wxlYf3жCt&&dͮ/9oӾ)t|gF3$ !1OKG<ߠAqg250=t^i64LlC64LlC&~t[yB*H `gvJ#1X)c DjmAxԛm}گ/m\!6Bݚ3L6Oa7+#^jrI:.=vȁ>Mo?*]$˃B/)C#U,TYI% QrhjƎJaEjyB|~qJR&%_7q~욥-mR$>t4վYU.V^}<* q,::vN=ŝy6;:'@qM}3Osxk;s/)hEPigYNVX*&<|aƒtw/Z9cl Mc+C8FS Ht 'RXȜf4&W>l)BZ*T9IK0Zר;0~^ljD'(ƻ[C/j~ WECP%%;]B )Di R)4 :XAHjdkrcY 3mnol$ &*1QTJLU"Gͥ EjN=BVG#W'4~Gq~w Nڥxka{ ̫4 %V)g/`ǗW:Os_1,ȞлъXf= εN%6FX-< \ ,fҊ¬&(̜&$r/iб}((uַsyBwa2(6 wtkNM:6;/|[_wf?=#jxv?a+RKE1ҏ'->h<0_N2 <`A=vAk/o/[OLJ|7wd~vO1<$[R jN'iA1I}M/8B B@FfCܹXqB΁~ Sv|e[ItXS  fVgDb- I)!*$0Q ]Ğr.l8+ԑH}{Lw?)}R/U r1J8H:{QWy_m(^om_t5ohhy u> mt^),zO j`P*8!::d59-쨹m+0M:dYpUs& F)3w=TVcG<~YC*HqE#F6K19G}9Y*H#!iYCGJpagO;cWq!$RJ<hB O¢sSIJ:|F,[8ҢtOl!zGi}R&IlD[R>1/^6A0"<,kP@&?YXdt `i42:w!Dt2.2l2kp,Ɗ 1jlζmUtlDu@'y^kKCh$q`MqvnQ"4E-Ku_DZ"L%KD>,elЃbHέٵfebeIX$ZAQV޼ 5mZZ ,x%?K}W:` R}84=ǟ rg:G_תzygr@יK$!}]3^JB=UOaλ*h<9s?%[!M{AdnUfCGZ/^8YP_s؄J -#{Afs Z5Wٯ;5%t=$U k |-:1DԄ7H_1p8Lô8Lb10CA3EVyC8 BN_1D0ΓQ8FK*b ,m&̒Ddgk&J֔ŜuQr2 >7ceur8 \4"+TR=zBִrE⊐cbx  6J#3X:'>j6c&&èH7Di}n`FDbO8Fxo5M!rI.M@'~HZ".اPXw_|; Noo_i V #V$]Axbj5fS?̗)ҬX=!;Eߺ#m?>}PH݌ЂtL0Fzo#+bTu2?a+R '->X~>_L7}&@7w"Bj_y{1t>l|?^]P"εLGO߹x 'e[)N(""/]᝜/7Hk+L%n-y}EQ& 0D[?Hx"a)"6#f2QpEZEng^8ELR*H0;湉6:hh]Ȯw-m޵xZ2#'&<J瞭ƌ\b+ ;B"DdBRG/= :)Q(ʤRVX/ӷ>\#D|5ipnԌ,!o 쓨taK/6s.]1'[QxLs(XDa* QC4 )$JoI&D-_ŷo{A ~یf䘚er)-w\~4c}_7}ۘ[Ŷ_DgR/!3{C5;| e_wxB h|xY?7Sꈙ#{79}}3~' 4kŞ|V:VNm'|R6\Vm~j,s|D%C4S1ov1g&f=nǃ1")wo%p.]y ;cdR9jM`Z ^&2 V*.>I^AɂGꂇR((( l¿)9#aR1KM<mVA,#h">SɯvSbZC }yȏf3ӗ2Y:Ri^937YϢǼ_ lv$>m[۳-م)v;9%2 ˒8d+Xd},b\8K/ PYfɄ $m6.jd L)g̦ |S`u&\<6 อRUCol4ϴ8Ӿ}ݻ!)W77)9msAWs}NN)QBxN9]P)$[\H썴T\:U3jj-i(L܆y@iԡ2 jױfe#N #-HG%H"8BOdS ^H<#:8nQP.r,^C{82o3`"_ ͧq, NQpS-*$ aՒUUwQfg%dN;4xJb=qm>E*>1ZDdҮߔ)t8G[zH;>[8 8Z1qkfzP?_/ Vv(]ussƨ̨<>ޑs:ͅNj>^pX@s y jX׺2dX^N _JĤzuN.ZYXPa&a3&l˶uNvmk+ *k$`'MGx}թzMvV]y )h]\0 Aה@!SFdӷha@ퟝUt V좼sTt"Q`^c[n|lݞsR=5JՎ66=)ut6=i77=Y4鄏$^:KP0Xjt{3-3Ax'|-ܠy zR z+F_ R0ĝa^ٲݾJby(iivjAX~h4yh=Xgb dC S*s?R ɔʃ@5V ;tX 
f:UI두5YI9HfӚF}"܊`Œ d;@M|~ Q꾧MԌg|5R`):UVy<(b~@J x(R3s kOarB`YxM A[Ap䃭Y`)=щy~ESwv݆Բ3m/?6. iI$z@}u{|Pѝo׋?7#ww (\QON?1ǫ7a51P؛X|q6k ˛koOs-j7XMjGOcXtTF{!!Z4:u۰n19XR RNi9^KѬ[qGS[#>DƔLSw- "zŤIt231Eg?Y!>-i[abvvQᲽW(v??lǫp8](E>jе+SC'qj^M } zv5[AO*~ RЋt]jA_3׷uA9ۢ'e4>5uw紊{rua){)_*_ ѥq'T_&D ZOMFrY$O.x[5k2'hbAv]T|8o4CT.)'Edng7y:,bʊ+*@Y 1\mVE?aG͂,51S:bqNLՎ6~ \cBGzE;ڡH wN.ӅD ъwRz'.?L+N b9 ?ek+BK(j)XbwDq@6- ;m)S6sNgBz̊zP[$KZ [Xwc4- Դu2Y-\;Gy@cGz}~;n\M[| !AO~TȉC8$ChKb!CS$_9˻ &S5Z;8cӫ!'sOF {W9'2l%mCSQ9w/r@Q $%\8qXDjAVL[U)KYd^kDm$ģ49 ƣ gETς;@MS.HTIեL(ÑCECB1ATx=>>4QS VWzJ"`CEIGkކ%T:a XP_#FՌ1H3>!~%-(gԭΨS`;tF] $IEQڡIEXpGeSMV!DTaY; MO}':f2:AFE9N/ktȷ? ̣%i~`/BؘT N2-=2%CcD)(Z6zw2DmSDJcReiEJ qag1lh8De9SR͕Bz5r:P S[j_:8k,f6d r*UF|\ٖ 8?Dѕla{۴۲,'Kj(ר[*`?xG}PUļ~~]h&>cMhzJ'Bgo^u;_,wcٰv!7g.|z%P?]]A(l ,.A\'lW/f>}Z[*:Dȡ\qF΁1#YT3cQrvИJ#*X)? x;ŧq9dLcJŋHw扙8b8a=ƻo ؞(ӕ_h3OȈa%2tg!giS^:$:qrYpFNjaC5$Rb6bܧo(n(+8Hs!jyv܃U7&PVN0#TAO& sW'`?]^fsRT(*5ECq_>Y'p1rءD(sW29/g\_L~ $j-ik’hkxŔd^YC6>Mk}̯FH¸^8JU1LF3x/E@)u0Ǝ!/)'=Rf5L d ~-N<+6bZ D|$)f^YkQ&3xK<l]X 2[>\ŷGdYZI%p*CB֊!TbB@`jj!CFnf9KjD.{qF*X}*m6`\0*5`;**+nJ5bL0BW' Xnxt ;Ģlٓ9ˬlUp3,LY!InN{&7_St@Lj/NO)Hȑy3 bay"qC$0mw1]/OtlNj^u*2ŤaG^1%| deDMvc`in͂0c 3ӵc` DӺ Wp3$^H ь"i1E0/es U8a1 7wt_\ՃX+ld6w ~-N 8I YXXLR2k񰵀0( gf԰(i [L}wGwLc'wdͳOuNӊݛσ kn+ 9ѸI;8cȹTlǃ<-L-$NMЊK,tb7x\qo:%Jd|4^0?¡i.oU~v:eTypS*8iKt]@g<Ϛ7cg ~zOT0gsh8ͽs +$V* cqBW,LO+55bWFoqހ3U5Jot687U1`/P+cg 4Ùk_ͮC{l1>-cg..^-b:s[* 5 o2䇲d|^Gx-~4UJ7}d.)& 0CCq Si:z SjL~]乭bQ[ftUvQTs,Lg:]gO3cd9iX`$ %1oUU bߠr8AQ/9Q<<(,ey/P,n,!A`TO޳t{oKԚ< w*>%Etq}V"Se=ittТѭ]t |T(RR :Iz@3m&n'iޞV&$ƇOy% &yh=GR+iBN} sXc,.1P O?l`J!#k$zUn5\4\(_6d7zAc&~ 3odj;-,޼9.M4&f6hV12Rb4Oj i0tEZڽO>LV2UoԤٱ*b65>RԸHCnK`+[wm-IT_):q8!wнIӡf)?Lz}cŬT V$+HES *a^)j5v :WCB- uϸ&JrsكN9T,sOԈ}3Q83xl` cBIx_).rBL-9Dz 9+ @'*^+2p~PD^Q{$0@B)rl_:D &{ xӈbc9o5{u'|2;ub=={n#WM:r6XҭA;w%di@⦛/?MnB?n>pCM7UARn"gH?>lqN,S >a)O₩Dn}e6Q<<ٹrAGdϗ2+ v'DֿlѪ1(lʷ8T$"$wU@&~_;1ȎQ9R'Ion:x5 e/0i~K<襆D]"Qa_D>iq{7/^J7*,\(H͠,D q܏"?jr)y?iC]aDxa;41Ck~xEC.ZHJ.9vp.dTF{R] 2VnzxiۺhUzX5UB*= 2U׍[_:}/,8F3 5R(nKp[,Gg 
"q{K0S.:uq,Chۛoխߌs0JcM3Zu {/n̽VruID"@S j0Sͼ2֢fgzlS}O>S #zAkZ,~x şv8<>iCϽKhڽOa̪៰_20HDofMt6WEX]LaCv1=ĻV78(INn&*r{IJ,~ y) ߂?o{2PVg Cn{Z Ka_Vv>T~=s@b%l ȅEXPi=o{jJ@PM"M͖,^ Tl[jL*p FB\Wlj*냧M;BAe(εQ\DF&G_zH֓Z \=|Wjm JDye κYAp$t.le'Ǒ|;緿ɣQ 5#?qoxDI/*_V:q i0Stۯ:)t r3yPgGp(@g63n4=%jÑmf6E׽\379hu`ԧp@LN=\\"s<*3vx12Y-o*c#v6 |z>ciyhmY>旓{N-{0!zs4{zg'vXth*U.=oyʥKMh%/ ^DtA!שR|cz)hŹV[ >Ym/jܭ&gFeW^ S *[&MwuW1A:͓ŬN0wK6sΓ1G]ԨPw}klYW@Ђ5cj{qVpځ:} WvLH@H8!ʠDbCĿƐJ}NEk&̓ԋ^jNj]"w7yT4CfrWJ̭9W#=66`y+a&O!NfN:R#1TgxxH+>ڕDAPM9\JHHKߔP {iAOW=2XIIg] qqP![ T;C}ySewȲ\(/l6h4ȼ,'Ѝ.aCwli6qhnw}B /ztCc6xNnЭEzH} weS徻 wc,pjf7Gf]y o(A119uQ ޕ3"GM%17HF + GC[|M4Z4K2dyneQMa:Xm͠|* IsV81R)tJL3mbn!)4(iMhQf߻~C'e݂A)h*&!)Իi=#BncC90nR.~(i6/2d-X8jdU XLz194cua/,)nO66BgWiւ3(F{f-'KBI}F}_-DӋw[$(|hg5SIc[:ųrp9.xgCBwwt< ս` Ȇp7tF@ksL; iplF dWG>~\sq Qv6sq3 &bIG/Ƈ_>|~! t8N8ʤM%j Z;Ĉ4)KUX)oL)"LOBkf ܂?o{A9S?*<ҫJI3/+UI?ٙn> `=xGH.y_S;D=~׵>K~5%rӲW&cߺrC{!3{VJwAH{1b>Y_۹wv0S 0Qo'+2'ow,$ccDcﲠYZoCBvrK% :iJLdDB0 e/l1/%=36">c,Z)R+vG6hD;*eRhRA*Ȁj,v!EE3 8DÕ:AE=.A AWı2VHp |Tճ")F@IV.*2ԥ:u>%J4׀pZgJZbOSLBG斄@<݋D(zdX:,qѹlDBp#NnS8氠0JA(_~' N`BALR_1wɁg'yN Aa*CŒ_W!̤]9-+X+.Dik!&/iK"{˔sz9_+s@˓_tut}A>eֽpBт5i%@VS$H(BИ*Q%WHcBa]i$45Rb^#]:{Y.&fؓ0PlfK}6{f'꿔w7BR?=ݿ^O2i\~y5ٯ5`a&w}K:J$7WNg!/F} KB |oJ1ljLGa?I7^a+4l<m$zy?e t ロBF8ޅ18φ  g+n/)uۦt~c2;UO/ghgDe[rU\a؊jq{ +"Y#jbdiʈREƱaW߫|o!}:s8SRH)Z?zvBQe4+]'/q~P;zĀ@sCGD> '$oe`߆RA*}>,|ӐgYɛvr gEośV~?~Gwןlmvwׁ!‡B |aj̻iaV{/}COtffo`?=.34ׇ쌭͟q΍"DPXM2kkrE/Sn-j$:qe;/vm쵎IcN[i]l]] 4F  BsQݽp=sO D`M]~PiX%PJ+qp4H bB9E#Mc_JE[އiʈ nT "ֵ/AHX #wݵ^~&|Q x{t9ȞIkh*n@L*]oĵhfҹͲeGRf9&EH"$z:^joXgȞ?] 
?ZQhQs`AR-Rvu-!sMt݅t>r0 DZ"P$2,)nԜR ,rfS%ֽwU,+eez5YRBbԬYT)%BM)on;}0*,3-eRJr׹ߠ?Q_aK+Эx#t"/DRۆA5~0l/F:%` B@PZ˱qZ xȴ0ٔɦY5&_l J:|M1wi0Aඤ ZnÕM]py\ :]8장XfKZwF띚5 &Z6z^`ЅuE!u 1,,'KJ9YŨhcb/ѥD½b\T|yu9)o@C#0ZŁD=ݢ3cj^fZk՜tq/!gfX]|oQ)[$;1BE8jj8ԆfHaNM0 Ǝ6lpHtPJA6Xi=#)"QMt'441&$ s0 %z|DօCb͌\΄@T6~9g(TJ(SڟMG,\F-eI>SҼBJ9`+?rCc,| X@Q;(Xt}ILwsߦM\mR9r(D_\[靶ŚHtQ6+aC.D[fe9v8..WN\u\AZ9\:LDa,02>ǩH*:yI)_,KQjΞe&V1+*G ĶH `\ِ ~cF0`g]bʲ:MYr!VG4< (&bG”!DaXq y+ɒEe3"ȬL/U?X8,}kF$5 >'0F^j"LHJ2L6+2G^Hjwʠ0=J Z5(H=T@a)ÌSȰMkJ$l|< rk% eRvR)yQaByFP 5$/ #58sr43y2ziI96;bd쳃K20IE`&l(\PsiD%ч|H_ jZ=اD?&0.LnE<8\YX&woNmR( ~BN^M4k8qOHDxkAHև{ $g^4釧[ׇ@aғ[tӤd36 I{o"HZ>HhӱOڣm=O^ҍN;Q0~ÏCZRGUy݂ˊ@2mZGw6ce[ M w~_an+\6n\{ <& 2?AGZ}ek9-;7]KVFOV6bƦ'5!Qh98D%:t7}ӊakL1Ύ? ~ƩאF8a!A>2<_V+?^S>iȞކ?0W2qNT#'6z'o f'[${>:M+Kk;fEnӑ5{mVfNu~c++g^,>m;qѥe{ch=YpwɥM/ È -c#픊rZŎX֘82H8L%l] Ow^b]COʇ>ȋ#1Aa{~M$fd S-]߄=}o} q|x0\t֑KeYKq/r]ax?|t{n*CG?A3tctzg٭:_I5俬ӳǧz/=opq~OueA3tw˄wooo`dӺ#u|m5Tg~MrT.uó?p;qt=Ӟ/Osm7.30dLtvbK~vw;;A{kzVqߍs`h>e1aד/C|'i#bg+uu-Ue[_^~%׀--':r7sgAko;_@KaTBTG_a3]ЄWwGQf^םߎ޼%&\KM?r:Ԥ9oQpˠ>Y_-SmGoz.NR6Z9Y[ =]ۏ"-,Cϛ쫣()~u𘌑) RfB2+? th^9?^/4Szfɘ#˯<5ۭxIN,wION84 ZNoeAt Nk !G|Tqy$Ew4:DN:3˝2! "AUƻLXHSBxGQ1RC4] Ϻ\f8ϱ,t:y?v~ɫo~>8;?}{'yW?_GN5PfL9C=>? "&Mf N5: qF %p ݻQ7ͼq$ : N$JٞpS"TA@db$S[z8[6kCS],B s=2۾7ϟ">>3.W)>7{WƑ /quî%;&3VX˨Shd ,fuV~YY_VeV-~MW ι`6sW0󝞛~ 8!c~GuP=Z%P-]/uX@Oj%X A6b5g SUey IQPCY1$wr-$> O%ƞ D$Y0NCfk}/վZo/ ힵ_?8š}Fe#d|?Ve<+ (RSTt>DP_*T~:4>VYgr"}Sb4TZ'e`sުF9Ƙ@ÛYl\ WbEC< ӄP* " dP &k laVɮY:DŽYU,C@eT ȿtBR V5*!HpY a&QvJgxSmR gGdM{,)cL1ee:q/ٜѩسX"XowMVDHšOV~pUs8T T r=) Cc VPߢRU'!{cڭj( 'ûXam鯌l# bU! 
Y*I߂)ں>lu~_&Zk%%n%pHD<΀H0Ž ý:~ ye2i9X)H TY턇iaWkE놾献տpb&L3+W NE3(v)N齃>5dRuu83;aU;xo6zf:ݵ?P@R^l=Ou[5BcH-JWIeS*tV}#wt ^{(u+=MsW3O:_>i*HGo12A=1k//%-Li27 1ZKs&ߘ٠P%#st>H5j2'hUf| F 3E4{R܌yqXXu; g};O9佭*r?VYT$m}#+L~7KfqjK)Z6|+ӘI"9zdbay);~u.q?&LKqa?֊)ܒ+gN]Mn&Դc믔"D)"^J_gKS`6$¤t' 㤁܀[q3t[x?⎩+.l+nxҁu߼E]OxxԿYX  mxjDPR{[}8#ԭŎw.=*o_i౏TIHpe3eyx*zS¤+*1i&`49Хg?{V^iWgz8\48fX|YӕэeCR/jGg6Y ;%LVa-%:mSh9؅Y;ϱ$SW>0f#yuZؕ[`xtd< PV>Hcf$ASOF Iv @v)b#1gif06]́d_[\\!-0"o{16$v!:Nj2UT%^~> p9fH3>p I*֮~U@0i]Ɯ)Ye:/%EÁvt5k!Rn3@DbuȤ~)(OFҽ ѽ޲*F8~놷-^9'8mGmNKUI>=_ ÆO,=k)8*AcY{@8/D@t9ꯣ'Kizs%}pQ3m!%XiߡԳϪyU:ĺ%͟Ca Ù<)]٪9mO)_cۮI8 O]u}™ƃt\K|[Yap{L pIMo~w^n'aHO3LvH)x3a Ό$#iʸ%s5dP(笤j%TL5?jV5y0 9&-8u{Y aӇQ0H)3,ɘMkT+)aO"9-|}~<هA6LNB^H$Ga$<Տ+X:z:h}SEA\W ]SDC(T3m ]ar]aFig yxiz1vfu6|mCU9g^r-^JUɗFA/Kڗ|Q;z}ٙퟷ: }ig:J:Ve**yXr0R!9l"_j(:-T/FKgvq|Y/ܫW۟׷??|^7v0>FYvyw/..t'M<0f/i Kj3ݾ`87M/%!>.%;(xJ! uNQO`J !8U`]6@xnJ5N؅_j|GKQy u Hb0r%w_]]y"?}տw>x:%WXbA>yef>!`*-c1Rc< 8|1g#bQE9* %#CKb@WDb7Fz0AEDaRQ.GZmb\j:kI0V(-!0 BBT$Ed%鏐&KIKw}> bxmnT;rng 5`aaq鞟SӚsxƹ \jW&_-?M|~hOY O{q0Vf>}gBW t`43z~F|J6н+IٻJ\ U{}ș{CFWzh .40M{5R|pI4Z|(%8Aדu!a,/o`XL0ưN%{z9yb:2s}ca|$&zHm.p8ZpBzO" Ĕ:јb0I0ŜI៩7os_?z'`䓇u˩EWf3'̐ &Lҏ7~mI,63a`&2 ,@8`+s~=cܻ!~p.2F},hGɣwi|yu!k駒hzs-kqw%|oh*_ VMamZW`Kt^?;Y~ɊTqZBd>OL^?:0)>jx- .].z6%EM ~B+{Ek ?pO6@q:G փqK \]9@(Gډ`ţǰBP,4.Rq AjGKA$){"[𡨽>~,3Fc}|IțiAHc5R99n|ХvTkL3(7-j]L.`}f+W}檭ZՐZ~A80Xر]g/mw=׳Mc6wPiG7Fw]8Lo:IC3zdyѢ#M [ٻ 8xf%(NXn6%kڔHAyosRkfaU De8 Qo n ~gƓjfЖ˳9'7k`30^^]^b>~+%|p>f؛'rFVy9]+1&Ccfu%kLZ9M޽?MT@<ǠYg iqpmT' gD?~e a pV t@haQJ+41|TS+5*-kRt^Gx |+$ֳ8aE=Cs^QMkHK 3u6yI1}Z}eҮwOf839s)рm/Tuﮪ.1.͢S~b GD̓B?{~ v;f7oF;m2tTv#8#U~ɤW+{fRFG cz"LEC#ާUGㅝޤexxghyuqధ6䉒VƏVǏ޴4O˺둊 I{h;zsmœvsoXv/ёO8δ.EO OyQhp=әf㞩; Nwݛw[{;ce_WCX\8WjU*")XuG*tV(]q#着x>RkMJɉAz`B+fa4kr?FbF52)5/d,3)u؝0V&3}.IS9SAR>g7 Ck47*IVZ& [M=^\e&9Sؾe'Krupv4P'I8Z:yp|NbXDc!t#ē$|́󲗛sjc)q"vq 5Z͉".,-YS!d)3%R9\ZO=H]RM6)rbfY{a5 {20{`k.YC{˛q@QLǜ_!{ ׀O}̣ 3xXoogL\,Ln.ri,n%P?4nkOHYzvY{q37^?E{%+Dg Ee Ѕ~/17z.xCџz식=|Cy򋟠1OmO-:YkwW2I%>z77æC=Sc("emξ5ivA+G.} m>i$mn<8F\B$=Q9gۈMSPƾYԓc%׃VI\XSލQ#=W`AqIp:K$8.0 
Div%'&`IIo 30ݘUރgi=6pF{p8>XF3+/Ƀwj V33 ʣ!~W7&)L ]W]X EPӺ8(!x]pI] fg~KM:ZSeD,s%P3Rj 0nJ5(9ǡ2;3`$r s-rќn  UOcCdCOtYō'i =}\2 pYҘooO[~]2Tyr1v`Y2b ݝdC'ʜZ"ʗ{[3[Ɵlɱv\gJS͹7ƛsUL;JQU8 qPs5Cjh!'Igv&AwhG(0.$۶d TX?1}e/,t̟p%c?a`q{FŻ?jY':rk@*14FSwns9h<#L@:¢Ff>3gp'lux<;rM؂El\Y*fEË}19~JlWZ϶̣vyX۠RiidiTƽwܞ`jR퀦8]||F߻rscl-l>ݣ&m[:g3$rYd^ZrB*uzB Sh`i[EGL)mOZ;>W'ϗN&FG>yyMΏ^%椘o?ɣrfOccMճ]SP$G7j(/T1` 7z.0ni߽bUzȕ6?;=쯿t}6/[[qNFo݌"":%RY->M֗b.sucI1Ɗ4u$JjDj3TzmO[rA|-WJ dXg\/;wC b&م9mľ>}T*՟OeX͠"P$a(K*"رI7:xR\+TrcB[hO/0e͚)Pq!G'[$2ͭ"ϞKHGen2p8=ѷؽ^dhq!l̆T>rkτ')h={0ZKTX%TCpT|"1:5IdώfCV"$e[Tf$Lghf ]<}4y/3AWdF S{V%ܳ{Fрt(DtS{@u">рE}S{,!xphճu'+xw=TQ).dьp<ǨV_o-n^tV0V1ۻ×Aid'v.}33r Ej>[X(e!\|!4*OI193||+ I ,ygμx1^R :/רd&3vX9(k$ӗ@~lv&ʝB,k:|MKYETFLz N%s9Ă+#VO۱])FXz)2b-=/eћwxߎܻZA֛#fQMݑHOPO@xy3X}dٻ]Z[K?jp\ǤHBӲ$.1;䡚dô6kLm5* ;R{!jLw,B/Ri 1{9,h Vuk%pA| ԉ3 ,)91Y٤ˌw4h {7BiD*Z IhU:=Gv˱I=:X0z 9B]L1 9?\q9]osR}/ż#t$5o8qĥ=G7nSh)jq[ L]1j`$fR05$>8Lr$w[D'i8@u+%V`)?s}62AY{Y!Hz'͇9]">;Ny?RoD,žhmT,KɋlfjkŞw|مvȔW8c9rSeOء.8=Uɳoo^zsRE9"fE9k~ۖ#(YĖRNwۋx{@ |Խ\}1*7mf,#HEW]\[瓚 k sHQ&%s-vT_EHh iB=f1Tab㠌}3N7Z_z>0>OK^ w[s_4W&GmrM;?U)n N_jܥSf't;| ×F :1p(:uyOu=hwmJe7[ kU)ۛJFuS*cB Iu38|c8hƣ 1MT辴fC}cJTjG ޗr(Dj(嶓4JBaǿȋ"ֿXn^&۸{ޫF2#xwÔH2}ue[]UsnLS3 b`-Xͥ]4+[I ڮ )]+:zS5FH}*%AkWubF;e%HSv҃8'Hqvy^Qfȩšj{[ GB3RvmE鍵~HRi4%eHKyhfXmXowĢ`}o+{盍M_K.O643l'C;N{s?n"o>7ˑZ,.]+tʊaeMJm|&5VmˈZU+sVbY-FzJ{pŚD;Z>͞q1*WtojԳ2ApT O\ڵ2 ʮ[4I(RIS92޳&No9m1d@h/*^4pdp:'?C8Ɖ!mmA~>I,%@AGq_=óCÎ:&A'_όPB?~lYvvE`H-`FZD>OZ0gTK[oH8G=KHKD)mhB6WH="q?rD}U#򧿁9L\ !c[ql/"`n =y-C_6@goHY$~z\yJb֔(q{mWL?ʟŷ ߏ/jh aKr f+>;gq/1JTF]MU6X3㪕ߨѯa c^g5`{+!5# G6 VA5^'s؍8|/-8P)K ET")7cO4P.4g+(UVV*:SXjj k~91<QCRhhx ]r9BuJ "4"Dj)(;tDH'yU,P&1I3]sWXTe<\P)E3f (At-T쿅HY ^h%l%ɟ#)jhMuB8% skX(;7g$8?tKD1JS ts/bIҔ2k=iA\H.f> hB@/t,d%!f0E%\rf0SU>8\RI(`(.[YLOLJHI!ؚ/]FJqc.RC3u.Th#\gjXr#;.MR'f~țq;\_gY-"z._>>N]A@u` ~x?)BNQJ:ʽ\O͢]-?:N gna(F:Ӷ+am[ZhNU.ummݰ3ꄶQǺ_b9Tmۺw׺А\Es=2Z8Kee|uq.NVd72fHa :*S d,GR"bTIl\+%?+c`CaYoRHFZ?NRCpкusWϞKKJ _j(r h j h*[-w0nU_UÇv$x)H V0}-bĊ}ggd/`_8BNJ5G=@  =$ISzppz޴ Iu7 J#W{}Ӽ2,iv{R>9~PF7,xYZk)>zmZJ TϺ,֛۱ 
{S_p (("$ a8$5ELIZhҲf9W"_˜H6 ԘHDiv' EOɼ0lsДHв'ijz!UaN Rz b*&M[,fj=4.<gUܕ*CqW~buY'$CLv$>yH\Vڎa0haYː`z1.A+te%NX#/"Ce/sӝ_> i?F8l6{sz\4([<9J"/)z{0?<#\:ZvzD#։b F]juuڿphD!ʱS U]T#R";gpGXOu(ĠRc6$n}," kscrT GB3R 6H!MrJVya.H 9>+oЮאCf@R*)ZJSb)H̕En%!ELh9$&(l'H!bHj_<=߻xc P9DP |jQ<]r*<]pG}А\Et 7,h0@jǽֶ-:t+*$<~NO>D#=L@.F̫+7o|{lK+ߚK9lkԛ&; Zv`ua}/}aK ){#HgJWةp~Qn~a}}atY'Ύ/M+ X٠rޛ2%dR-O*.%Vmzdn>iKRQR!%ͻM@$C럈UbM}Ѹ{̊n|R=x&إW`Z n^JVKYe"HNO]CB՞LJyxyPe$*/K`?â҇za,1:$+ՉmW/UƬBalMXLw{LI)zMYoډhDވq\΀sF|b/fk;]]r>p}f4uZދ+τAcDaA=r΄WӺO=фB"zDY1 ٣l%d6Qev\3H!rC؊}|OKsv }y]_0;"AepT(#(s`@RŒvPSpeZ4ǖCt]5/O~V$%HK$KR%j007b/[&X'Eݱ-N~7_,P3EtLE9uv(@N\JԃD'O#/ Ks5D9u1z+^N;*5O.WO9/4r\q80{}iН4k"W_bѧysiNH[ɶ8;o{DaU=ƜP٣;oh4L58^ d2 9o D1Λ8äUSv`ۅ\NhHy94mz@ZVd@},?=ָ%v.,zT]]u@ˆ½;V+H%EѬywIͯ&jujVJ^\BHC]w8Ќ 182 Y&0O@d CTvq2Mդ׋)O(=pu>h($|$"#St: 9B ;"E%j~ !6[<Û^nӹj k S̻^}%ӌ F)Lfտ#ʙȊ_iKrUDh XAtoV =e.#L02II0T")3bjckXTt9㘧A\VBMݦ~&wrԑ[t `Ɏ )|<[.&-j*Ჲhĭ$ZZvKE&MT} l%t1{p\:CYE4]al#u) ݐD !yQ{8U3qI']^-]JL_yJ3Y*=J@CJQ崫{4g3%l-_A}Zwucd-3+h яΖ Jpg\BLA|sqy O ~+ePPu[(Ҡ3W(krFt>+v=X=8J0S= WC:yzB/u :/9ĬSJ~ t/z*%*ZtR{}Q ΨKuߍer濛l2s57Nm(?_u&],dXwc)GP{7d߼Oad}k o.^y^|{󗙱O?x_po޸W&,&evЌH h&)F1rf23k2(wxn=k3̸̹Ĩ1@f) C%ĺ! QŒiim&1Mv0Kkx\{b u(o\ٵ^ כOxv,"5$9'"c}YBBX_fK?<߹LCq!Lie708+Fwla:X{O400wcdB(%-ױCWi&`ٶߞMS W67ntFL+qD1׼ASS-53,~YԶҺVvի16Lo { HM[uO >*&Hk]cWYM0؛p׮v(ƚ0q s}xLJmXo -N|@zC\ gSdQa KMɩ`݂r͞=pt!sxݧ/,TKH=~u2ֻ;o޼x2P,~{W-iSv EiH3ڶRF~s:ר(!zzt˻IAw{2tB +Sc-1qBa ]_X@;]DU?dn¤QULx<&[Gh$ԇianz񶺞*]z'VQ_f>et:'y(LoJа\;H K(RL&xw|i9[\6QsC}>|Ɯ :fޞ1{SPi{Vg5r??%5b)[6 NqV\R %/5LA0(kR0)%`Q(բn^е䗜Kǩ-83e#J* *ײ(!ذ ,V{'R[M}%ބ]4QVM2$/Nh0H uˆe+e(9bƸgiX UZ\21DH\2)<BF]<Ω2--f;dā=~v{BscWr>߮=(B|jg;dhɩ8 S"[a^07cȐ+>@9VN6O,OL剉M'&W;(Y$D[|zOJtw~>s_<ݹ)#ՙF59wε+@nI+w!5H*uC:")=珆jʚ[ZoUx v.v exܗ|)HSa +Kr–Iq,-kK,#SĿ_3)9e7 x”F5!k 5(X32-?7/*Vtnm0Ma%qܖy'cy[[[ۺ-z+&KDJΌ25UL)dsR eΐDڜ9hF AZڼA}3yzz.Wb_fiQӗA 6m6_;]E{V"ݎ}hh}_ )0jfz!D VhZ&#*ȈFd9Ec0ATqZ6.&rQnݻAqGop}p=}duc |ZhNy0=E7M*77՟w3s &]dN'3Et8yh-. >U׌jO%F,xkE]-SǨ+D6VqIH! 
1{* xn å&F°<%7Eq)Tb")\Ŭ.BHd4!Զ#r^b;t|rbQPIUP<*-W8Zaٳ cAn?!ƸJ.V/wp}lr\kwuxў{a!T كTJsB<?C),A"Ms˝jfw)>qVYhc;dh9J,?\$tGn5lX6^7l^&L[hN@\T40]F<ѹn^ݽɺ;kRvLJv~͇xs-#ZF%]U f}uuB4ĈdPڍudZk)$ԍN-Ս Q&T38:3j-s$(No0ô/1ɎI qнzеʃ*wԗm\ŵޒ737 HyUchSm'pu˄Q9hm2uAV"yN ϬP cV(+HZ@W8;3m*R WKgu" O<7zuѳv^Sla~i,䕛hMծzm(ȻZ W-Ӊ}GwP(̻Y{zM4ƦTU7BG:=zX |L'!ޭC~S;nW=[ y&eS #;Ļ !b11wxuݾN;kxzMlRJ&ɯ%'RPmr6S$&6++z[!Un cgBq a*g{,*!8iT'3>{>hQ Vg!UXeJL< % Y&QK 292-kBcn aYͮn*xSd1RcU}Zhy8!Qy#]pSfB(KfKJWY\[a˜ +2YDKN`bSB\+TPIx-`pųQ 4\0qU->I+[9/vZWT&sjteU_*w0pa&DU j]ZAS jIaŗXTj@Br54e1:/D!e AEYr Զd513w ˭Bx0_o砗/,#G>w;F`sx^d㟾zLa֟ߐwϯ*"JY[G?(\_D,z-48W͗xС Q@v7Bp<\Kꂙ.Ϳ]i"YV;'&~'͏Жء+\jל gr5=_{BPS̈́[(OR 2V]XteL O`\>n)' )zXŁ%=v@$&*E9g-d.5 ͊ dPLƲ̑\zk,ԊBIiF84ڶ\ݼ&;< ]־0w]MhyX*dWTQ^R$Dt]ecrk5hdVrSĻ}A?(to4VeTˁ*^Y T^T9N$(uV }s}e_dp~ G`ʩa:Chr ZykvH3Dh­A9֒tsC+EwǢ3@l%UJ`7O, 1-n*ļy2ahk=)6jkDڼ{miqĉaR@h B'] pz˱1|a15*i6ք5zFWwՋ)iѴbQ<( ͺ{{еDO^(O^ PrSߣ *@jagk1RS.DmAEkЁG;Ȑd ,(!OV8v~#/p!$zj}gmM0 uT!Lkfn60Wm^$%(?6:|Bˌ |\'Ny5L8X@0?rԪF'^$^e"NkwLZ'׋,%K`!DlJ?fn0?!xX |L'!ޭ8v߶wOn),䕛hMI6¾w|-Ӊ}GwotZLkۻn),䕛hMut혷Gľ#ĻSD`DtGRX+7ѳmJx75WpX tXo¢^h$Ky-nvev|y z70^o]D`+kn_E嗻Tk j&ޤ2ӕd.I|udmI64TlKl ЬCC<xoǟ~BMżÞI_Rxp4KCŰkD8|(Ql,J.ԟ=ʥ^R֔i#)Z,mSbB5{#1nVsҋm /xZP oEF%j;LPuLDǜ1_"ʆ_mKުN5XI*d>l&L 긟PzP m/'=znnl]ul%"b%# يEPeqܻdq~IFZ1/ g;DcE)?s7z|"؟rFSyk-z\i9"7</#lo\hrT9NđZC?a!tQ˞}"Q7\ x-؞}AfE{gX"'P4Շ\X)ByL8jq7- ;F0pT nP}Phn`}sA,5J+mm< Kۨ$ o{6s#T!jȅ{ʍ('ZŤޥ-QCSěyt tQzojJU@u!z@N(_<ۨ=0/E=yBb4ULs֜ڹ,4Owߟ9` /HOp Dt}'dFR9~7]w*ji%\|Nk RZy2e1gƳC(#WƳCvNKLLhx !jfrdƓo[D`0iU E%4(I^DQmxKh6Jth/&H)uMFZ#gM6;FIQ! 
F13R|a#klnCD'K_NH44%szjLh`i,Cq(^;U+(Q (8j٦O OuTȼ}.]e0Vk`)M=~@ mғ6m|GD9DeÀ-,gir>_v!7 ˻3ί^A(36EwgIwƭ$j gwߨeeĴ*G Ce+Nמv 6ڑmgY0`47%>uOr׆9#/ EG:W@/gZWB-&QwRM `=T Mdt<0QvsE?_d){V*uw%'_|5}X]XنDmI q<SVmDM8&t^ZvGU$mCsN+սCkdعU{qkn1N^ & )PЃ:K5ڍEZb귚!B5dvIE{Y Rp\c:G ޓjE㡲"PR PȵA^pJFmcvzswL̮G[,O'醴wi.]_\͉τQV>Ng1o*iBq(=q!2FĔ3a9B]VFbN[0DF C/!@x/bڄ{OyX`pu3=g'd`r7|~Hh0O\d]OsRܛs5Ŷv!w[y4ob(!s'8|y?Kߞ>368JK<\t?>]~;<Rd1N :JZ{:5Ihr\ QA2T!٩x+70D>J&thXek'a9<>%ZPm9[Qm+/ 4C1ґ7Δ[J`MK~fp ؝D "3&2ZKLy'ZO|w,1:AH5oA5{ebR5&ieCk8A#iG c+'i.p;hWV#+xM]cĖrmN/.&E@oxB4ʿk;Źe2׶\n'Qo|n{݃҂n%$crǽ@AZ/ɠ}ޭyӢq LF腉CTk#GRhꀽbP(3~݃,S˽FTa/.4䕫h+2-V[3>ukA}6p[QY\[kFK[UNYGQԺ!py֭-%XnLǩn-n]h+WV:eӫM5*-кŠDu>cv=GzfZ3ZjݺАW;%uc [Dkp}gRJiqvo7mf츤3B[[a"Ԍֈ`ćPPlGy;AS;mOaJwSjxYG~{<xlԃ?||7&q2;`c n.C;4j&^|]Aڨ<``ֈGm$Bڷ$}U\ɜ_gͱA;Q4^([p F_^6e"\#y>Z5v~уlbV(t zLk4`zni؝6VndNޝNyU9 .wjgFBx׳jh2hݍbdRzuǹ%a+:-4}霓ܰlAj]xG3}R0]o֛[OxZe,}a!2 ߝB"P8 ~ {slmxȨ7M[4a*ǡ ){=9CsגVw@27 {5[渑"eYŬk  ez'7רMU}@,oBQpӄ0blSnAM}R@˽XLV,Uʱe>F#lis1e>1_2& r62B FFEzfF)TrQژi tY&DPQ2C p>`Q+)מE.%&7py(uLZste+8BPCs ܫ obp)%]'@$ dPDND?:͉Bp{:v99mN ?\`iuWꛅ9x}c(`e'jN'E >ށ/#:3wq_]s/Oit=-fC4Ň6eQ<tòoh M>Odz40ȄP'%?L&4Vflx$1D"VCHyhyx;x ʑnZ&(~JGB7r2p{1М ݮY+ /]2/<'V9sk"mȣEq$,Pbq8Y>npF3:15n-S++ɒ}ifVv%sxu'=U/޿{'PdC_$ВqNNBI JT ` (9k?9 `g%=QoU/˷ᾗ oA|t^=QK=Zϝ! m#Z4w&B{+̞~O[&)Pj3 ,^tJam ?{W#GJC/k%#(@/`^xeO>^w0URH)f88 bucӏm#cۈk/1{"G.|f)6f t $E|^ZHy% !pmkZ'}T y FD=|L]ڂ"eEjɚȂ!GR].ITu>e>{6 a\TkT Ʈl;uCЖI!95YZ%Kq@d7P(̑'J FbSTu˶*%~ˠzLAtj גk.dҒ6dNE&dd5!'WG\p+AЙy`km 0`6mWeC6 z)o3Hg%_T?aPi|=P W,1S«: CcbR}.s22 @*-&OPfM0r {JF[˿6orU rM zWm>ĕF?OUҮ'fъbڪ͖qKiް^)WU x7.j@a3wmU9;T!TF̔X {??];? 
Τ)v: PpGS84[xQF纣񵊿~k0A޿m_$VMm>~U;EL3Am|yRl d WwR/o]ø)C8bX\KZ]_"?e|f{pN; Uo]]vr>KN[Oz+88_}nIJw?±x\LH]8pT)adƷrɃq],\06ӭ\{yQ8Tq<%{[уm g!Jf*:W޳%Q?^:ۣ_A:'Gi&ݽ?פh=FM ᏍCDYruan3G/?3cGPlkLf~|EƕdLIl,Mtr?+/=p0ޣW iqbTRίB9u|=3oS$vڼdijO0 Jk8i*͞M:NP=5q0Q7.DyJZ)GԾ{YgCb/ YXr; q 6A::ӞI:5:]21:%^W+!?c=a rNphaH|Cn$R^JNq{H'Cu~uo\yP"~ߞg}jTP.zUS4lTdpvQG6=KsFw Mtr2xѼ\LsPYKZQ=[6TgX#]~x+rdX~Ǟs6]Fjl=t!VBB(mk8ͬAUfkcUOR38PJBNl-L$OE0Ԏcr ːt?O81(QD},HǮD=@.m 2IR8$#2u")2/|Yj^o-.>ݶ=td/zwEdbҦ8o#Ӽ}sZs|oՌ18̜]R}(VkŊ-=S>v ] kVְ~k)U$BGi^@ DT`F`S1d IJL -mK2pdHvPaZi1sh%T: [,hLUhrōdv$ ʃF3ETA񄡉<.-;c`bon)Q/NV{Z_҃X%N++GIFȒO,k\ho|1+&ud:-F6F9;oYn$GXIB`\Yc|R6xEEԤkՀhmXܽF<4Uit0Y@J<n%36(I[`b :` F/JIT#vD=h4qt' 6xtHr_Q<H2ZH H[[8ZK!eH ;#0\gge*%THa{q6#J %'k -1!mX*=W3F*=/ֵĿ%C'86& кg3hr4hE.d7 NЏv͚6MUƔfPJV[@JB G-_ ؒ*rx-lDk>ϗ>m~_]Hoۍ|fIݻ$砜d<‘շtqmoUo03rgo?'w5w$Eb^,?NIa0}676~~ _A#]lk:P566ME 7@oWĉμ9kygZՉb{ʊΤ1B 1 ';BM,&x)xޟjufr rRWE"mBKN\RY3 $eD4lO>%G&!eFs919j"F@jXڛtS]n;MzQMFYyrdZGO>6m:?ޡ9c2VR :A{V(^ "Ww*(8͵|w6m]C」V)CPso7ܓZ;=gFﳋ S+mnhL y,kPLJOJ}Dw{ e%wM-/߮dN3Œ5ܓ2B!e.&hy6jOa 7"h vY!|GiBV\VRe'ýM e Xr)d '2ךȳq.E!:=d{Z%gۘ޳h_(-* JՌ3rW]BL7{n-ozܔÍQԩƇ6U׎]1(^x;gN/p4)\c3 DR͓o1@Fr[nn sqySџ]dmnS]=\r'o^:B['K9 `r#3@Jr.vԬ1/XqKkw%Ϳ^[b\Co^|ékj94A6 [w,qxuݑ8)SFZW赌l:ޅME;2 yMrڔn/YO`섃5x;xY.q_>6U,svV@E9k.L%̌1TW?gҋ0Y7?]-}?pg=U%Y?ݝ &T';=(i2ƻKw*Fk@,:%ϜJ*LYLp1|˜ʗ DvR#j&bb Kc;E[azҫ'n<~YQ4`KK4&y'ݜ'ЏZ=|;zJ롣>*pwܒfk#U`"i4Υ&dpHFg"e1?kjj-^\-)?_zyuU,wц >gf Ϟ]bm.y zFv"Zbrc{Ňe^XKޒ7pCV=KG;/mpoZ$e–o[nt?o.|dfrWxx7 rKo |+o#Vy+ !Tb-; ą&1 &yGk/7{^I}c(BQĨ-dC=ƒPU!P [ ~%҂ڐULmrڡ':0k),-> G @f0RZ1iQ̐KԱܧjA`ܧHVbXƟz}:2ʹGH2Y'̺$'lHf#9rC`$mp}ԛf5`ޔү`7eԳ7僓41CjF%Y U]כE^(r+!3)f'm#IW~`ݗX`1O #OYݲdtwO/7ERY!7 ˒YqedDƥD[k%#5_.=Zt֔OұE: O4bRh=Zxp2m{hx ;V #G $|EEN ¿OEOf"W2W?CQ,-+1[{5MHIJr{^h+ &FY-zNyUDo_2Xs/E wGһu%M{wQKSU8:{`610@Mǫ+D6cIXJp'8R7F{)"nroבQZu/yoD2iym)w3)q01,IH{d3$3vd_&m'ժIۚ|{&OФ MW5.CӳܧgEy3#Q#̔Кu nMjgTT%QM~QoUg#ZiBm!~$P{x;ؓ{{]pӏݖν.~ԝs郉A{3gK5|[.V{+֌KJ%iSwuPc8 CDv5. 
38V7Op"<7?j`ӿ^o>)"}ޥeNlHZ ^=Eсָ]]<=Gdgs>Te!mlʹ`M6%\09N ˆ@ݛtWQGnÓrA2ՙ}v\vJ>^Jv^٨Ɛ?-Sh̆:[]=jM޲–䪱$CA<ʞE.߯C rͅ˾|YhՄK:~Yثc5H}ҷ-x p+gȜs[R`eޔ>:H9U DRڑcD&D E^e+ɺx0nYaa+(&6j|r(kj#~N/h~dmն&C$7ȍ('F,bIk ddŒHظ[ZV&bQ@97gd;g Ɨ@Z"=㈬}02 &$ Yֆ xemhpR@, E 2:Zxb1#ڢTauti"22"s)ZdvX E4Sϑ=鼀P_ve/(T`,}  "@^N4S|D֋Egc)mNg17BP9yrJ6Y'*=Qc*JfYJ(J*M6IFB!DD~ BM]sҨ7ww  ]gWG,EoT!Z!e=Y_OFIhmk #hb=119T=ڂ? ǂJ+!-ekg3v`)7,zq4TKQ2~e%Gu[W =iֱdUa2%9ȖK>M~N-ݲX-="Mӛvw[6V8%e3_hFc#;{{.p%p|uk\(:o9LNKwb79 ?^)I\ٵW>_#_-cEZ6oY )|o}*?T&2yB>0T.(9/E%l2apNNZ s&>?*>_8Z&nrժ48.WӪ^v{|M>t|ESg P+۩.d >Rꐻ^|+7v-'Sˍm5MHsp('r6: 2ѮF4FDl "(K"X'!H= >+>qAC!|@MA!)LzI6ȉ%RV'”C"<УY$3,~P̉ \poth}GS): $`mCb(Jtp4mjRbR]zXa=jp9b?*hCOpAv xeg$?x yP{lL/W Zu'_ԗ=Î(}FDG/G->ncQt8*J>:׍nnmky󷮮W 7Oonlz=9)yэuَO/ ƘP٭BOxY-|եmk)kpRTtc:~rc ?p v4Ͽ?TZi 7,b=)1-xⳠ]ǷhYwpIďmYoɽ#e^6ܐBENA]`NX YLd;v&tg4A<9 Xr?lj-A}kuGnAcu2gbB@^l#ak1HNcEI{AB%7oU&Nڬ ziAS6^DgKN[ -TT}BBG' YpxfX3vF -8OCCɷ58c؉IhڥoWo bXdÓ_n<ڱʡG fl CH<x14k UpNj.4˧`]na;cEz8."y$PGl8mkׅG . ^:#r!"6F<蝆hb;-HP"9IhH:F %R(,!+$נrc؉P1EB4ĕlR%IJyR /bjm)]jZJBJwA~j%%#d8{Y51zEtVL*D҆}*1tJd|zٌlrInDX#Y<z:)HթQ¨Nz㹠]b0NiőI3`>Vt2:, ̚"dt&ϷC@,4x_ׅ5]4505 \Z~bdv82 X9%ȋH 0&&E *'M:ߧ>?˫%#y$=A̝+,IN'V,)D%6x9Λ;W)Z>TO?:@BitV{PѲz@)F77.4we;.1=1V+U> 110GXubBB{DctcgRE|;odi0B֞g\mK"hOG]ɐmtyA3"; ɡP|,'/YMz;y"Y&p"Ps?7Z(7ar@쓟}aHs Ou.I:=ْ V*1Ps—ތiL\x%f hf"X;ɑ̛;3KwsE "hb#҆˅y9׫%v(*޵u$"m0d߫:b`&3OY}hK$MRv&)IPP>-1E_Uu]R U;sk+p<s273I)yQ[DAwCBW.).(Z1բ9SбYz(6FR5#ũ@e7iE^}Q߿]^|cfXyȢ HUQc'{TΖX I]B,(k*]"g.dΤ7"G5->'s|wG˷#UB%qFtaBauF) :nz; Q8Ǵɦ` \AfgB8BJav8B )4yP9Sd1: \oκ|tBr!@& &SH|d]@ 4֭35!æ982땯wb9%4E@2&f$vP^dR@CT D6?2z}ubg ,fLc]&Ƈp?orFtU}^Hv]Tyri{N5F-v-ZzzB8u}/H%];u[ݸq:G J@B/H,RЧnUjeN:^KwlIχ'+9{O;dzg?bm{sIu?6xNbZ$zTV߱F9g(hfKϟz#s^nTQv} (lr+ߵq0ŽߵR>xp\ACnUn9vNI&|Š64"d X 9Y7 JF,o\mMIn>ЛJʭ=s`Q&~O!$eIz kqxMx)a{3]X/K^-o [8` >: 'Eýmo?Ek^7|QOVCkM$I8Z3uAR$,G T]1}k>BArT\7ANV_r[\Dp㻏me=0_奯T K`,Ӊ1H[mCjdN҄6qvddRj$ƥ촔1iHUQKGwC%iSR<&8Haf~¶FR¶֤VpO)r"YM`2^1BH,ҹ\ԂV '9}>/ 80n75] 2Җ]ȶ{ǫ@wok+N"M&2͗#9sdG\Mh`'j szkD"y1^w9:â,' !f_/DGф @,UK1lom^Ʌeik_8Z/>->{ ֑5P"Ly0`{u Y@*H;MۢA/a.=:+|D!q͸,Bܑ99 
D/ +| eBd ܅KC$b7£ܑi-)bEiXpשn?m@kƭ\,zq C3*]էKbICo+$ZT/be[l@/yhPOOE1[Dd *)kqZ+yRАe k5Bu~u/tx"hrLe-jUg+Ӯ8L L`lNÝFReͯ戔6t7>LVw4ȃ UI(FRBFB"axeklveͅ<{nr>r JvXH.6#H9E,7Dw}I^V)9Z~X0Xv$O48S\U10Co8و@\̵vdCD/Dgه2^:&k='x&.Y@fe^TU9R0 rO/&nq[]KU s5Y &{w #t dzE[frc-s&#dNEb!p3v{->-}[i!gAyɲ iƁ̑8Ҝkr.T?nW0~u7R0V9ǎ->Vvo&:4}릃^Ӆ9A=QHap^s`?> 7<50e5UM IR8 ,p_8I^L‡zÌvkݑn}\Ht'0t%^ɺ{Y/!+/ x}6Wi8-K\gɇ4DV~lty?sт׹z2R%m>KV*5Yj 槛b5z!Y?:+k-V?s9ׯ?9/vr}yI tOu|ák}vϖ(@:F~?7A~֟H 'ΦJz2*Wԙ4jOh>eqHŭgB`7{WD<~W׋ׯ'wڦ:ʭlIG 9M;O:rYi:!&AO "l7}J0Gj`)>d=XxxovG-Ŗg}REC)ddImtKy5Wy+Xq˟5Gz7h Iֵ1|Xoc6a8l+QZ\6$H4YFM댚+ujCY+2 &䪧`=>ek2kOF#ɍ'#!i/fF%f?iq?sG#J|;9q͘;m(L? 6{Ճ_.Ǔٺkz\PV ?\?8zKn9 z{`uWW7,u͙Y r'Ӵ5}OØ2Oo~kG2i%Γvgz];8w{y;@KZ>@$k헵:IpxngiLݲ:s'`y&t[mǏZ[%Fz6Q?bb%^ F|t^^tjɋ[D=N/9RlmՖawo?n¬7Zc= y+S'FNu2淼&gkyM~sZ3>|- x+aݓ;-Vs;o\D[AžϴvkK&hwO֟Z?Pu!!߸)sfỵAic׌"*:~j־VCօ|"#SLDyꁫ|W򌏜/d !s%!X%4J(?;#:Z O:9~̾΅t>X}ƛƑǺI2ͫפip] O 7?> {Id6q.#ZH̽*!RqH 1)Zb5:ED=C铡!RrNYi#'UdK &l8mmC~;j  <ޔ[yi8'.~aD F/lZ g,A:\tvD;z"3E|I,2ak S`:.)d]_'4zȾPid0ϒ _< ggM{lK έ㧢D0~rս:4u}<}QLr ge,-NG7Zv]o >_[K^O!'^ΉjI ":G^)b"mE ߇ X/?^IȁDK؃gNĤ֩+s'37bux>wa& wȸ)!zOb:1}xSl0fE ׵6ԁ$)WlIOd3P9 }@GiNDfƢ4: 3 RH'-w܀\?TVO{7 |9n=@Gi ,h۬r0;Wo"gBaӣrjrrɻc3^ B* h oɌVC?};ն+}dH*K< |=CaJ0eV/]GE6 b uOKa&#zmQ2:t('y揷bizœ7qd@O$q,:iD6=~ee+j|hSlУ3o( yf4V>vcs ȪM/ p*7E/x0d̘d6)T8qS҄?7CZr.s p03A5yWng}Z9JL:n47{>"o=BKNP9oUZ4q`}Ca)quoH?oA ؐW˒q#c @rwّ4=ʟ1Ph GܰTuR{RMbtI"Y2|pnP̦Ec(C)E8®T$*S&m'@ 5?*fYVeg@G |u>rU2r/7LsrYnF:NMpf}ҝ`4򤳘7g :j0K@ֿoǜMiV!Fb 8D@=d2%-Eȭ\䛆 c7Ȫų|R7R5}9}+&|\ ϏK%&#UtStz}xxܺB~l?VQx~,ya,mߓ_V\oF*r_n[i8#z"Xz*,SZEz1S0 lie=5[n SahyO! 
Q5M6W \eQ]fc`NÔuhPzpЪZVs\^SmҴίjwECL>MY5ɢ>rTQ'fCLV0ru|LWbY:Obpĝf {xr|Pr3%@+EjUlۙI;9KhoB{ L/ &SwcVҎ=|EJPpǒ׏?_:kpb};:nԦo܌bR9ro^$?}y]|y?NjO$,y45Y%~:>&K;zU k9M-)SzNp'n߼w_-Z4*na?#W-1RS>zwK񕧖 ŻYeCNP|㷵~k;9csn񿓃~tNn6~E!G_wqGgocAyWe/-Sʋ$'g{}e륛_ zCaob<8`z);RBOFCFX yQmﵽy LoE[$}gxwgn+]Soۋ㧿}PŎIpFZO_i͞D yW03Иmc]41Hܘ/g7jڿ0Аtj6&9s-W&0QWk [A+`'it7Hp~a aߠql:4AX ;2Mn6Ca¹u_Ύ]7)j%L'4$X&9*{iͨmGv 1O]tYZls+: &\(R*gd+1MJ_:rɢrg7=i`mX4 )Ɠ`j(z"=!+yCss!HGFOneVK!рlm5]ì4((QB((k8%Ĉ!o_O^7.EdN(wd1,f+$ |&S }vd%HI:\?RceF It)fANNaB\JYo_(M2!jid9 &j&uJHDQMC 9_(I9dqs9*@Fm YVϚ5bPA7gk-z2zN(CyHzCmi( Zvmȹv$9QVMҠ%7#NdLR $z]:^܅RI])9PFpi6'.X0lFLjikmvu&R6b}NW1哦E݆cQ /ǒ$jf"@ѧL^|PҒR\^#2/@IEf4}xrߒQaߏ笾k~/.4 Iaz?/Rq]3`v3ږG*1 ٽ:XNTk[[jEv5N"ಖKڸ%e(]`)savpT gbԷ $Zvy'%Űȟ%3C+ cfшu0 | CrH!Pcz2C<)`B0 QB_oqB&X<1׍5eŜ+4yGvQ:`Xo lDqCĄ hW0$aX)E0J0TJ2=AAi ۻ$Vr mz#"e pݛ}ځDmi$ya%nɒVd) -vSŪ"Y}AQ*5`@) !yPf %^ -D|S' 6C ;EtLzr?w%"+<k':OaFNjɍ~|;Nm?fMo?Yi,ݰyά}V9^q׋;w}jһKt:xsof/cP»J ] FhXy71Ւ!L1nS%"$W.%2iK+ Em[ώ}w(Iqͅބqvoeێavkyء^Ȱ+m?[&Bj ( I`6h!_hïL)y[,OSaj)bib5Or5%s!"rF ь Qx.l?͌"Z)zFFO25#!Y`NQr^&BPłQEڣ0o=6_6`Ȟ๹zkJnE?PZq; rD)a\*>ѹ|M wA(E(N@NjͤmMΖ !DS2>:۪:BkcrNXrֶeˤ櫪6HsG)훻dDe}́cIky|r49Pm-s[fYܶ,"Z-zx[qn&-AԇP9)PxBLF'ITz##COJP}$y=Ғ?XmKr(AqH]Z%@}9 v !fߒD "VFP xF*(⤒\q.o@Pͤn]0S"Q0X|NhbWj] /۫VTKI_;kDAI׭XOS#$1}'wqk3@Jk05yz': hu3<~͓\b@=Z;x>RB$^?PT8[@'TM`5gRi VK5H'+TchHF?aˁ`JJN@qqPo|@l@p8g L 8ۈ0ȃ*Q)Ih"%nT ÉY2^4E)&ʩ6Hؼl\dTD)r^@ T {i.@RvjB Lp\xq4B\ss=aų@Q>%Ap&$Z l`Bjx $b@ǘD\*@Ј 2C,dvp|v8]L .f D_ Ɣ<їrnZIOd6LZUh灒)R}"ѱDT E?=c ~$TlBJ.'Cr,X?~xӞ[{=j.կEypgo&]%)1toowW9mq|c/CAFIƫ5@_/?z~ %}B9hZeF&*Bū. 
'ďl}{ӐFJhpZMaY{X~Ή3J4R[ϣI Go_7a VB{CUxBUA@h^nGf tnpO@_ R}nhd֖0^rdPbe.)#3+xEEFnEޭ?VMټ E%Q,Z1ۑ9FęxjNW¸1\ ]UMSQZd tP5/KQyaTd'Wv~rjAKbK*\%\{ G x=Ds+UḍȀ 8o"G+Z%K$CAP=p#u #uBSiEe1N21ց Bz&TsV)#+oeO Z!K\E#j2(bpSA#>Q W,ojv: ?׽) G`Y19f)eFrc{ke!ּ&%܍jJ+<ȀH@9攋]D5|a]hꓼbtFn3^٧x;}ShDMA@Vr^EUc'\Zx,`^nio~.Hաٷz8Z5qV( 6I4;_t@Q{Y4vk]_h /9bXtԙFz9r)@xk?r4c]U:QzpnIExBA+Li%ಫyÅ]_ƚKB Mܘڡ`P9ʍc`U0h-C\e^k>DĆh f~QX޵eo0@g($u2xs7cVs;jHmޤQ8'LjwY' 7Zt M/6hmf@D|is~'1fڋ\_{r)5/7_Վmr}jEh4ɬRIvC ^ѿk=GZLmFׂO+35K dCϞl/؜;O Ġ*L( G]%)pSmy]ЬUN.yz몃 !e;zN2 W}};f'ـ_>82A)'X38[^%[AH1V&^g,Q*sڤqXr \ZsE,'YRG|x!F&E:^"Q G}hb Bg|BYgDITn{[ƷQBd!O$jߥZrV5T9-]cZ6XFb}٬^/zĜNH4(bՇ,MU7#t\XӰ}Tbm]Q;U>- {e/.-&*>2z.L·Z%=[cGE-?&{jVH5XlG 8Db$Oz ָ茕 ł笽e-಻Ԣd~.t_93,'T}7H@ۥl+\tt;6(ewj ײ9?XbOo~+_h Ԫf6;ʼnًen]pibNIIơN&ٍTrwuomGhxJT˹\) ݛ~ 8̇ն_aU;}Aw<\{8c:;3eZk+z-. NvGٌ'ڊi YRv&BѐWtjךɏ.XºŠDuھu{I?kH߾[Kn֭ y*N)\ou}l5⢽M׈ya>ol$1N)P+M/D00qn4X扴4bp)'ѱ>Q -Ԣl3Jld]ON Ȗ›wVP"&\N<$;8 V\ےAzr+/w ]&~k>~OwC})ť?`"wC(gRNcCȿ;THT;rlL?d:[o)3l&5wDD_AQM/zWv:>:bA_>a~?7Z][o+F^ ]eHΐy*hq ^O8KdjWRNsY~RڶT͸-YDpm'>[,&pvtҀ3'?#8ЊsOCCt+.'5OCxd~ӵEJrtIˇH" l:ؤX`Z0Ľ|܍|{=lp"8㇏&aZ|)&1o9E·,'§+pm]8§ >8B[NixI3yuѵ`c4* =d sb`ك)jՇ\2+֒cՙ,B;jΩkl;-p&#x C>zJ8gwl.磻~Ҹ|o.AvWW'YjxУ璽G~7زUd/f#kɐM*#ci]2g[[W~ r{qfC5C]`U\&i+s ?YM/SMj+=dcN:`V~Cs`@ʩl4A֞B#~&|R,0)zpd(b| 쭷'ri4riY P?K#|.,8E *\Cr1R X)^)t& 쳠M1{0:-;%*B@(UY?ՄF*.(JBXE Y(Ķ$95N386?ъ3  N[pIs)Z`t.>aqt\ٟj = e#$YR. ljbzsw[hO9 N0루1V`$DfED,vJ,Ƙʮ)^P%!3E%2B:ؠ4Yc!G3,~ =/i+xM`IBlmB7@(5@(J*۴; >jԆ@T6WH$.ħDJʈQArxY ԠCk5}{Fk4Nfc Ә<ϜR[B&a8NE쀦H-}ۋ9X QGrE&y*$A2XbβTk#M*Yd\1:dPR0JQ)Ss&%yUT!IU 'Rw] ovar/home/core/zuul-output/logs/kubelet.log0000644000000000000000006345335115155042135017710 0ustar rootrootMar 13 15:02:48 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 13 15:02:48 crc restorecon[4755]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to
system_u:object_r:container_file_t:s0:c138,c778 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 15:02:48 crc restorecon[4755]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:48 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc 
restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc 
restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 15:02:49 crc restorecon[4755]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 15:02:49 crc restorecon[4755]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc 
restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 15:02:49 crc restorecon[4755]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 15:02:50 crc kubenswrapper[4786]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 15:02:50 crc kubenswrapper[4786]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 15:02:50 crc kubenswrapper[4786]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 15:02:50 crc kubenswrapper[4786]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 15:02:50 crc kubenswrapper[4786]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 13 15:02:50 crc kubenswrapper[4786]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.272717    4786 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275446    4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275464    4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275469    4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275473    4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275477    4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275481    4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275486    4786 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275489    4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275493    4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275499    4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275505    4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275509    4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275514    4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275518    4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275521    4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275525    4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275529    4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275533    4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275537    4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275548    4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275552    4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275556    4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275560    4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275563    4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275567    4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275571    4786 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275574    4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275578    4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275581    4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275585    4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275589    4786 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275593    4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275597    4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275600    4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275604    4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275608    4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275611    4786 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275615    4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275618    4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275622    4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275625    4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275629    4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275633    4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275636    4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275642    4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275647    4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275650    4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275654    4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275658    4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275661    4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275665    4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275670    4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275674    4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275678    4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275681    4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275685    4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275690    4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275694    4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275697    4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275701    4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275704    4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275708    4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275711    4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275715    4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275719    4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275723    4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275726    4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275730    4786 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275733    4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275736    4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.275740    4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276504    4786 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276517    4786 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276524    4786 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276531    4786 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276537    4786 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276541    4786 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276547    4786 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276552    4786 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276556    4786 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276561    4786 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276565    4786 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276570    4786 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276574    4786 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276578    4786 flags.go:64] FLAG: --cgroup-root=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276582    4786 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276586    4786 flags.go:64] FLAG: --client-ca-file=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276590    4786 flags.go:64] FLAG: --cloud-config=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276594    4786 flags.go:64] FLAG: --cloud-provider=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276599    4786 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276605    4786 flags.go:64] FLAG: --cluster-domain=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276610    4786 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276614    4786 flags.go:64] FLAG: --config-dir=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276618    4786 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276622    4786 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276628    4786 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276632    4786 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276636    4786 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276640    4786 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276644    4786 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276649    4786 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276653    4786 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276657    4786 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276661    4786 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276666    4786 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276671    4786 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276676    4786 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276680    4786 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276684    4786 flags.go:64] FLAG: --enable-server="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.276689    4786 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277915    4786 flags.go:64] FLAG: --event-burst="100"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277922    4786 flags.go:64] FLAG: --event-qps="50"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277926    4786 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277931    4786 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277936    4786 flags.go:64] FLAG: --eviction-hard=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277943    4786 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277948    4786 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277953    4786 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277958    4786 flags.go:64] FLAG: --eviction-soft=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277963    4786 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277969    4786 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277974    4786 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277979    4786 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277984    4786 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277990    4786 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.277995    4786 flags.go:64] FLAG: --feature-gates=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278002    4786 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278009    4786 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278014    4786 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278020    4786 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278026    4786 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278031    4786 flags.go:64] FLAG: --help="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278037    4786 flags.go:64] FLAG: --hostname-override=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278042    4786 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278047    4786 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278054    4786 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278059    4786 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278064    4786 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278070    4786 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278075    4786 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278079    4786 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278083    4786 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278088    4786 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278092    4786 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278096    4786 flags.go:64] FLAG: --kube-reserved=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278100    4786 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278104    4786 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278108    4786 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278113    4786 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278117    4786
flags.go:64] FLAG: --lock-file="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278120 4786 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278125 4786 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278129 4786 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278135 4786 flags.go:64] FLAG: --log-json-split-stream="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278140 4786 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278144 4786 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278148 4786 flags.go:64] FLAG: --logging-format="text" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278152 4786 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278156 4786 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278160 4786 flags.go:64] FLAG: --manifest-url="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278164 4786 flags.go:64] FLAG: --manifest-url-header="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278170 4786 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278175 4786 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278183 4786 flags.go:64] FLAG: --max-pods="110" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278189 4786 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278194 4786 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278200 4786 
flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278207 4786 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278212 4786 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278217 4786 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278222 4786 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278232 4786 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278237 4786 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278241 4786 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278245 4786 flags.go:64] FLAG: --pod-cidr="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278249 4786 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278255 4786 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278260 4786 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278264 4786 flags.go:64] FLAG: --pods-per-core="0" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278268 4786 flags.go:64] FLAG: --port="10250" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278272 4786 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278276 4786 flags.go:64] FLAG: --provider-id="" Mar 13 15:02:50 crc kubenswrapper[4786]: 
I0313 15:02:50.278280 4786 flags.go:64] FLAG: --qos-reserved="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278284 4786 flags.go:64] FLAG: --read-only-port="10255" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278288 4786 flags.go:64] FLAG: --register-node="true" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278292 4786 flags.go:64] FLAG: --register-schedulable="true" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278296 4786 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278303 4786 flags.go:64] FLAG: --registry-burst="10" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278307 4786 flags.go:64] FLAG: --registry-qps="5" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278311 4786 flags.go:64] FLAG: --reserved-cpus="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278315 4786 flags.go:64] FLAG: --reserved-memory="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278320 4786 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278325 4786 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278329 4786 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278333 4786 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278337 4786 flags.go:64] FLAG: --runonce="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278342 4786 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278347 4786 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278352 4786 flags.go:64] FLAG: --seccomp-default="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278359 
4786 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278364 4786 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278370 4786 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278376 4786 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278381 4786 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278386 4786 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278391 4786 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278396 4786 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278400 4786 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278405 4786 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278410 4786 flags.go:64] FLAG: --system-cgroups="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278415 4786 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278423 4786 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278428 4786 flags.go:64] FLAG: --tls-cert-file="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278434 4786 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278441 4786 flags.go:64] FLAG: --tls-min-version="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278446 4786 flags.go:64] FLAG: --tls-private-key-file="" Mar 13 15:02:50 crc kubenswrapper[4786]: 
I0313 15:02:50.278451 4786 flags.go:64] FLAG: --topology-manager-policy="none" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278456 4786 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278461 4786 flags.go:64] FLAG: --topology-manager-scope="container" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278466 4786 flags.go:64] FLAG: --v="2" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278474 4786 flags.go:64] FLAG: --version="false" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278480 4786 flags.go:64] FLAG: --vmodule="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278486 4786 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278490 4786 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278583 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278589 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278594 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278598 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278602 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278606 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278610 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278613 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278617 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278627 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278631 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278635 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278639 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278642 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278646 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 15:02:50 crc 
kubenswrapper[4786]: W0313 15:02:50.278650 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278655 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278660 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278664 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278668 4786 feature_gate.go:330] unrecognized feature gate: Example Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278671 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278675 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278679 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278683 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278686 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278690 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278695 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278699 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278703 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278707 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278711 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278715 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278719 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278722 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278726 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278735 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278738 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278742 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278746 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278750 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278753 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278758 4786 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278762 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278766 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278770 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278773 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278777 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278781 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278785 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278788 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278792 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278795 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278799 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278803 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278806 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278810 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 15:02:50 crc 
kubenswrapper[4786]: W0313 15:02:50.278813 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278817 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278820 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278824 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278827 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278831 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278835 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278839 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278843 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278847 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278867 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278873 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278877 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.278881 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 
15:02:50.278884 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.278890 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.291970 4786 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.292017 4786 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292146 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292158 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292169 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292181 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292192 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292204 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292214 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292222 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292234 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292246 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292257 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292268 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292305 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292314 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292322 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292331 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292340 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292348 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292356 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292365 4786 feature_gate.go:330] unrecognized 
feature gate: BareMetalLoadBalancer Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292373 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292381 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292389 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292396 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292404 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292412 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292420 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292428 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292436 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292444 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292452 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292461 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292469 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292477 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292485 4786 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292493 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292502 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292510 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292520 4786 feature_gate.go:330] unrecognized feature gate: Example Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292528 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292536 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292547 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292556 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292564 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292572 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292580 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292588 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292597 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292607 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292618 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292627 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292636 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292646 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292654 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292662 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292670 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292677 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292686 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292693 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292701 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292709 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292717 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292725 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292734 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292742 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292749 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292757 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292765 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292773 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292781 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.292788 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.292802 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293074 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293089 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293131 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293143 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293152 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293161 4786 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293170 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293178 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293186 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293195 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293204 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293213 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293221 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293230 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293240 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293253 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293267 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293279 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293291 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293300 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293308 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293317 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293325 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293332 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293340 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293348 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293359 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293368 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293376 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293384 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293396 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293409 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293421 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293432 4786 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293442 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293452 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293460 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293468 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293477 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293485 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293493 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293501 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293509 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293517 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293525 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293533 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293541 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293549 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293556 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293564 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293572 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293580 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293588 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293597 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293606 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293613 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293621 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293629 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293637 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293645 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293653 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293661 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293671 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293681 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293690 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293699 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293707 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293715 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293723 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293731 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.293738 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.293750 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.294018 4786 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.298911 4786 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.307422 4786 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.307714 4786 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.312061 4786 server.go:997] "Starting client certificate rotation"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.312113 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.312280 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.338964 4786 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.342163 4786 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.342440 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.360756 4786 log.go:25] "Validated CRI v1 runtime API"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.401503 4786 log.go:25] "Validated CRI v1 image API"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.403585 4786 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.409461 4786 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-14-57-25-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.409492 4786 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.427175 4786 manager.go:217] Machine: {Timestamp:2026-03-13 15:02:50.424567229 +0000 UTC m=+0.587779080 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f3ae1b88-a7c2-4500-a284-224d87cf19ab BootID:69102ab2-c57d-44ef-8cae-daa07cf79399 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:13:3f:67 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:13:3f:67 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:98:9c:c3 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a6:9f:47 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2c:04:e8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a7:c4:95 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:65:d9:9f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:3b:73:3b:95:0e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e2:83:db:46:49:eb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.427422 4786 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.427539 4786 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.427804 4786 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.428015 4786 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.428049 4786 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.428257 4786 topology_manager.go:138] "Creating topology manager with none policy"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.428268 4786 container_manager_linux.go:303] "Creating device plugin manager"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.428801 4786 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.428832 4786 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.429927 4786 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.430014 4786 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.433779 4786 kubelet.go:418] "Attempting to sync node with API server"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.433800 4786 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.433815 4786 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.433828 4786 kubelet.go:324] "Adding apiserver pod source"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.433842 4786 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.469025 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.469834 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.469879 4786 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.470461 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.470606 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.471225 4786 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.473048 4786 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.474983 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475024 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475038 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475052 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475074 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475086 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475100 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475121 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475137 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475152 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475170 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.475183 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.476110 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.476804 4786 server.go:1280] "Started kubelet"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.476973 4786 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.477163 4786 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.477957 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.479032 4786 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 13 15:02:50 crc systemd[1]: Started Kubernetes Kubelet.
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.480717 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.480803 4786 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.481339 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.481520 4786 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.481607 4786 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.481631 4786 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.484121 4786 server.go:460] "Adding debug handlers to kubelet server"
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.484697 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.484832 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.484898 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="200ms"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.485163 4786 factory.go:55] Registering systemd factory
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.485188 4786 factory.go:221] Registration of the systemd container factory successfully
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.486041 4786 factory.go:153] Registering CRI-O factory
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.486066 4786 factory.go:221] Registration of the crio container factory successfully
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.486149 4786 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.486181 4786 factory.go:103] Registering Raw factory
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.486201 4786 manager.go:1196] Started watching for new ooms in manager
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.487024 4786 manager.go:319] Starting recovery of all containers
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.485604 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c6ecac6e8de39 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,LastTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499536 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499607 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499628 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499647 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499667 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499707 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499727 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499748 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499770 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg"
seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499791 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499810 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499828 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499845 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499956 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499974 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.499993 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500012 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500030 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500051 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500069 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500088 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500108 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500137 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500164 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500182 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500203 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500224 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500242 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500262 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500283 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500339 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500368 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500414 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500457 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500476 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500497 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500515 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500533 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500551 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500578 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" 
seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500597 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500615 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500633 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500650 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500671 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500693 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500713 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500731 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500778 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500799 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500816 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500895 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500939 4786 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500960 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.500980 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501001 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501022 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501040 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501058 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501076 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501095 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501118 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501137 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501157 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501174 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501193 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501213 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501231 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501249 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501266 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501285 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501313 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501339 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501357 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501382 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501400 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501419 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501440 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501459 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501479 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501497 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501520 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501538 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501558 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501576 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501593 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501611 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501629 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501648 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501667 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501687 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501705 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501725 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501756 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501775 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501794 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501813 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501832 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501851 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501911 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501931 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501949 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501969 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.501990 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502016 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502036 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502057 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502076 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502119 
4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502150 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502170 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502190 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502212 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502237 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502256 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502284 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502302 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502320 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502338 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502355 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502376 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502395 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502421 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502448 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502465 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502489 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502506 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502523 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502540 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502558 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502576 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502595 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502613 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502632 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502652 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502679 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502697 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502716 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502732 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502750 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502771 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502791 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502810 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502828 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502847 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502902 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502924 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502944 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502961 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.502981 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.503000 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.503017 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.503037 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.503056 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.503073 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.503093 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505168 4786 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505214 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505238 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505257 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505277 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505297 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505363 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505383 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505402 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505421 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505440 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505462 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505494 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505514 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505534 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505567 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505585 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505605 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505624 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505659 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505677 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505694 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505777 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505797 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505818 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" 
seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505836 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505877 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505903 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505927 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505945 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.505966 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 13 15:02:50 crc 
kubenswrapper[4786]: I0313 15:02:50.505985 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506003 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506024 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506043 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506062 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506082 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506101 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506121 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506140 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506158 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506177 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506194 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506223 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506241 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506258 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506277 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506299 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506317 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506339 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506358 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506376 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506402 4786 reconstruct.go:97] "Volume reconstruction finished"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.506414 4786 reconciler.go:26] "Reconciler: start to sync state"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.514272 4786 manager.go:324] Recovery completed
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.526375 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.527787 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.527843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.527886 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.528791 4786 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.528807 4786 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.528885 4786 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.548301 4786 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.548954 4786 policy_none.go:49] "None policy: Start"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.550213 4786 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.550247 4786 state_mem.go:35] "Initializing new in-memory state store"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.550691 4786 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.550736 4786 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.550764 4786 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.550973 4786 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 13 15:02:50 crc kubenswrapper[4786]: W0313 15:02:50.551574 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.551643 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.582588 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.602140 4786 manager.go:334] "Starting Device Plugin manager"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.602306 4786 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.602321 4786 server.go:79] "Starting device plugin registration server"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.602820 4786 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.602837 4786 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.603341 4786 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.603427 4786 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.603436 4786 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.614742 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.652037 4786 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.652159 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.653462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.653501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.653515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.653668 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.654091 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.654162 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.654758 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.654796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.654811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.654984 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.655098 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.655195 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.655788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.655844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.655876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.656012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.656136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.656172 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.656574 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.656718 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.656778 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.657030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.657061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.657080 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.662550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.662596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.662614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.662590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.662666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.662684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.663488 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.663677 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.663744 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.664755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.664783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.664797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.664906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.664934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.664945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.665067 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.665110 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.666020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.666051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.666062 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.686285 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="400ms"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.703493 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.704593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.704670 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.704689 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.704725 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.705247 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708160 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708209 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708245 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708311 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708392 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708449 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708529 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708560 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708589 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.708813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810710 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810827 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810842 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810962 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.810973 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811043 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811056 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811034 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811086 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811083 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811121 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811152 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811169 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811129 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811295 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811330 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811366 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811398 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811431 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811458 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811528 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811548 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811433 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.811655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.905918 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.907972 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.908034 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.908053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:50 crc kubenswrapper[4786]: I0313 15:02:50.908091 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:02:50 crc kubenswrapper[4786]: E0313 15:02:50.908768 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc"
Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.001322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.011277 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.027771 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.038288 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.043424 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.055979 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-bb9183c2443bd2fddeddc80e32aeeea79830fe66ab1f8c834e3aa6d7a8e35967 WatchSource:0}: Error finding container bb9183c2443bd2fddeddc80e32aeeea79830fe66ab1f8c834e3aa6d7a8e35967: Status 404 returned error can't find the container with id bb9183c2443bd2fddeddc80e32aeeea79830fe66ab1f8c834e3aa6d7a8e35967
Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.058684 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5a110c5cbee02fa8bda0f42c4d62f40d7bd68d12f3cc49583542dc4561a00366 WatchSource:0}: Error finding container 5a110c5cbee02fa8bda0f42c4d62f40d7bd68d12f3cc49583542dc4561a00366: Status 404 returned error
can't find the container with id 5a110c5cbee02fa8bda0f42c4d62f40d7bd68d12f3cc49583542dc4561a00366 Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.065113 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ff05afdd1830de8e601857cbfd97d7a8a8312464d3426cacf6e7bd51a21baa65 WatchSource:0}: Error finding container ff05afdd1830de8e601857cbfd97d7a8a8312464d3426cacf6e7bd51a21baa65: Status 404 returned error can't find the container with id ff05afdd1830de8e601857cbfd97d7a8a8312464d3426cacf6e7bd51a21baa65 Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.067631 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-89e1492ca574d036d641b36b59ea12a5bafd9229d38860eb7679bf0f1297c7d1 WatchSource:0}: Error finding container 89e1492ca574d036d641b36b59ea12a5bafd9229d38860eb7679bf0f1297c7d1: Status 404 returned error can't find the container with id 89e1492ca574d036d641b36b59ea12a5bafd9229d38860eb7679bf0f1297c7d1 Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.087114 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="800ms" Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.285765 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.285910 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.309460 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.311502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.311547 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.311577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.311606 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.312170 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc" Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.350489 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.350562 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection 
refused" logger="UnhandledError" Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.479533 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.555728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee521934cca71968db2d2d05c8ea400479665dd914fc40d385e85d07a3875c11"} Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.557022 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"89e1492ca574d036d641b36b59ea12a5bafd9229d38860eb7679bf0f1297c7d1"} Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.558501 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff05afdd1830de8e601857cbfd97d7a8a8312464d3426cacf6e7bd51a21baa65"} Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.559612 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5a110c5cbee02fa8bda0f42c4d62f40d7bd68d12f3cc49583542dc4561a00366"} Mar 13 15:02:51 crc kubenswrapper[4786]: I0313 15:02:51.560613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bb9183c2443bd2fddeddc80e32aeeea79830fe66ab1f8c834e3aa6d7a8e35967"} Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 
15:02:51.865686 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.866101 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.888655 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="1.6s" Mar 13 15:02:51 crc kubenswrapper[4786]: W0313 15:02:51.987441 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:51 crc kubenswrapper[4786]: E0313 15:02:51.987549 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.112837 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.114419 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.114662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.114679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.114717 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:02:52 crc kubenswrapper[4786]: E0313 15:02:52.115254 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.467624 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 15:02:52 crc kubenswrapper[4786]: E0313 15:02:52.469025 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.479617 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.566186 4786 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09" exitCode=0 Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.566295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.566308 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.567460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.567523 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.567549 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.568760 4786 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed" exitCode=0 Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.568814 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.568890 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.570314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.570354 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.570371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.573434 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"58df4970b9baef282947f53094d366c3d084a2cd4886c00ca6bda5daf320ab4c"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.573495 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"700a9a9f33a8704d0371a7b296f4a53316f862a41bfc10c96dc1f18877a9a03c"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.573522 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f3794d66cc8db9fcb32ef9b7bf307705985824b83d4eed2e2dad00bf1c4eb1de"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.573465 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.573549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4308da01317dc8c0bb829355fb3a6b66e96afffa93ac3f63f531c1405b22fa98"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.574649 4786 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.574679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.574696 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.576932 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8" exitCode=0 Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.577016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.577084 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.578300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.578345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.578380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.579981 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8" exitCode=0 Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.580022 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8"} Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.580134 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.580263 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.583741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.583905 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.583957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.584219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.584262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:52 crc kubenswrapper[4786]: I0313 15:02:52.584289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.478947 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:53 crc kubenswrapper[4786]: E0313 15:02:53.490141 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="3.2s" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.593602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.593654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.593675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.593694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.599117 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8" exitCode=0 Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.599222 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8"} 
Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.599326 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.600554 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.600591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.600605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.604731 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.604815 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.608136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.608180 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.608195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.609586 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.610153 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.610504 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f9969ac4ffe3ab34121b0c9826d14df87bd9af2a61adaa8f384c60628f27e32a"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.610537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"401d58f4ad322fc52a0f282330b987459649fa571251a8dcbfceaad4346d4635"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.610549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7ca83c02c9c5b9d5492e1d45595c615398d17482f64c624ddfbcb42ca5c0f703"} Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.610997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.611023 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.611043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.611665 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.611687 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.611696 4786 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.715729 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.718343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.718387 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.718400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.718426 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:02:53 crc kubenswrapper[4786]: E0313 15:02:53.718983 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc" Mar 13 15:02:53 crc kubenswrapper[4786]: W0313 15:02:53.733587 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:53 crc kubenswrapper[4786]: E0313 15:02:53.733648 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Mar 13 15:02:53 crc kubenswrapper[4786]: I0313 15:02:53.853024 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:02:53 crc kubenswrapper[4786]: W0313 15:02:53.918013 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:02:53 crc kubenswrapper[4786]: E0313 15:02:53.918112 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.615950 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e233354253b92d55f0dcb8360f0ba14d7d436f0bfebdebb0eec8813142ef3e0"} Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.616128 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.617380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.617423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.617437 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619189 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd" exitCode=0
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619303 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619319 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619474 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd"}
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619511 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619554 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.619615 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620694 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620703 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620885 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.620934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.621540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.622023 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:54 crc kubenswrapper[4786]: I0313 15:02:54.622195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871"}
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634617 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634625 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1"}
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d"}
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956"}
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634695 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.634790 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.635622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.635646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.635655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.635901 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.635934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.635950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.992290 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.992568 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.994975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.995037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:55 crc kubenswrapper[4786]: I0313 15:02:55.995061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.002545 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.355045 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.644846 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22"}
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.644975 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.645062 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.645170 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.645201 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.646920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.692113 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.853338 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.853520 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.919193 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.921223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.921270 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.921289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:56 crc kubenswrapper[4786]: I0313 15:02:56.921324 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.648438 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.648560 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.650228 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.650393 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.650255 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.650487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.650524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.650434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.704116 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.704427 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.706575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.706606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:57 crc kubenswrapper[4786]: I0313 15:02:57.706615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.443711 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.650601 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.651544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.651586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.651600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.936626 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.936939 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.938664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.938731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:02:58 crc kubenswrapper[4786]: I0313 15:02:58.938756 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:00 crc kubenswrapper[4786]: E0313 15:03:00.614917 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 15:03:01 crc kubenswrapper[4786]: I0313 15:03:01.993637 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:03:01 crc kubenswrapper[4786]: I0313 15:03:01.993903 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:01 crc kubenswrapper[4786]: I0313 15:03:01.995487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:01 crc kubenswrapper[4786]: I0313 15:03:01.995609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:01 crc kubenswrapper[4786]: I0313 15:03:01.995637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:01 crc kubenswrapper[4786]: I0313 15:03:01.998652 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:03:02 crc kubenswrapper[4786]: I0313 15:03:02.663003 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:02 crc kubenswrapper[4786]: I0313 15:03:02.664319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:02 crc kubenswrapper[4786]: I0313 15:03:02.664368 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:02 crc kubenswrapper[4786]: I0313 15:03:02.664386 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:04 crc kubenswrapper[4786]: W0313 15:03:04.262442 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.262612 4786 trace.go:236] Trace[1079105257]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 15:02:54.260) (total time: 10002ms):
Mar 13 15:03:04 crc kubenswrapper[4786]: Trace[1079105257]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (15:03:04.262)
Mar 13 15:03:04 crc kubenswrapper[4786]: Trace[1079105257]: [10.002227702s] [10.002227702s] END
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.262656 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.269468 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.270462 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c6ecac6e8de39 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,LastTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 15:03:04 crc kubenswrapper[4786]: W0313 15:03:04.270882 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.270949 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.271036 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:04 crc kubenswrapper[4786]: W0313 15:03:04.272277 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.272343 4786 trace.go:236] Trace[1103948001]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 15:02:54.270) (total time: 10001ms):
Mar 13 15:03:04 crc kubenswrapper[4786]: Trace[1103948001]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:03:04.272)
Mar 13 15:03:04 crc kubenswrapper[4786]: Trace[1103948001]: [10.001947926s] [10.001947926s] END
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.272359 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.272279 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.276710 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 13 15:03:04 crc kubenswrapper[4786]: W0313 15:03:04.277068 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:04 crc kubenswrapper[4786]: E0313 15:03:04.277133 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.277404 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.277504 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.282542 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.282625 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.481820 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:04Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.667408 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.669181 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e233354253b92d55f0dcb8360f0ba14d7d436f0bfebdebb0eec8813142ef3e0" exitCode=255
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.669221 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5e233354253b92d55f0dcb8360f0ba14d7d436f0bfebdebb0eec8813142ef3e0"}
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.669369 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.670353 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.670403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.670421 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:04 crc kubenswrapper[4786]: I0313 15:03:04.671150 4786 scope.go:117] "RemoveContainer" containerID="5e233354253b92d55f0dcb8360f0ba14d7d436f0bfebdebb0eec8813142ef3e0"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.148098 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.148279 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.149344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.149404 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.149424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.206099 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.478073 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.483041 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:05Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.674625 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.675363 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.678158 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077" exitCode=255
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.678269 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.678250 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077"}
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.678481 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.678571 4786 scope.go:117] "RemoveContainer" containerID="5e233354253b92d55f0dcb8360f0ba14d7d436f0bfebdebb0eec8813142ef3e0"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.680070 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.680111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.680183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.680236 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.680281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.680299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.681256 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077"
Mar 13 15:03:05 crc kubenswrapper[4786]: E0313 15:03:05.681682 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:05 crc kubenswrapper[4786]: I0313 15:03:05.700277 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.355321 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.484369 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:06Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.682910 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.685776 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.685837 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.687586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.687639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.687657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.687682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.687744 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.687763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.688516 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077"
Mar 13 15:03:06 crc kubenswrapper[4786]: E0313 15:03:06.688813 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.853909 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:03:06 crc kubenswrapper[4786]: I0313 15:03:06.854042 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.484340 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:07Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.688501 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.689971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.690105 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.690183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.690817 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077"
Mar 13 15:03:07 crc kubenswrapper[4786]: E0313 15:03:07.691084 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:07 crc kubenswrapper[4786]: I0313 15:03:07.715576 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.450579 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.484328 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:08Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.690821 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.692108 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.692153 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.692163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:08 crc kubenswrapper[4786]: I0313 15:03:08.692690 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077"
Mar 13 15:03:08 crc kubenswrapper[4786]: E0313 15:03:08.692850 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:08 crc kubenswrapper[4786]: W0313 15:03:08.903170 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:08Z is after 2026-02-23T05:33:13Z
Mar 13 15:03:08 crc kubenswrapper[4786]: E0313 15:03:08.903261 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 15:03:09 crc kubenswrapper[4786]: I0313 15:03:09.484428
4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:09Z is after 2026-02-23T05:33:13Z Mar 13 15:03:09 crc kubenswrapper[4786]: I0313 15:03:09.693749 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:09 crc kubenswrapper[4786]: I0313 15:03:09.695256 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:09 crc kubenswrapper[4786]: I0313 15:03:09.695381 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:09 crc kubenswrapper[4786]: I0313 15:03:09.695474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:09 crc kubenswrapper[4786]: I0313 15:03:09.696204 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077" Mar 13 15:03:09 crc kubenswrapper[4786]: E0313 15:03:09.696558 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 15:03:10 crc kubenswrapper[4786]: W0313 15:03:10.191935 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:03:10Z is after 2026-02-23T05:33:13Z Mar 13 15:03:10 crc kubenswrapper[4786]: E0313 15:03:10.192024 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 15:03:10 crc kubenswrapper[4786]: I0313 15:03:10.484152 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:10Z is after 2026-02-23T05:33:13Z Mar 13 15:03:10 crc kubenswrapper[4786]: E0313 15:03:10.615067 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 15:03:10 crc kubenswrapper[4786]: I0313 15:03:10.677268 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:10 crc kubenswrapper[4786]: E0313 15:03:10.678769 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:10Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 15:03:10 crc kubenswrapper[4786]: I0313 15:03:10.679228 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:10 crc kubenswrapper[4786]: I0313 
15:03:10.679297 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:10 crc kubenswrapper[4786]: I0313 15:03:10.679316 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:10 crc kubenswrapper[4786]: I0313 15:03:10.679408 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:03:10 crc kubenswrapper[4786]: E0313 15:03:10.684189 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:10Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 15:03:11 crc kubenswrapper[4786]: I0313 15:03:11.482564 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:11Z is after 2026-02-23T05:33:13Z Mar 13 15:03:12 crc kubenswrapper[4786]: I0313 15:03:12.484105 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:12Z is after 2026-02-23T05:33:13Z Mar 13 15:03:12 crc kubenswrapper[4786]: I0313 15:03:12.484151 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 15:03:12 crc kubenswrapper[4786]: E0313 15:03:12.487951 4786 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create 
certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 15:03:13 crc kubenswrapper[4786]: I0313 15:03:13.483176 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:13Z is after 2026-02-23T05:33:13Z Mar 13 15:03:14 crc kubenswrapper[4786]: W0313 15:03:14.085765 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:14Z is after 2026-02-23T05:33:13Z Mar 13 15:03:14 crc kubenswrapper[4786]: E0313 15:03:14.085899 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 15:03:14 crc kubenswrapper[4786]: E0313 15:03:14.276569 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:14Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189c6ecac6e8de39 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,LastTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:14 crc kubenswrapper[4786]: I0313 15:03:14.483463 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:14Z is after 2026-02-23T05:33:13Z Mar 13 15:03:15 crc kubenswrapper[4786]: W0313 15:03:15.365418 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:15Z is after 2026-02-23T05:33:13Z Mar 13 15:03:15 crc kubenswrapper[4786]: E0313 15:03:15.365522 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.478938 4786 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.479139 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.480374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.480428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.480449 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.481282 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077" Mar 13 15:03:15 crc kubenswrapper[4786]: E0313 15:03:15.481562 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 15:03:15 crc kubenswrapper[4786]: I0313 15:03:15.483969 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:15Z is after 2026-02-23T05:33:13Z Mar 13 15:03:15 crc kubenswrapper[4786]: W0313 15:03:15.808901 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:15Z is after 2026-02-23T05:33:13Z Mar 13 15:03:15 crc kubenswrapper[4786]: E0313 15:03:15.809049 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.484459 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:16Z is after 2026-02-23T05:33:13Z Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.854448 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.854540 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.854625 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.854822 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.856482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.856546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.856570 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.857339 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f3794d66cc8db9fcb32ef9b7bf307705985824b83d4eed2e2dad00bf1c4eb1de"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 15:03:16 crc kubenswrapper[4786]: I0313 15:03:16.857596 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f3794d66cc8db9fcb32ef9b7bf307705985824b83d4eed2e2dad00bf1c4eb1de" gracePeriod=30 Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.484151 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:17Z is after 2026-02-23T05:33:13Z Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.684791 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.686286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.686338 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.686355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.686389 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:03:17 crc kubenswrapper[4786]: E0313 15:03:17.687552 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:17Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 15:03:17 crc kubenswrapper[4786]: E0313 15:03:17.691149 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:17Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.721577 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 
15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.722263 4786 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f3794d66cc8db9fcb32ef9b7bf307705985824b83d4eed2e2dad00bf1c4eb1de" exitCode=255 Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.722331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f3794d66cc8db9fcb32ef9b7bf307705985824b83d4eed2e2dad00bf1c4eb1de"} Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.722383 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34c2caa92970a47f32a973365c7f578f08cb80218cdd3dc3abfbbfcef84cc94a"} Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.722523 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.724009 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.724072 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:17 crc kubenswrapper[4786]: I0313 15:03:17.724096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:18 crc kubenswrapper[4786]: W0313 15:03:18.392915 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:18Z is 
after 2026-02-23T05:33:13Z Mar 13 15:03:18 crc kubenswrapper[4786]: E0313 15:03:18.393039 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 15:03:18 crc kubenswrapper[4786]: I0313 15:03:18.483654 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:18Z is after 2026-02-23T05:33:13Z Mar 13 15:03:19 crc kubenswrapper[4786]: I0313 15:03:19.482148 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:19Z is after 2026-02-23T05:33:13Z Mar 13 15:03:20 crc kubenswrapper[4786]: I0313 15:03:20.483808 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:20Z is after 2026-02-23T05:33:13Z Mar 13 15:03:20 crc kubenswrapper[4786]: E0313 15:03:20.615262 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 15:03:21 crc kubenswrapper[4786]: I0313 15:03:21.481076 4786 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:21Z is after 2026-02-23T05:33:13Z Mar 13 15:03:21 crc kubenswrapper[4786]: I0313 15:03:21.993732 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:03:21 crc kubenswrapper[4786]: I0313 15:03:21.993995 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:21 crc kubenswrapper[4786]: I0313 15:03:21.995663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:21 crc kubenswrapper[4786]: I0313 15:03:21.995731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:21 crc kubenswrapper[4786]: I0313 15:03:21.995743 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:22 crc kubenswrapper[4786]: I0313 15:03:22.483815 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:22Z is after 2026-02-23T05:33:13Z Mar 13 15:03:23 crc kubenswrapper[4786]: I0313 15:03:23.483567 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:03:23Z is after 
2026-02-23T05:33:13Z Mar 13 15:03:23 crc kubenswrapper[4786]: I0313 15:03:23.853318 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:03:23 crc kubenswrapper[4786]: I0313 15:03:23.854079 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:23 crc kubenswrapper[4786]: I0313 15:03:23.855734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:23 crc kubenswrapper[4786]: I0313 15:03:23.855770 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:23 crc kubenswrapper[4786]: I0313 15:03:23.855778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.282693 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac6e8de39 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,LastTimestamp:2026-03-13 15:02:50.476731961 +0000 UTC m=+0.639943812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.288942 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.296469 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.302737 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.308818 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecaceab9f2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.606935855 +0000 UTC m=+0.770147716,LastTimestamp:2026-03-13 15:02:50.606935855 +0000 UTC m=+0.770147716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.315771 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.653486105 +0000 UTC m=+0.816697926,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.322130 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.653509866 +0000 UTC m=+0.816721687,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.328655 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f59055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 
15:02:50.653524606 +0000 UTC m=+0.816736427,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.336523 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.654775987 +0000 UTC m=+0.817987808,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.343044 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.654805418 +0000 UTC m=+0.818017239,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.349237 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f59055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 15:02:50.654819768 +0000 UTC m=+0.818031589,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.358268 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.655824724 +0000 UTC m=+0.819036545,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.360238 4786 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.655852894 +0000 UTC m=+0.819064705,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.364706 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f59055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 15:02:50.655883775 +0000 UTC m=+0.819095586,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.368166 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.656107471 +0000 UTC m=+0.819319322,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.375508 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.656164072 +0000 UTC m=+0.819375923,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.381742 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f59055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 15:02:50.656184313 +0000 UTC m=+0.819396164,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.387802 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.657052275 +0000 UTC m=+0.820264086,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.393966 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.657073945 +0000 UTC m=+0.820285766,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.400681 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f59055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 15:02:50.657092126 +0000 UTC m=+0.820303937,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.406989 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC 
m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.662584174 +0000 UTC m=+0.825795995,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.414725 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.662608164 +0000 UTC m=+0.825819985,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.424137 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f59055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f59055 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527895637 +0000 UTC m=+0.691107458,LastTimestamp:2026-03-13 15:02:50.662621525 +0000 UTC m=+0.825833346,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.431553 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f479a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f479a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527824295 +0000 UTC m=+0.691036126,LastTimestamp:2026-03-13 15:02:50.662656086 +0000 UTC m=+0.825867937,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.438392 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6ecac9f54ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6ecac9f54ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:50.527878887 +0000 UTC m=+0.691090708,LastTimestamp:2026-03-13 15:02:50.662678226 +0000 UTC m=+0.825890077,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.447031 4786 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6ecaea04da2e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.065768494 +0000 UTC m=+1.228980345,LastTimestamp:2026-03-13 15:02:51.065768494 +0000 UTC m=+1.228980345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.453802 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecaea08385b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.065989211 +0000 UTC m=+1.229201022,LastTimestamp:2026-03-13 15:02:51.065989211 +0000 UTC m=+1.229201022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.460399 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecaea357718 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.068954392 +0000 UTC m=+1.232166243,LastTimestamp:2026-03-13 15:02:51.068954392 +0000 UTC m=+1.232166243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.466767 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecaea5fc0d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.071725776 +0000 UTC m=+1.234937587,LastTimestamp:2026-03-13 15:02:51.071725776 +0000 UTC m=+1.234937587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.474191 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecaea71b296 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.072901782 +0000 UTC m=+1.236113593,LastTimestamp:2026-03-13 15:02:51.072901782 +0000 UTC m=+1.236113593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: I0313 15:03:24.484307 4786 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.484697 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6ecb0c7be886 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.643996294 +0000 UTC m=+1.807208125,LastTimestamp:2026-03-13 15:02:51.643996294 +0000 UTC m=+1.807208125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.491850 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb0c8038d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.644278993 +0000 UTC m=+1.807490804,LastTimestamp:2026-03-13 15:02:51.644278993 +0000 UTC m=+1.807490804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.498944 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb0c835c8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.644484749 +0000 UTC m=+1.807696610,LastTimestamp:2026-03-13 15:02:51.644484749 +0000 UTC m=+1.807696610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.505406 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb0c8c2cdf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.645062367 +0000 UTC m=+1.808274188,LastTimestamp:2026-03-13 15:02:51.645062367 +0000 UTC m=+1.808274188,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.512046 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb0cc9186d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.649054829 +0000 UTC m=+1.812266650,LastTimestamp:2026-03-13 15:02:51.649054829 +0000 UTC m=+1.812266650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.518467 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb0d421278 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.65698316 +0000 UTC m=+1.820194991,LastTimestamp:2026-03-13 15:02:51.65698316 +0000 UTC m=+1.820194991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.525272 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6ecb0d553f90 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.658239888 +0000 UTC m=+1.821451709,LastTimestamp:2026-03-13 15:02:51.658239888 +0000 UTC m=+1.821451709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.533143 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb0d7668ce openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.660413134 +0000 UTC m=+1.823624945,LastTimestamp:2026-03-13 15:02:51.660413134 +0000 UTC m=+1.823624945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.539896 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb0da25054 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.663290452 +0000 UTC m=+1.826502273,LastTimestamp:2026-03-13 15:02:51.663290452 +0000 UTC m=+1.826502273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.547530 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb0daf8324 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.664155428 +0000 UTC m=+1.827367239,LastTimestamp:2026-03-13 15:02:51.664155428 +0000 UTC m=+1.827367239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.555308 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb0dbe4e2c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.665124908 +0000 UTC m=+1.828336729,LastTimestamp:2026-03-13 15:02:51.665124908 +0000 UTC m=+1.828336729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.563670 4786 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb1cad239a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.915658138 +0000 UTC m=+2.078869989,LastTimestamp:2026-03-13 15:02:51.915658138 +0000 UTC m=+2.078869989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.570698 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb1d7d742e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.929310254 +0000 UTC m=+2.092522105,LastTimestamp:2026-03-13 15:02:51.929310254 +0000 UTC m=+2.092522105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.577312 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb1d9317f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.930728437 +0000 UTC m=+2.093940268,LastTimestamp:2026-03-13 15:02:51.930728437 +0000 UTC m=+2.093940268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.586726 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb2af6389c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.155328668 +0000 UTC m=+2.318540509,LastTimestamp:2026-03-13 15:02:52.155328668 +0000 UTC m=+2.318540509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.594760 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb2be19102 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.170752258 +0000 UTC m=+2.333964099,LastTimestamp:2026-03-13 15:02:52.170752258 +0000 UTC m=+2.333964099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.602017 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb2bf8196d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.172228973 +0000 UTC m=+2.335440854,LastTimestamp:2026-03-13 15:02:52.172228973 +0000 UTC m=+2.335440854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.610036 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb3b1715a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.425917859 +0000 UTC m=+2.589129700,LastTimestamp:2026-03-13 15:02:52.425917859 +0000 UTC 
m=+2.589129700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.618128 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb3bf99f16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.440764182 +0000 UTC m=+2.603976033,LastTimestamp:2026-03-13 15:02:52.440764182 +0000 UTC m=+2.603976033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.625665 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6ecb43a55371 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.569457521 +0000 UTC m=+2.732669372,LastTimestamp:2026-03-13 15:02:52.569457521 +0000 UTC m=+2.732669372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.633490 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb43c7fbf0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.57172888 +0000 UTC m=+2.734940731,LastTimestamp:2026-03-13 15:02:52.57172888 +0000 UTC m=+2.734940731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.640697 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb444757a1 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.580075425 +0000 UTC m=+2.743287266,LastTimestamp:2026-03-13 15:02:52.580075425 +0000 UTC m=+2.743287266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.647757 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb44ba3e5a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.587605594 +0000 UTC m=+2.750817405,LastTimestamp:2026-03-13 15:02:52.587605594 +0000 UTC m=+2.750817405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.655492 4786 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb50b79e60 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.78876016 +0000 UTC m=+2.951971981,LastTimestamp:2026-03-13 15:02:52.78876016 +0000 UTC m=+2.951971981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.661589 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6ecb50c9bb83 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.789947267 +0000 UTC m=+2.953159078,LastTimestamp:2026-03-13 15:02:52.789947267 +0000 UTC m=+2.953159078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.668334 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb50fb4cba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.793195706 +0000 UTC m=+2.956407527,LastTimestamp:2026-03-13 15:02:52.793195706 +0000 UTC m=+2.956407527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.671911 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb512e8a82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.796553858 +0000 UTC m=+2.959765679,LastTimestamp:2026-03-13 15:02:52.796553858 +0000 UTC m=+2.959765679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.673611 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb513b4155 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.797387093 +0000 UTC m=+2.960598904,LastTimestamp:2026-03-13 15:02:52.797387093 +0000 UTC m=+2.960598904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.678966 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb5149c706 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.798338822 +0000 UTC m=+2.961550653,LastTimestamp:2026-03-13 15:02:52.798338822 +0000 UTC m=+2.961550653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.685382 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6ecb516889ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.800354794 +0000 UTC m=+2.963566615,LastTimestamp:2026-03-13 15:02:52.800354794 +0000 UTC m=+2.963566615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: I0313 15:03:24.691661 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.692120 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb51d1f491 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.807263377 +0000 UTC m=+2.970475208,LastTimestamp:2026-03-13 15:02:52.807263377 +0000 UTC m=+2.970475208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.692391 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 15:03:24 crc kubenswrapper[4786]: I0313 15:03:24.693133 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:24 crc kubenswrapper[4786]: I0313 15:03:24.693342 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:24 crc kubenswrapper[4786]: I0313 15:03:24.693484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:24 crc kubenswrapper[4786]: I0313 15:03:24.693638 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.697900 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb51e3251e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.808389918 +0000 UTC m=+2.971601729,LastTimestamp:2026-03-13 15:02:52.808389918 +0000 UTC m=+2.971601729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.698659 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.700393 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb52c3a6f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:52.823103218 +0000 UTC m=+2.986315029,LastTimestamp:2026-03-13 15:02:52.823103218 +0000 UTC m=+2.986315029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.706417 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb602801c5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.047783877 +0000 UTC m=+3.210995708,LastTimestamp:2026-03-13 15:02:53.047783877 +0000 UTC m=+3.210995708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.712350 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb60553814 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.0507469 +0000 UTC m=+3.213958721,LastTimestamp:2026-03-13 
15:02:53.0507469 +0000 UTC m=+3.213958721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.717816 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb60e9984b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.060470859 +0000 UTC m=+3.223682670,LastTimestamp:2026-03-13 15:02:53.060470859 +0000 UTC m=+3.223682670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.724331 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb60ff3712 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.061887762 +0000 UTC m=+3.225099593,LastTimestamp:2026-03-13 15:02:53.061887762 +0000 UTC m=+3.225099593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.728993 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb611708e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.063448802 +0000 UTC m=+3.226660613,LastTimestamp:2026-03-13 15:02:53.063448802 +0000 UTC m=+3.226660613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.733288 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb612acc70 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.064744048 +0000 UTC m=+3.227955869,LastTimestamp:2026-03-13 15:02:53.064744048 +0000 UTC m=+3.227955869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.738445 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb6e125f35 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.281247029 +0000 UTC m=+3.444458840,LastTimestamp:2026-03-13 15:02:53.281247029 +0000 UTC m=+3.444458840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.751374 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb6e1edf02 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.282066178 +0000 UTC m=+3.445277989,LastTimestamp:2026-03-13 15:02:53.282066178 +0000 UTC m=+3.445277989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.758136 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb6ef90ba0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.296364448 +0000 UTC m=+3.459576259,LastTimestamp:2026-03-13 15:02:53.296364448 +0000 UTC m=+3.459576259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.773348 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6ecb6f090571 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.297411441 +0000 UTC m=+3.460623252,LastTimestamp:2026-03-13 15:02:53.297411441 +0000 UTC m=+3.460623252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.778071 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb6f100437 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 
15:02:53.297869879 +0000 UTC m=+3.461081690,LastTimestamp:2026-03-13 15:02:53.297869879 +0000 UTC m=+3.461081690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.780666 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb787d6e79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.456035449 +0000 UTC m=+3.619247280,LastTimestamp:2026-03-13 15:02:53.456035449 +0000 UTC m=+3.619247280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.784515 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb79084080 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.465133184 +0000 UTC m=+3.628344995,LastTimestamp:2026-03-13 15:02:53.465133184 +0000 UTC m=+3.628344995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.789899 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb7916ce61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.466087009 +0000 UTC m=+3.629298830,LastTimestamp:2026-03-13 15:02:53.466087009 +0000 UTC m=+3.629298830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.796704 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb81324483 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.602104451 +0000 UTC m=+3.765316312,LastTimestamp:2026-03-13 15:02:53.602104451 +0000 UTC m=+3.765316312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.801421 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb84765312 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.656896274 +0000 UTC m=+3.820108085,LastTimestamp:2026-03-13 15:02:53.656896274 +0000 UTC m=+3.820108085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.805655 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb85232c79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.668224121 +0000 UTC m=+3.831435932,LastTimestamp:2026-03-13 15:02:53.668224121 +0000 UTC m=+3.831435932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.812123 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb8f5449d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.839215059 +0000 UTC m=+4.002426870,LastTimestamp:2026-03-13 15:02:53.839215059 +0000 UTC m=+4.002426870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.817101 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecb90349097 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.853913239 +0000 UTC m=+4.017125050,LastTimestamp:2026-03-13 15:02:53.853913239 +0000 UTC m=+4.017125050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.824192 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbbe013ad6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:54.622300886 +0000 UTC m=+4.785512697,LastTimestamp:2026-03-13 15:02:54.622300886 +0000 UTC m=+4.785512697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.830449 4786 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbcb4f5966 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:54.845524326 +0000 UTC m=+5.008736137,LastTimestamp:2026-03-13 15:02:54.845524326 +0000 UTC m=+5.008736137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.836333 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbcbf5c2e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:54.856430304 +0000 UTC m=+5.019642155,LastTimestamp:2026-03-13 15:02:54.856430304 +0000 UTC m=+5.019642155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.840976 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbcc10321e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:54.858162718 +0000 UTC m=+5.021374569,LastTimestamp:2026-03-13 15:02:54.858162718 +0000 UTC m=+5.021374569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.846562 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbd784ad5c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.05034582 +0000 UTC m=+5.213557631,LastTimestamp:2026-03-13 15:02:55.05034582 +0000 UTC m=+5.213557631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.853214 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbd82e26e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.061452513 +0000 UTC m=+5.224664364,LastTimestamp:2026-03-13 15:02:55.061452513 +0000 UTC m=+5.224664364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.859965 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbd83d62b4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.062450868 +0000 UTC m=+5.225662689,LastTimestamp:2026-03-13 15:02:55.062450868 +0000 UTC m=+5.225662689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.865940 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbe6779df0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.301148144 +0000 UTC m=+5.464360005,LastTimestamp:2026-03-13 15:02:55.301148144 +0000 UTC m=+5.464360005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.870259 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbe7a95cfe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.321185534 +0000 UTC m=+5.484397385,LastTimestamp:2026-03-13 15:02:55.321185534 +0000 UTC m=+5.484397385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.875316 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbe7bc2e7f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.322418815 +0000 UTC m=+5.485630636,LastTimestamp:2026-03-13 15:02:55.322418815 +0000 UTC m=+5.485630636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.881947 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbf5d7e83c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.55911686 +0000 UTC m=+5.722328681,LastTimestamp:2026-03-13 15:02:55.55911686 +0000 UTC m=+5.722328681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.888844 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbf67bf9e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.569869284 +0000 UTC m=+5.733081105,LastTimestamp:2026-03-13 15:02:55.569869284 +0000 UTC m=+5.733081105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.894034 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecbf68a1793 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.570794387 +0000 UTC m=+5.734006208,LastTimestamp:2026-03-13 15:02:55.570794387 +0000 UTC m=+5.734006208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.898960 4786 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecc02f4e09c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.77911926 +0000 UTC m=+5.942331081,LastTimestamp:2026-03-13 15:02:55.77911926 +0000 UTC m=+5.942331081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.905350 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6ecc042bb75f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:55.799490399 +0000 UTC m=+5.962702250,LastTimestamp:2026-03-13 15:02:55.799490399 +0000 UTC m=+5.962702250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.917276 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 15:03:24 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42fe91d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 15:03:24 crc kubenswrapper[4786]: body: Mar 13 15:03:24 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853496277 +0000 UTC m=+7.016708118,LastTimestamp:2026-03-13 15:02:56.853496277 +0000 UTC m=+7.016708118,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:03:24 crc kubenswrapper[4786]: > Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.921832 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42ffa9f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853567989 +0000 UTC m=+7.016779840,LastTimestamp:2026-03-13 15:02:56.853567989 +0000 UTC m=+7.016779840,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.929842 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 15:03:24 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-apiserver-crc.189c6ecdfd7f5810 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 15:03:24 crc kubenswrapper[4786]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 15:03:24 crc kubenswrapper[4786]: Mar 13 15:03:24 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:03:04.277465104 +0000 UTC m=+14.440676955,LastTimestamp:2026-03-13 15:03:04.277465104 +0000 UTC m=+14.440676955,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:03:24 crc kubenswrapper[4786]: > Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.935305 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecdfd80c134 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:03:04.277557556 +0000 UTC m=+14.440769407,LastTimestamp:2026-03-13 15:03:04.277557556 +0000 UTC m=+14.440769407,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.942956 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6ecdfd7f5810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 15:03:24 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-apiserver-crc.189c6ecdfd7f5810 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 15:03:24 crc kubenswrapper[4786]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 15:03:24 crc kubenswrapper[4786]: Mar 13 15:03:24 crc 
kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:03:04.277465104 +0000 UTC m=+14.440676955,LastTimestamp:2026-03-13 15:03:04.282604464 +0000 UTC m=+14.445816295,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:03:24 crc kubenswrapper[4786]: > Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.952402 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6ecdfd80c134\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecdfd80c134 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:03:04.277557556 +0000 UTC m=+14.440769407,LastTimestamp:2026-03-13 15:03:04.282652966 +0000 UTC m=+14.445864797,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.958254 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6ecb7916ce61\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb7916ce61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.466087009 +0000 UTC m=+3.629298830,LastTimestamp:2026-03-13 15:03:04.672338625 +0000 UTC m=+14.835550426,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.965378 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6ecb84765312\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb84765312 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.656896274 +0000 UTC m=+3.820108085,LastTimestamp:2026-03-13 15:03:04.889670867 +0000 UTC m=+15.052882688,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.970340 4786 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189c6ecb85232c79\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6ecb85232c79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:53.668224121 +0000 UTC m=+3.831435932,LastTimestamp:2026-03-13 15:03:04.899258301 +0000 UTC m=+15.062470122,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.978130 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42fe91d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 15:03:24 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42fe91d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 15:03:24 crc kubenswrapper[4786]: body: Mar 13 15:03:24 crc 
kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853496277 +0000 UTC m=+7.016708118,LastTimestamp:2026-03-13 15:03:06.853996806 +0000 UTC m=+17.017208647,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:03:24 crc kubenswrapper[4786]: > Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.982872 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42ffa9f5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42ffa9f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853567989 +0000 UTC m=+7.016779840,LastTimestamp:2026-03-13 15:03:06.854084528 +0000 UTC m=+17.017296369,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.992185 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42fe91d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< 
Mar 13 15:03:24 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42fe91d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 15:03:24 crc kubenswrapper[4786]: body: Mar 13 15:03:24 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853496277 +0000 UTC m=+7.016708118,LastTimestamp:2026-03-13 15:03:16.854515295 +0000 UTC m=+27.017727146,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:03:24 crc kubenswrapper[4786]: > Mar 13 15:03:24 crc kubenswrapper[4786]: E0313 15:03:24.997545 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42ffa9f5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42ffa9f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853567989 +0000 UTC m=+7.016779840,LastTimestamp:2026-03-13 15:03:16.854584297 +0000 UTC m=+27.017796178,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:25 crc kubenswrapper[4786]: E0313 15:03:25.002644 4786 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ed0eb548df5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:03:16.857572853 +0000 UTC m=+27.020784724,LastTimestamp:2026-03-13 15:03:16.857572853 +0000 UTC m=+27.020784724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:25 crc kubenswrapper[4786]: E0313 15:03:25.008594 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecb0dbe4e2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb0dbe4e2c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.665124908 +0000 UTC m=+1.828336729,LastTimestamp:2026-03-13 15:03:16.978122752 +0000 UTC m=+27.141334603,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:25 crc kubenswrapper[4786]: E0313 15:03:25.014295 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecb1cad239a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb1cad239a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.915658138 +0000 UTC m=+2.078869989,LastTimestamp:2026-03-13 15:03:17.232151397 +0000 UTC m=+27.395363238,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:25 crc kubenswrapper[4786]: E0313 15:03:25.019168 4786 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189c6ecb1d7d742e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecb1d7d742e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:51.929310254 +0000 UTC m=+2.092522105,LastTimestamp:2026-03-13 15:03:17.243753463 +0000 UTC m=+27.406965314,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:25 crc kubenswrapper[4786]: I0313 15:03:25.485687 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:26 crc kubenswrapper[4786]: I0313 15:03:26.483696 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:26 crc kubenswrapper[4786]: I0313 15:03:26.854019 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Mar 13 15:03:26 crc kubenswrapper[4786]: I0313 15:03:26.854103 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:03:26 crc kubenswrapper[4786]: E0313 15:03:26.859432 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42fe91d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 15:03:26 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42fe91d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 15:03:26 crc kubenswrapper[4786]: body: Mar 13 15:03:26 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853496277 +0000 UTC m=+7.016708118,LastTimestamp:2026-03-13 15:03:26.854075509 +0000 UTC m=+37.017287360,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:03:26 crc kubenswrapper[4786]: > Mar 13 15:03:26 crc kubenswrapper[4786]: E0313 15:03:26.866692 
4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42ffa9f5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42ffa9f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853567989 +0000 UTC m=+7.016779840,LastTimestamp:2026-03-13 15:03:26.854135941 +0000 UTC m=+37.017347792,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 15:03:27 crc kubenswrapper[4786]: I0313 15:03:27.486575 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:28 crc kubenswrapper[4786]: I0313 15:03:28.485990 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:29 crc kubenswrapper[4786]: I0313 15:03:29.217125 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 15:03:29 crc 
kubenswrapper[4786]: I0313 15:03:29.234630 4786 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 15:03:29 crc kubenswrapper[4786]: I0313 15:03:29.485929 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:30 crc kubenswrapper[4786]: I0313 15:03:30.486165 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:30 crc kubenswrapper[4786]: I0313 15:03:30.551687 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:30 crc kubenswrapper[4786]: I0313 15:03:30.554040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:30 crc kubenswrapper[4786]: I0313 15:03:30.554145 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:30 crc kubenswrapper[4786]: I0313 15:03:30.554175 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:30 crc kubenswrapper[4786]: I0313 15:03:30.555470 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077" Mar 13 15:03:30 crc kubenswrapper[4786]: E0313 15:03:30.615480 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.486541 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.698813 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:31 crc kubenswrapper[4786]: E0313 15:03:31.700327 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.700348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.700428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.700455 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.700499 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 15:03:31 crc kubenswrapper[4786]: E0313 15:03:31.707964 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.759966 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.760695 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.763299 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9" exitCode=255 Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.763357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9"} Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.763439 4786 scope.go:117] "RemoveContainer" containerID="ab5811146ed8c0e4e83df66e9a39c8b498db3ac963b285a4731a2727dbd23077" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.763622 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.764936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.764993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.765022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:31 crc kubenswrapper[4786]: I0313 15:03:31.765942 4786 scope.go:117] "RemoveContainer" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9" Mar 13 15:03:31 crc kubenswrapper[4786]: E0313 15:03:31.766216 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 15:03:32 crc kubenswrapper[4786]: I0313 15:03:32.485793 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:32 crc kubenswrapper[4786]: I0313 15:03:32.768068 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 15:03:33 crc kubenswrapper[4786]: I0313 15:03:33.487608 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:34 crc kubenswrapper[4786]: W0313 15:03:34.195439 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 15:03:34 crc kubenswrapper[4786]: E0313 15:03:34.195520 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 15:03:34 crc kubenswrapper[4786]: I0313 15:03:34.485618 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.478181 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.478384 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.479803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.479849 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.479879 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.480393 4786 scope.go:117] "RemoveContainer" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9"
Mar 13 15:03:35 crc kubenswrapper[4786]: E0313 15:03:35.480560 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:35 crc kubenswrapper[4786]: I0313 15:03:35.487009 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.355721 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.356040 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.357943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.358004 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.358024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.359097 4786 scope.go:117] "RemoveContainer" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9"
Mar 13 15:03:36 crc kubenswrapper[4786]: E0313 15:03:36.359385 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.485484 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.853363 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:03:36 crc kubenswrapper[4786]: I0313 15:03:36.853456 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:03:36 crc kubenswrapper[4786]: E0313 15:03:36.861460 4786 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6ecc42fe91d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 13 15:03:36 crc kubenswrapper[4786]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6ecc42fe91d5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 13 15:03:36 crc kubenswrapper[4786]: body:
Mar 13 15:03:36 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:02:56.853496277 +0000 UTC m=+7.016708118,LastTimestamp:2026-03-13 15:03:36.853418969 +0000 UTC m=+47.016630820,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 13 15:03:36 crc kubenswrapper[4786]: >
Mar 13 15:03:37 crc kubenswrapper[4786]: I0313 15:03:37.486162 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:37 crc kubenswrapper[4786]: W0313 15:03:37.922430 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 13 15:03:37 crc kubenswrapper[4786]: E0313 15:03:37.922524 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 13 15:03:38 crc kubenswrapper[4786]: I0313 15:03:38.487037 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:38 crc kubenswrapper[4786]: I0313 15:03:38.708063 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:38 crc kubenswrapper[4786]: E0313 15:03:38.708106 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 15:03:38 crc kubenswrapper[4786]: I0313 15:03:38.709807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:38 crc kubenswrapper[4786]: I0313 15:03:38.710031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:38 crc kubenswrapper[4786]: I0313 15:03:38.710163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:38 crc kubenswrapper[4786]: I0313 15:03:38.710312 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:03:38 crc kubenswrapper[4786]: E0313 15:03:38.717507 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 13 15:03:39 crc kubenswrapper[4786]: I0313 15:03:39.485684 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:39 crc kubenswrapper[4786]: W0313 15:03:39.487198 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 13 15:03:39 crc kubenswrapper[4786]: E0313 15:03:39.488034 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 15:03:40 crc kubenswrapper[4786]: I0313 15:03:40.486470 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:40 crc kubenswrapper[4786]: E0313 15:03:40.615615 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 15:03:40 crc kubenswrapper[4786]: W0313 15:03:40.617332 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 13 15:03:40 crc kubenswrapper[4786]: E0313 15:03:40.617431 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 15:03:41 crc kubenswrapper[4786]: I0313 15:03:41.484367 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:42 crc kubenswrapper[4786]: I0313 15:03:42.485526 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:43 crc kubenswrapper[4786]: I0313 15:03:43.399222 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 15:03:43 crc kubenswrapper[4786]: I0313 15:03:43.399426 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:43 crc kubenswrapper[4786]: I0313 15:03:43.401024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:43 crc kubenswrapper[4786]: I0313 15:03:43.401073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:43 crc kubenswrapper[4786]: I0313 15:03:43.401090 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:43 crc kubenswrapper[4786]: I0313 15:03:43.483509 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:44 crc kubenswrapper[4786]: I0313 15:03:44.486179 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:45 crc kubenswrapper[4786]: I0313 15:03:45.484141 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:45 crc kubenswrapper[4786]: E0313 15:03:45.714566 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 15:03:45 crc kubenswrapper[4786]: I0313 15:03:45.718592 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:45 crc kubenswrapper[4786]: I0313 15:03:45.720001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:45 crc kubenswrapper[4786]: I0313 15:03:45.720066 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:45 crc kubenswrapper[4786]: I0313 15:03:45.720089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:45 crc kubenswrapper[4786]: I0313 15:03:45.720130 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:03:45 crc kubenswrapper[4786]: E0313 15:03:45.726584 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.485614 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.852661 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.852730 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.852788 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.852970 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.854139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.854195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.854207 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.854703 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"34c2caa92970a47f32a973365c7f578f08cb80218cdd3dc3abfbbfcef84cc94a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 13 15:03:46 crc kubenswrapper[4786]: I0313 15:03:46.854813 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://34c2caa92970a47f32a973365c7f578f08cb80218cdd3dc3abfbbfcef84cc94a" gracePeriod=30
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.487546 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.819338 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.820494 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.821308 4786 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="34c2caa92970a47f32a973365c7f578f08cb80218cdd3dc3abfbbfcef84cc94a" exitCode=255
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.821351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"34c2caa92970a47f32a973365c7f578f08cb80218cdd3dc3abfbbfcef84cc94a"}
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.821386 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0b50db9765db67b716e77b69c120deb3073a24b91ce5c6bb6f1f4510338e5d7"}
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.821406 4786 scope.go:117] "RemoveContainer" containerID="f3794d66cc8db9fcb32ef9b7bf307705985824b83d4eed2e2dad00bf1c4eb1de"
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.821573 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.823140 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.823201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:47 crc kubenswrapper[4786]: I0313 15:03:47.823225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:48 crc kubenswrapper[4786]: I0313 15:03:48.485232 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:48 crc kubenswrapper[4786]: I0313 15:03:48.826279 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 15:03:49 crc kubenswrapper[4786]: I0313 15:03:49.486089 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:49 crc kubenswrapper[4786]: I0313 15:03:49.552109 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:49 crc kubenswrapper[4786]: I0313 15:03:49.553852 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:49 crc kubenswrapper[4786]: I0313 15:03:49.553950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:49 crc kubenswrapper[4786]: I0313 15:03:49.553975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:49 crc kubenswrapper[4786]: I0313 15:03:49.554932 4786 scope.go:117] "RemoveContainer" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9"
Mar 13 15:03:49 crc kubenswrapper[4786]: E0313 15:03:49.555239 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:03:50 crc kubenswrapper[4786]: I0313 15:03:50.485396 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:50 crc kubenswrapper[4786]: E0313 15:03:50.615734 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 15:03:51 crc kubenswrapper[4786]: I0313 15:03:51.482865 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:51 crc kubenswrapper[4786]: I0313 15:03:51.993017 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:03:51 crc kubenswrapper[4786]: I0313 15:03:51.993271 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:51 crc kubenswrapper[4786]: I0313 15:03:51.995356 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:51 crc kubenswrapper[4786]: I0313 15:03:51.995725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:51 crc kubenswrapper[4786]: I0313 15:03:51.995744 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:52 crc kubenswrapper[4786]: I0313 15:03:52.485021 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:52 crc kubenswrapper[4786]: E0313 15:03:52.722775 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 15:03:52 crc kubenswrapper[4786]: I0313 15:03:52.727807 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:52 crc kubenswrapper[4786]: I0313 15:03:52.729704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:52 crc kubenswrapper[4786]: I0313 15:03:52.729748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:52 crc kubenswrapper[4786]: I0313 15:03:52.729759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:52 crc kubenswrapper[4786]: I0313 15:03:52.729787 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:03:52 crc kubenswrapper[4786]: E0313 15:03:52.736845 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.483883 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.853556 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.853712 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.854777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.854824 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.854837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:53 crc kubenswrapper[4786]: I0313 15:03:53.858010 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:03:54 crc kubenswrapper[4786]: I0313 15:03:54.485029 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:54 crc kubenswrapper[4786]: I0313 15:03:54.844244 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:54 crc kubenswrapper[4786]: I0313 15:03:54.845252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:54 crc kubenswrapper[4786]: I0313 15:03:54.845382 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:54 crc kubenswrapper[4786]: I0313 15:03:54.845469 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:55 crc kubenswrapper[4786]: I0313 15:03:55.037574 4786 csr.go:261] certificate signing request csr-zkrs9 is approved, waiting to be issued
Mar 13 15:03:55 crc kubenswrapper[4786]: I0313 15:03:55.487619 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 15:03:56 crc kubenswrapper[4786]: I0313 15:03:56.050985 4786 csr.go:257] certificate signing request csr-zkrs9 is issued
Mar 13 15:03:56 crc kubenswrapper[4786]: I0313 15:03:56.065613 4786 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 13 15:03:56 crc kubenswrapper[4786]: I0313 15:03:56.312283 4786 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 13 15:03:57 crc kubenswrapper[4786]: I0313 15:03:57.053092 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-09 11:13:49.39123096 +0000 UTC
Mar 13 15:03:57 crc kubenswrapper[4786]: I0313 15:03:57.053131 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7244h9m52.338102457s for next certificate rotation
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.737752 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.738715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.738740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.738750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.738825 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.745103 4786 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.745319 4786 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.745334 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.747552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.747570 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.747578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.747605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.747615 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:03:59Z","lastTransitionTime":"2026-03-13T15:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.761491 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.769344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.769391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.769403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.769420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.769432 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:03:59Z","lastTransitionTime":"2026-03-13T15:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.779875 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.787517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.787566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.787579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.787597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.787610 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:03:59Z","lastTransitionTime":"2026-03-13T15:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.798338 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.806069 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.806126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.806138 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.806159 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:03:59 crc kubenswrapper[4786]: I0313 15:03:59.806173 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:03:59Z","lastTransitionTime":"2026-03-13T15:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.819144 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.819294 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.819326 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:03:59 crc kubenswrapper[4786]: E0313 15:03:59.919721 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.020756 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.121880 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.222802 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.323422 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.423672 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.524571 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.616675 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: 
E0313 15:04:00.625380 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.725650 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.826753 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:00 crc kubenswrapper[4786]: E0313 15:04:00.927188 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.027629 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.128088 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.228723 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.329556 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.430216 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.531061 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.551568 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.552764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.552806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.552822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.553636 4786 scope.go:117] "RemoveContainer" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.632030 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.732756 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.833443 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.861843 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.863640 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29"} Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.863809 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.871091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.871275 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.871364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:01 crc kubenswrapper[4786]: E0313 15:04:01.933619 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.998915 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:04:01 crc kubenswrapper[4786]: I0313 15:04:01.999174 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.000190 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.000213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.000220 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.033957 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.134779 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.235922 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.336588 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.437504 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.538080 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.639100 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.739769 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.840828 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.868333 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.868903 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.871505 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29" exitCode=255 Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.871590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29"} Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.871659 4786 scope.go:117] 
"RemoveContainer" containerID="3b11b9abbb0f7a4aebc4513b896d83c8b1291c2d03524b60aaf8d7349b9370d9" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.871808 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.872972 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.873017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.873033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:02 crc kubenswrapper[4786]: I0313 15:04:02.873816 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.874108 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 15:04:02 crc kubenswrapper[4786]: E0313 15:04:02.941410 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.041508 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.141650 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.242576 
4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.343667 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.444005 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.544433 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.644918 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.745316 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.845914 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:03 crc kubenswrapper[4786]: I0313 15:04:03.877389 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 15:04:03 crc kubenswrapper[4786]: E0313 15:04:03.947080 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.047816 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: I0313 15:04:04.088615 4786 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.148048 4786 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.248567 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.349159 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.449392 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.550218 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.651020 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.752167 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.852722 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:04 crc kubenswrapper[4786]: E0313 15:04:04.953303 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.054033 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.154186 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.254643 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.355380 4786 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.455577 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:05 crc kubenswrapper[4786]: I0313 15:04:05.478095 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:04:05 crc kubenswrapper[4786]: I0313 15:04:05.478334 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:04:05 crc kubenswrapper[4786]: I0313 15:04:05.479824 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:05 crc kubenswrapper[4786]: I0313 15:04:05.479888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:05 crc kubenswrapper[4786]: I0313 15:04:05.479900 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:05 crc kubenswrapper[4786]: I0313 15:04:05.480454 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.480608 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.556060 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.657177 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.757755 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.858633 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:05 crc kubenswrapper[4786]: E0313 15:04:05.959777 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.060079 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.161093 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.262048 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.355231 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.355411 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.356613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.356649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.356661 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.357281 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.357476 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.362481 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.463626 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.563744 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.664039 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: I0313 15:04:06.738331 4786 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.764359 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.865218 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:06 crc kubenswrapper[4786]: E0313 15:04:06.966329 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.067521 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.168665 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.171146 4786 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.271892 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.271943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.271959 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.271981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.271998 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.375296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.375355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.375395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.375425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.375447 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.478581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.478642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.478658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.478683 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.478718 4786 apiserver.go:52] "Watching apiserver"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.478703 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.485654 4786 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.486290 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mvpcz","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-wp7vg","openshift-image-registry/node-ca-99jzx","openshift-machine-config-operator/machine-config-daemon-zqb49","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-7b6g9","openshift-multus/multus-additional-cni-plugins-pstw7","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.486691 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.486914 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.487023 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.487258 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.487320 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.487391 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.487477 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.487595 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.487672 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-99jzx"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.487716 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mvpcz"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.487946 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wp7vg"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.488177 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.488603 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.489364 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pstw7"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.490330 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.491276 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.492825 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494271 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494395 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494477 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494669 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494698 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494719 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494752 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494766 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.494910 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495143 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495170 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495180 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495238 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495275 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495278 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495313 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495331 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.499911 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495348 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495377 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495429 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495427 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495434 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495485 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.495562 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.500060 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.500315 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.501887 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.502133 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.502147 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.504073 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.504317 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313
15:04:07.504423 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.523597 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.548415 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.562520 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.576209 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.581323 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.581440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.581449 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.581464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.581474 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.583484 4786 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.587525 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.603345 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.613006 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.626608 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637660 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637737 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637761 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637782 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.637802 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637822 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637843 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637885 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637906 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637928 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637948 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.637969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638001 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638021 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638046 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638071 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638094 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638114 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638135 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638157 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638181 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638202 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638221 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638259 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638279 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638301 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638323 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638343 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638385 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638405 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638451 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638471 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.638492 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638512 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638531 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638573 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638611 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638660 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638681 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638702 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638722 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638743 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638762 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638783 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638826 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638847 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638884 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638908 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638931 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638951 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638971 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638995 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639016 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639039 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639059 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639124 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639147 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639253 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639305 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639340 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639391 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639440 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639490 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639536 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639560 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639581 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639603 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639623 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639646 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639671 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639693 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639714 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639736 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639758 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639781 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639809 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639840 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639901 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639937 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639968 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640001 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640034 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640067 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640133 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640252 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640283 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640315 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640339 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640365 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640394 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640426 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640497 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.640529 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640567 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640597 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640620 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640644 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640665 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640686 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640739 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640768 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640801 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.640831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640885 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640924 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640959 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641002 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641035 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641064 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641132 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641206 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641239 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.641269 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641298 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641324 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641350 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641411 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641440 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641500 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641534 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641567 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641605 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641671 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641709 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641734 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641757 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641780 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641826 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641850 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641905 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641929 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641953 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641978 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642027 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642059 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") 
" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642083 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642106 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642126 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642161 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642206 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642264 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642286 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642308 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642330 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642351 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642372 
4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642394 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642545 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642594 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642630 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642664 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642697 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642728 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.642769 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642802 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642833 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642895 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642931 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642960 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643071 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643108 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643134 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643157 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643184 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643214 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643249 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643347 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643402 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 
13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsnc\" (UniqueName: \"kubernetes.io/projected/f5037a43-3be3-4ba3-bed7-1ef82690e33e-kube-api-access-dzsnc\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643526 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-log-socket\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643560 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-os-release\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643626 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44x4\" (UniqueName: \"kubernetes.io/projected/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-kube-api-access-c44x4\") pod \"machine-config-daemon-zqb49\" (UID: 
\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-slash\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-var-lib-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643801 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-cni-bin\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-cni-multus\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643890 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-kubelet\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-env-overrides\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643964 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644001 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644035 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/930a5a92-be71-4866-aa6f-95a98647bc33-multus-daemon-config\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5037a43-3be3-4ba3-bed7-1ef82690e33e-host\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644107 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-system-cni-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644212 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-socket-dir-parent\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644248 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644284 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-system-cni-dir\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644317 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b5e37b9c-1965-4321-a9fc-6babbd05c395-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644356 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644386 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-conf-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-netd\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644487 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/930a5a92-be71-4866-aa6f-95a98647bc33-cni-binary-copy\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644519 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-rootfs\") pod \"machine-config-daemon-zqb49\" (UID: 
\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-os-release\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644583 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-hostroot\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644643 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77lq5\" (UniqueName: \"kubernetes.io/projected/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-kube-api-access-77lq5\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644679 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-cnibin\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644708 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-cnibin\") pod \"multus-mvpcz\" (UID: 
\"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-systemd-units\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644762 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-etc-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trc5m\" (UniqueName: \"kubernetes.io/projected/930a5a92-be71-4866-aa6f-95a98647bc33-kube-api-access-trc5m\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644824 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-systemd\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644889 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovn-node-metrics-cert\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644966 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645000 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645036 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-kubelet\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-ovn\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645092 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5037a43-3be3-4ba3-bed7-1ef82690e33e-serviceca\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-node-log\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645171 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5e37b9c-1965-4321-a9fc-6babbd05c395-cni-binary-copy\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645210 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-mcd-auth-proxy-config\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645241 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-cni-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645271 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-multus-certs\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644633 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645303 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-script-lib\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645337 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-k8s-cni-cncf-io\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645366 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-netns\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645394 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-bin\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645424 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ab1ab68-cf57-443b-aa31-ac336c1d86ce-hosts-file\") pod \"node-resolver-wp7vg\" (UID: \"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\") " pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645456 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-282hv\" (UniqueName: \"kubernetes.io/projected/1ab1ab68-cf57-443b-aa31-ac336c1d86ce-kube-api-access-282hv\") pod \"node-resolver-wp7vg\" (UID: \"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\") " pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldhl\" (UniqueName: 
\"kubernetes.io/projected/b5e37b9c-1965-4321-a9fc-6babbd05c395-kube-api-access-gldhl\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645524 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645561 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645593 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-config\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-proxy-tls\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645660 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-netns\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645691 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-etc-kubernetes\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645718 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645771 4786 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.647395 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.648185 4786 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638312 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638659 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.650078 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.638828 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639259 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639329 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639418 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639533 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639608 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639826 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639834 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.639964 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640028 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640126 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640313 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640534 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640599 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640567 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640820 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640841 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.640984 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641039 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641088 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641164 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641387 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641512 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641536 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641663 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641708 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641751 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.641942 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642226 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642517 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642545 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642570 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642625 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642648 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.642799 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643433 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643465 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643717 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643760 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643760 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.643794 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644021 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644120 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644232 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644304 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644537 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644743 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644781 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644807 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.644850 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645030 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645610 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645814 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645948 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.645962 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.646015 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.646028 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.646188 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.646218 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.646554 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.646749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.646819 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.647063 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.647244 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.649618 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.649897 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.650085 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.650262 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.650556 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.650514 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.652280 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.653698 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.654106 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:08.154000275 +0000 UTC m=+78.317212126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.654723 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.654900 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:08.154830296 +0000 UTC m=+78.318042197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.652966 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.653076 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.655019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.655471 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.655481 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.655767 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.656158 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.659391 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.659482 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.659917 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.660293 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.660543 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.660740 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.660758 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.660919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.660953 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661191 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661219 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661224 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661264 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661287 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661430 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.661843 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.662670 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.663044 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.663392 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.664137 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.664145 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.664323 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.664948 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.664979 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.664983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.665190 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:08.165116498 +0000 UTC m=+78.328328359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.665282 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.665568 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.665727 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.666171 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.666199 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.666402 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.666419 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.666648 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.667288 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.667590 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.668102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.668137 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.668651 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.668921 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669081 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669341 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669374 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.669566 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:04:08.169539366 +0000 UTC m=+78.332751427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669579 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669656 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669693 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669720 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.669975 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.670111 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.670156 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.670297 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.670550 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.670803 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.671188 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.670749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.671505 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.671800 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.671998 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.672464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.672495 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.673284 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.673653 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.673925 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.674034 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.674286 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.674722 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.675559 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.675726 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.678783 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.680166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.680998 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.681265 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.681526 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.681815 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.681889 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.681914 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:07 crc kubenswrapper[4786]: E0313 15:04:07.682087 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:08.182003531 +0000 UTC m=+78.345215412 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.684466 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.686378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.686411 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.686422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.686441 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.686454 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.686623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.688096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.689251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.690763 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691029 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691043 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691216 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691515 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691634 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691707 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.691996 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.692476 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.692550 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.692512 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.693287 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.693921 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.694023 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.694092 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.695845 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.696012 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.696031 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.696816 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.696932 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.697134 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.697551 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.698247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.698256 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.698552 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.698719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.699002 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.699037 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.698983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.699958 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.700712 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.703244 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.706262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.711329 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.711573 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.713654 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.714923 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.715057 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.722044 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.722684 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.723541 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.727570 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746265 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-k8s-cni-cncf-io\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-netns\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746339 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-bin\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ab1ab68-cf57-443b-aa31-ac336c1d86ce-hosts-file\") pod \"node-resolver-wp7vg\" (UID: \"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\") " 
pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746389 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-282hv\" (UniqueName: \"kubernetes.io/projected/1ab1ab68-cf57-443b-aa31-ac336c1d86ce-kube-api-access-282hv\") pod \"node-resolver-wp7vg\" (UID: \"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\") " pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746414 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gldhl\" (UniqueName: \"kubernetes.io/projected/b5e37b9c-1965-4321-a9fc-6babbd05c395-kube-api-access-gldhl\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746439 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-config\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746487 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-proxy-tls\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746472 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-k8s-cni-cncf-io\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746508 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-netns\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746509 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-netns\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746583 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-bin\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746559 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-netns\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746582 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-etc-kubernetes\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746553 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-etc-kubernetes\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746732 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ab1ab68-cf57-443b-aa31-ac336c1d86ce-hosts-file\") pod \"node-resolver-wp7vg\" (UID: \"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\") " pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.746772 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747141 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsnc\" (UniqueName: \"kubernetes.io/projected/f5037a43-3be3-4ba3-bed7-1ef82690e33e-kube-api-access-dzsnc\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747179 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-os-release\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747279 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747387 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-os-release\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747664 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-config\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747722 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44x4\" (UniqueName: \"kubernetes.io/projected/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-kube-api-access-c44x4\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747808 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-slash\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747842 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-var-lib-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747899 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-log-socket\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.747949 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-cni-bin\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748029 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-cni-multus\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748106 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-kubelet\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748140 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-env-overrides\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748149 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748191 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-log-socket\") pod 
\"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748197 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-slash\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.748263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/930a5a92-be71-4866-aa6f-95a98647bc33-multus-daemon-config\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.749773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-cni-multus\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.749818 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-cni-bin\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.749843 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-var-lib-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 
13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.750071 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-kubelet\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.750113 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5037a43-3be3-4ba3-bed7-1ef82690e33e-host\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751299 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5037a43-3be3-4ba3-bed7-1ef82690e33e-host\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-env-overrides\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-system-cni-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751390 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-system-cni-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751425 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-socket-dir-parent\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751777 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-socket-dir-parent\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-proxy-tls\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751925 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-system-cni-dir\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-system-cni-dir\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.751982 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b5e37b9c-1965-4321-a9fc-6babbd05c395-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-conf-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752069 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-netd\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/930a5a92-be71-4866-aa6f-95a98647bc33-cni-binary-copy\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752144 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-rootfs\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-os-release\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-hostroot\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-cnibin\") pod \"multus-additional-cni-plugins-pstw7\" (UID: 
\"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-cnibin\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752384 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-systemd-units\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752404 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-etc-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752451 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77lq5\" (UniqueName: \"kubernetes.io/projected/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-kube-api-access-77lq5\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752476 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trc5m\" (UniqueName: \"kubernetes.io/projected/930a5a92-be71-4866-aa6f-95a98647bc33-kube-api-access-trc5m\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 
13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752521 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-systemd\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752565 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovn-node-metrics-cert\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752629 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752649 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-kubelet\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752695 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-ovn\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5e37b9c-1965-4321-a9fc-6babbd05c395-cni-binary-copy\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-mcd-auth-proxy-config\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752791 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-cni-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752812 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-multus-certs\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752881 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5037a43-3be3-4ba3-bed7-1ef82690e33e-serviceca\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-node-log\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.752935 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-script-lib\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.753022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b5e37b9c-1965-4321-a9fc-6babbd05c395-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.753172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-systemd\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.753234 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.757685 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovn-node-metrics-cert\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.757942 4786 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758031 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758067 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-var-lib-kubelet\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-ovn\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" 
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758391 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5e37b9c-1965-4321-a9fc-6babbd05c395-cnibin\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-rootfs\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758515 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-os-release\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-hostroot\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.758936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5e37b9c-1965-4321-a9fc-6babbd05c395-cni-binary-copy\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759081 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-node-log\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-cni-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759148 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-host-run-multus-certs\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759632 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/930a5a92-be71-4866-aa6f-95a98647bc33-multus-daemon-config\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759800 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/930a5a92-be71-4866-aa6f-95a98647bc33-cni-binary-copy\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759897 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-cnibin\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " 
pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759933 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-systemd-units\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.759964 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/930a5a92-be71-4866-aa6f-95a98647bc33-multus-conf-dir\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-netd\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760354 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-etc-openvswitch\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760359 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-mcd-auth-proxy-config\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760396 
4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760764 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-script-lib\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760843 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760889 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760909 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760928 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760947 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760964 
4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760982 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.760999 4786 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761021 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761039 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761058 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761076 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761164 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761218 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761237 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761753 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761825 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761617 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5037a43-3be3-4ba3-bed7-1ef82690e33e-serviceca\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761849 4786 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.761971 4786 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762005 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762033 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762061 4786 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762087 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762113 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762143 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762173 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762199 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762225 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762254 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762278 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762301 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762328 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762353 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762377 4786 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762401 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762425 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762449 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762472 4786 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762495 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762519 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762548 4786 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762571 4786 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762596 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762622 4786 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762647 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762673 4786 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762698 4786 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762723 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762747 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762772 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762798 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762821 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762848 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762909 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762936 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" 
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.762981 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763010 4786 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763036 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763064 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763087 4786 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763111 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763136 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763160 4786 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763184 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763209 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763233 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763257 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763280 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763304 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763329 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763353 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763376 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763401 4786 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763425 4786 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763448 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763474 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763501 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763528 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763555 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763578 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763602 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763628 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763654 4786 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763681 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" 
DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763705 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763729 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763754 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763782 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763806 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763837 4786 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763900 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 
15:04:07.763911 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldhl\" (UniqueName: \"kubernetes.io/projected/b5e37b9c-1965-4321-a9fc-6babbd05c395-kube-api-access-gldhl\") pod \"multus-additional-cni-plugins-pstw7\" (UID: \"b5e37b9c-1965-4321-a9fc-6babbd05c395\") " pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763930 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763955 4786 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.763979 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764006 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764030 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764055 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.764079 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764114 4786 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764132 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764152 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764170 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764188 4786 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764206 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764224 4786 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764241 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764258 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764280 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764302 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764326 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764353 4786 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764376 4786 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764401 4786 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764425 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764448 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764473 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764500 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764528 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764552 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764577 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764601 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764625 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764649 4786 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764672 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764697 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764722 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 
13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764747 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764773 4786 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764798 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764823 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764848 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764911 4786 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764940 4786 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 
15:04:07.764963 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.764987 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765008 4786 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765033 4786 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765056 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765130 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765205 4786 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765233 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765259 4786 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765282 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765304 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765327 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765351 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765373 4786 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765397 4786 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.765421 4786 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765445 4786 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765468 4786 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765492 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765517 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765541 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765569 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765593 4786 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765618 4786 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765642 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765665 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765690 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765715 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765740 4786 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765764 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765790 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765816 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765842 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765904 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765932 4786 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765956 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.765979 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766005 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766030 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766056 4786 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766081 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766104 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766127 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766150 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" 
Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766174 4786 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766200 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766224 4786 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766248 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766271 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766294 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766317 4786 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766343 4786 reconciler_common.go:293] "Volume detached for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766366 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766391 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766415 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766439 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766463 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.766489 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.769572 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c44x4\" (UniqueName: \"kubernetes.io/projected/6b929603-1f9d-4b41-9bf8-528d7fd4ad56-kube-api-access-c44x4\") pod \"machine-config-daemon-zqb49\" (UID: \"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\") " pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.774463 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-282hv\" (UniqueName: \"kubernetes.io/projected/1ab1ab68-cf57-443b-aa31-ac336c1d86ce-kube-api-access-282hv\") pod \"node-resolver-wp7vg\" (UID: \"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\") " pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.776752 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzsnc\" (UniqueName: \"kubernetes.io/projected/f5037a43-3be3-4ba3-bed7-1ef82690e33e-kube-api-access-dzsnc\") pod \"node-ca-99jzx\" (UID: \"f5037a43-3be3-4ba3-bed7-1ef82690e33e\") " pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.778537 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77lq5\" (UniqueName: \"kubernetes.io/projected/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-kube-api-access-77lq5\") pod \"ovnkube-node-7b6g9\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.787053 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trc5m\" (UniqueName: \"kubernetes.io/projected/930a5a92-be71-4866-aa6f-95a98647bc33-kube-api-access-trc5m\") pod \"multus-mvpcz\" (UID: \"930a5a92-be71-4866-aa6f-95a98647bc33\") " pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.790017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:07 crc 
kubenswrapper[4786]: I0313 15:04:07.790063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.790083 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.790111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.790129 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.815298 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.831587 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.842558 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-99jzx" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.856630 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wp7vg" Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.858842 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b83ab9e81ea43f422267a5a6369739c854876887703acaed14cabc8eb1ebf2c3 WatchSource:0}: Error finding container b83ab9e81ea43f422267a5a6369739c854876887703acaed14cabc8eb1ebf2c3: Status 404 returned error can't find the container with id b83ab9e81ea43f422267a5a6369739c854876887703acaed14cabc8eb1ebf2c3 Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.867558 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.876053 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5037a43_3be3_4ba3_bed7_1ef82690e33e.slice/crio-5bb4c92371d7cb95ddb4892d38066cdef043fa10bcd6f9ab0015d857665c883d WatchSource:0}: Error finding container 5bb4c92371d7cb95ddb4892d38066cdef043fa10bcd6f9ab0015d857665c883d: Status 404 returned error can't find the container with id 5bb4c92371d7cb95ddb4892d38066cdef043fa10bcd6f9ab0015d857665c883d Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.877837 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.886022 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mvpcz" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.891112 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-99jzx" event={"ID":"f5037a43-3be3-4ba3-bed7-1ef82690e33e","Type":"ContainerStarted","Data":"5bb4c92371d7cb95ddb4892d38066cdef043fa10bcd6f9ab0015d857665c883d"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.892440 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b83ab9e81ea43f422267a5a6369739c854876887703acaed14cabc8eb1ebf2c3"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.892820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.892887 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.892906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.892929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.892948 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.894842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a14b85a1e13d764f942d2246b0d0bf174359fc5aa311d6588e89bc13f358c92a"} Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.897167 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pstw7" Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.902529 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab1ab68_cf57_443b_aa31_ac336c1d86ce.slice/crio-ed3278ac505ed9d33c099fedb86cd204b123cf36c97345f10a4002bbdc314ca2 WatchSource:0}: Error finding container ed3278ac505ed9d33c099fedb86cd204b123cf36c97345f10a4002bbdc314ca2: Status 404 returned error can't find the container with id ed3278ac505ed9d33c099fedb86cd204b123cf36c97345f10a4002bbdc314ca2 Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.904196 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.904998 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b929603_1f9d_4b41_9bf8_528d7fd4ad56.slice/crio-782251a5c2ccbd89a8912010b282d9a465b0085619dbc4e11d68f60efe632e2b WatchSource:0}: Error finding container 782251a5c2ccbd89a8912010b282d9a465b0085619dbc4e11d68f60efe632e2b: Status 404 returned error can't find the container with id 782251a5c2ccbd89a8912010b282d9a465b0085619dbc4e11d68f60efe632e2b Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.936198 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-16ecfc526ccf04fc78033a95d603041f829676717d1cb166b936341c8f0e7b5a WatchSource:0}: Error finding container 16ecfc526ccf04fc78033a95d603041f829676717d1cb166b936341c8f0e7b5a: Status 404 returned error can't find the container with id 16ecfc526ccf04fc78033a95d603041f829676717d1cb166b936341c8f0e7b5a Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.956569 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod930a5a92_be71_4866_aa6f_95a98647bc33.slice/crio-9c395393decf7f516881392c2586ac434c8042b731cabf4529d1ba61b0af22ee WatchSource:0}: Error finding container 9c395393decf7f516881392c2586ac434c8042b731cabf4529d1ba61b0af22ee: Status 404 returned error can't find the container with id 9c395393decf7f516881392c2586ac434c8042b731cabf4529d1ba61b0af22ee Mar 13 15:04:07 crc kubenswrapper[4786]: W0313 15:04:07.981791 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6a64e5_e5ca_401a_9653_e0419f9f46c4.slice/crio-7d571718649b373f903e968935f37898a3c1fa9c475267b78810fab24c60dc67 WatchSource:0}: Error finding container 7d571718649b373f903e968935f37898a3c1fa9c475267b78810fab24c60dc67: Status 404 returned error can't find the container with id 7d571718649b373f903e968935f37898a3c1fa9c475267b78810fab24c60dc67 Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.994882 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.994908 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.994917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.994930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:07 crc kubenswrapper[4786]: I0313 15:04:07.994940 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:07Z","lastTransitionTime":"2026-03-13T15:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.097758 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.097802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.097818 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.097837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.097876 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.170159 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170322 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 15:04:09.17029636 +0000 UTC m=+79.333508171 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.170388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.170414 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.170435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170492 4786 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170541 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:09.170530886 +0000 UTC m=+79.333742697 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170550 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170578 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170596 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170606 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:08 
crc kubenswrapper[4786]: E0313 15:04:08.170583 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:09.170573657 +0000 UTC m=+79.333785468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.170707 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:09.17069277 +0000 UTC m=+79.333904571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.200353 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.200389 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.200401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.200414 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.200424 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.271256 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.271487 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.271522 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.271537 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.271612 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:09.271595011 +0000 UTC m=+79.434806842 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.302141 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.302174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.302185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.302199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.302210 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.404455 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.404504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.404516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.404533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.404546 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.507625 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.507702 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.507728 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.507759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.507784 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.551325 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:08 crc kubenswrapper[4786]: E0313 15:04:08.551528 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.562821 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.563885 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.566152 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.567450 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.569385 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.570483 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.571747 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.576047 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.576712 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.577378 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.577885 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.578680 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.579210 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.579714 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.580234 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.580874 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.581575 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.584146 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.584728 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.585334 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.586212 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.586908 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.587826 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.588462 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.588991 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.590688 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.592071 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.592715 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.593334 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.594209 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.594671 4786 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.594773 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.596821 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.597338 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.597797 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.599384 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.600520 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.601134 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.601749 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.602832 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.603320 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.604235 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.605241 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.605814 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.606719 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.607248 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.608119 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.608831 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.609675 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.609696 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.609772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.609780 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.609794 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.609804 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.610197 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.610630 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.611488 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.612067 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.613016 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.712178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.712218 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.712227 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.712243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc 
kubenswrapper[4786]: I0313 15:04:08.712253 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.814388 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.814456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.814470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.814490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.814507 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.874770 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s"] Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.875235 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.877443 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.880451 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.898950 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:08Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.900747 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-99jzx" event={"ID":"f5037a43-3be3-4ba3-bed7-1ef82690e33e","Type":"ContainerStarted","Data":"df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.902614 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5e37b9c-1965-4321-a9fc-6babbd05c395" containerID="7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817" exitCode=0 Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.902687 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerDied","Data":"7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.902716 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerStarted","Data":"9c736878c5b647f9042749f781a907ae57e9286ed61f20b3b5e22dfa5f286847"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.905960 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.906025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.906054 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"16ecfc526ccf04fc78033a95d603041f829676717d1cb166b936341c8f0e7b5a"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.908697 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerStarted","Data":"4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.908751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerStarted","Data":"9c395393decf7f516881392c2586ac434c8042b731cabf4529d1ba61b0af22ee"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.911026 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wp7vg" 
event={"ID":"1ab1ab68-cf57-443b-aa31-ac336c1d86ce","Type":"ContainerStarted","Data":"701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.911095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wp7vg" event={"ID":"1ab1ab68-cf57-443b-aa31-ac336c1d86ce","Type":"ContainerStarted","Data":"ed3278ac505ed9d33c099fedb86cd204b123cf36c97345f10a4002bbdc314ca2"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.912628 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.914288 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128" exitCode=0 Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.914346 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.914370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"7d571718649b373f903e968935f37898a3c1fa9c475267b78810fab24c60dc67"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.916311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.916351 4786 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.916366 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.916383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.916397 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:08Z","lastTransitionTime":"2026-03-13T15:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.917880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.917926 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.917945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"782251a5c2ccbd89a8912010b282d9a465b0085619dbc4e11d68f60efe632e2b"} Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.927453 4786 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:08Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.954100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:08Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.968005 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:08Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.978275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db8c811b-a36c-4923-8b13-47f48d9ba696-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.978376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqx4\" (UniqueName: \"kubernetes.io/projected/db8c811b-a36c-4923-8b13-47f48d9ba696-kube-api-access-brqx4\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.978412 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db8c811b-a36c-4923-8b13-47f48d9ba696-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.978445 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db8c811b-a36c-4923-8b13-47f48d9ba696-env-overrides\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:08 crc kubenswrapper[4786]: I0313 15:04:08.987731 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:08Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.002964 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.017771 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.019624 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.019658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.019669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.019685 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.019697 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.032729 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.045356 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.057346 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.071980 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.079055 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db8c811b-a36c-4923-8b13-47f48d9ba696-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.080184 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brqx4\" (UniqueName: \"kubernetes.io/projected/db8c811b-a36c-4923-8b13-47f48d9ba696-kube-api-access-brqx4\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.080618 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db8c811b-a36c-4923-8b13-47f48d9ba696-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.081045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db8c811b-a36c-4923-8b13-47f48d9ba696-env-overrides\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.081495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db8c811b-a36c-4923-8b13-47f48d9ba696-env-overrides\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.081655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db8c811b-a36c-4923-8b13-47f48d9ba696-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.089005 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db8c811b-a36c-4923-8b13-47f48d9ba696-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.092829 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.107915 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.109329 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqx4\" (UniqueName: \"kubernetes.io/projected/db8c811b-a36c-4923-8b13-47f48d9ba696-kube-api-access-brqx4\") pod \"ovnkube-control-plane-749d76644c-74m8s\" (UID: \"db8c811b-a36c-4923-8b13-47f48d9ba696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.119515 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.123594 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.123650 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.123668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.123688 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.123725 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.131202 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.145729 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.158623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.173809 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.183753 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.183931 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:04:11.183906035 +0000 UTC m=+81.347117846 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.183984 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.184164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.184203 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184168 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184253 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184308 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:11.184285325 +0000 UTC m=+81.347497166 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184331 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184342 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:11.184329096 +0000 UTC m=+81.347540947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184351 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184364 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.184426 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:11.184416868 +0000 UTC m=+81.347628799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.189728 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da
540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.201018 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.202924 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.214334 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: W0313 15:04:09.214562 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8c811b_a36c_4923_8b13_47f48d9ba696.slice/crio-43a0933f3e79fa1087874589c72285f5dc977dd75d20ef2070c65f099e018976 WatchSource:0}: Error finding container 43a0933f3e79fa1087874589c72285f5dc977dd75d20ef2070c65f099e018976: Status 404 returned error can't find the container with id 43a0933f3e79fa1087874589c72285f5dc977dd75d20ef2070c65f099e018976 Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.226618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.226650 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.226661 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.226677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.226689 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.236556 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265
a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.259250 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.271381 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.284135 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.284632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.284773 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.284795 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:09 
crc kubenswrapper[4786]: E0313 15:04:09.284809 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.284852 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:11.284836787 +0000 UTC m=+81.448048608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.302336 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.329508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.329545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.329557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.329574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.329587 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.432506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.432555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.432575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.432605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.432627 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.540050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.540093 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.540111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.540128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.540141 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.551221 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.551320 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.551420 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:09 crc kubenswrapper[4786]: E0313 15:04:09.551572 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.643646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.643895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.643905 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.643918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.643926 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.746588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.746623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.746632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.746645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.746656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.849182 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.849202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.849209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.849221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.849229 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.922583 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" event={"ID":"db8c811b-a36c-4923-8b13-47f48d9ba696","Type":"ContainerStarted","Data":"09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.922632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" event={"ID":"db8c811b-a36c-4923-8b13-47f48d9ba696","Type":"ContainerStarted","Data":"43a0933f3e79fa1087874589c72285f5dc977dd75d20ef2070c65f099e018976"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.924305 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerStarted","Data":"4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.934683 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.934718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.941408 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.954187 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.954215 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.954225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.954241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.954252 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:09Z","lastTransitionTime":"2026-03-13T15:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.958103 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z 
is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.967828 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.981193 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:09 crc kubenswrapper[4786]: I0313 15:04:09.993783 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:09Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.009384 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.022168 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.035061 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.058246 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.059602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.059631 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.059639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.059652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.059663 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.071576 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.088511 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.102144 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.116101 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.128168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.128200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.128209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.128223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.128232 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.140844 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.147361 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.147397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.147406 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.147422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.147432 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.159343 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.163154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.163211 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.163229 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.163256 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.163278 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.174520 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.180382 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.180419 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.180430 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.180446 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.180458 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.192823 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.196396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.196448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.196457 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.196472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.196481 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.216073 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.216234 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.217894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.217946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.217963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.217986 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.218004 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.320783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.320810 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.320819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.320832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.320842 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.341970 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2v688"] Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.342369 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.342420 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.358569 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59
e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.374668 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.383428 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.400216 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.413079 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.422818 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.422851 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.422875 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc 
kubenswrapper[4786]: I0313 15:04:10.422891 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.422902 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.425799 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.435531 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.452348 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.464746 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.479471 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.487716 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc 
kubenswrapper[4786]: I0313 15:04:10.496402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmqjc\" (UniqueName: \"kubernetes.io/projected/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-kube-api-access-pmqjc\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.496452 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.503171 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.523342 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.525470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.525569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.525628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.525693 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.525766 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.539776 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.551060 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.551296 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.561355 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.579101 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b1
1b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.590574 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.597775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.597883 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmqjc\" (UniqueName: \"kubernetes.io/projected/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-kube-api-access-pmqjc\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.598085 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:10 crc kubenswrapper[4786]: E0313 15:04:10.598201 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:11.098175562 +0000 UTC m=+81.261387413 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.602775 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc 
kubenswrapper[4786]: I0313 15:04:10.620646 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.623960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmqjc\" (UniqueName: \"kubernetes.io/projected/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-kube-api-access-pmqjc\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.627639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.627819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.627830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.627844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.627868 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.644925 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.662050 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.681485 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.702039 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.718124 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.730725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.730772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.730786 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.730806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.730819 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.735011 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.748900 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.763822 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.785279 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.834094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.834152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.834163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.834196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.834221 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.937676 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.937722 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.937734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.937751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.937766 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:10Z","lastTransitionTime":"2026-03-13T15:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.952478 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.952554 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.952567 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.952580 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.954765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.959377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" event={"ID":"db8c811b-a36c-4923-8b13-47f48d9ba696","Type":"ContainerStarted","Data":"1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf"} Mar 13 15:04:10 
crc kubenswrapper[4786]: I0313 15:04:10.962047 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5e37b9c-1965-4321-a9fc-6babbd05c395" containerID="4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063" exitCode=0 Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.962118 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerDied","Data":"4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063"} Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.975966 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:10 crc kubenswrapper[4786]: I0313 15:04:10.997009 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.015204 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.030598 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.041469 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.041504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.041516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.041808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.041831 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.043033 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.057832 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.071682 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc 
kubenswrapper[4786]: I0313 15:04:11.084699 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d
zsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.100375 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 
2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.103751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.103918 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.104917 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:12.104149255 +0000 UTC m=+82.267361076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.116327 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.130349 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.146226 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.146297 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.146318 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.146351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.146373 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.146821 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.166742 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.179748 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.191684 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.203963 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.206970 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:11 crc 
kubenswrapper[4786]: E0313 15:04:11.207178 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:04:15.207143077 +0000 UTC m=+85.370354888 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.207247 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.207303 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.207340 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207432 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207493 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:15.207472515 +0000 UTC m=+85.370684316 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207535 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207578 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:15.207571038 +0000 UTC m=+85.370782849 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207586 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207621 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207635 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.207689 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:15.20767151 +0000 UTC m=+85.370883311 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.222422 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.231873 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.243333 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.249071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.249106 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.249114 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.249129 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.249139 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.254961 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.267779 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.277933 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.287499 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.297862 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc 
kubenswrapper[4786]: I0313 15:04:11.308330 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.308525 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.308567 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.308580 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.308654 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:15.308628173 +0000 UTC m=+85.471839984 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.310291 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.319467 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c821
12fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.333227 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.348530 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.350933 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.350970 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.350979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.351001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.351010 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.453897 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.453939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.453949 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.453963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.453973 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.551403 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.551525 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.551403 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.552208 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.552325 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:11 crc kubenswrapper[4786]: E0313 15:04:11.552451 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.556609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.556669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.556688 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.556710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.556727 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.660063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.660116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.660127 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.660146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.660157 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.765081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.765146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.765168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.765201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.765226 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.868078 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.868320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.868346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.868375 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.868399 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.972198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.973184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.973224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.973247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.973279 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:11Z","lastTransitionTime":"2026-03-13T15:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.976484 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5e37b9c-1965-4321-a9fc-6babbd05c395" containerID="eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3" exitCode=0
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.976745 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerDied","Data":"eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3"}
Mar 13 15:04:11 crc kubenswrapper[4786]: I0313 15:04:11.997617 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:11Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.030987 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.051184 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.064496 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.076380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.076424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.076435 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.076453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.076466 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.082353 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.105174 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.117184 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688"
Mar 13 15:04:12 crc kubenswrapper[4786]: E0313 15:04:12.117386 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 15:04:12 crc kubenswrapper[4786]: E0313 15:04:12.117464 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:14.117443152 +0000 UTC m=+84.280654973 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.122958 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.146054 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.173176 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc 
kubenswrapper[4786]: I0313 15:04:12.190152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.190196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.190208 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.190225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.190237 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.194783 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.212434 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.235510 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.257262 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.277467 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:12Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 
15:04:12.292146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.292173 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.292181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.292197 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.292207 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.394599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.394642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.394653 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.394670 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.394682 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.498217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.498281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.498300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.498327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.498344 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.552218 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:12 crc kubenswrapper[4786]: E0313 15:04:12.552397 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.601684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.601754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.601773 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.601799 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.601817 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.704410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.704481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.704501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.704529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.704548 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.808076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.808164 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.808183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.808208 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.808225 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.911787 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.911852 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.911905 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.911935 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.911953 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:12Z","lastTransitionTime":"2026-03-13T15:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.987291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5"} Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.991147 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5e37b9c-1965-4321-a9fc-6babbd05c395" containerID="9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87" exitCode=0 Mar 13 15:04:12 crc kubenswrapper[4786]: I0313 15:04:12.991218 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerDied","Data":"9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.010272 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.015235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.015268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.015279 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.015294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.015306 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.032669 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.055724 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.071323 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.090289 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.117798 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.117839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.117850 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.117885 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.117895 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.120787 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.132429 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.149987 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc 
kubenswrapper[4786]: I0313 15:04:13.171987 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.193841 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.211446 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.221338 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.221380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.221388 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.221403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.221413 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.233026 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z 
is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.249152 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.267709 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:13Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.324561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.324608 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.324623 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.324644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.324660 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.427251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.427303 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.427321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.427343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.427361 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.529674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.529783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.529804 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.529833 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.529884 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.551652 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.551681 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:13 crc kubenswrapper[4786]: E0313 15:04:13.551776 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:13 crc kubenswrapper[4786]: E0313 15:04:13.551937 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.551653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:13 crc kubenswrapper[4786]: E0313 15:04:13.552072 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.633369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.633438 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.633460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.633490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.633515 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.736741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.736831 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.736843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.736880 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.736891 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.839800 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.839881 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.839894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.839909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.839921 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.942768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.942821 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.942839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.942893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.942913 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:13Z","lastTransitionTime":"2026-03-13T15:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:13 crc kubenswrapper[4786]: I0313 15:04:13.999356 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5e37b9c-1965-4321-a9fc-6babbd05c395" containerID="14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298" exitCode=0 Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:13.999427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerDied","Data":"14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298"} Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.016847 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.035300 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.045417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.045461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.045478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.045501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.045518 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.055849 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z 
is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.087082 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.104073 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.120735 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc 
kubenswrapper[4786]: I0313 15:04:14.136226 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.138632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:14 crc kubenswrapper[4786]: E0313 15:04:14.138818 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:14 crc kubenswrapper[4786]: E0313 15:04:14.138925 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:18.13890385 +0000 UTC m=+88.302115671 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.148134 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.148236 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.148294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.148320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.148338 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.158763 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.169762 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.182942 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.197850 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.213961 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.242815 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.250249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.250285 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.250297 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.250312 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.250322 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.254389 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:14Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.352950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.353166 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.353245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.353322 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.353380 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.455728 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.456117 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.456256 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.456392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.456520 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.590422 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 15:04:14 crc kubenswrapper[4786]: E0313 15:04:14.590686 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.591777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.591834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.591853 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.591909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.591928 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.694738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.695113 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.695131 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.695154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.695173 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.797668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.797727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.797748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.797772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.797790 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.900445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.900507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.900519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.900558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:14 crc kubenswrapper[4786]: I0313 15:04:14.900571 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:14Z","lastTransitionTime":"2026-03-13T15:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.002964 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.002999 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.003007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.003022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.003037 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.006972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba"}
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.007090 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.007257 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.007297 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.013016 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5e37b9c-1965-4321-a9fc-6babbd05c395" containerID="6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35" exitCode=0
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.013078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerDied","Data":"6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35"}
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.024539 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.042941 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.047412 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.050015 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.064972 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.075006 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.090194 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.105557 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7
aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.106768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.106889 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.106917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.106949 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.106974 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.117153 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.130202 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.140918 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc 
kubenswrapper[4786]: I0313 15:04:15.159102 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.172677 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.185344 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.197672 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.209739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.209781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.209791 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.209807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.209820 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.213203 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.229268 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.244464 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.249371 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.249503 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.249531 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.249555 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249584 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:04:23.249554742 +0000 UTC m=+93.412766553 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249630 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249679 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:23.249664305 +0000 UTC m=+93.412876126 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249729 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249772 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:23.249758697 +0000 UTC m=+93.412970508 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249830 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249840 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249850 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.249901 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:23.24989434 +0000 UTC m=+93.413106141 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.259227 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.274222 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.289801 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.307509 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.311523 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 
15:04:15.311593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.311606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.311621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.311632 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.322568 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.335159 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.350606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.350726 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.350751 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.350765 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.350822 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:23.350807082 +0000 UTC m=+93.514018903 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.350782 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc 
kubenswrapper[4786]: I0313 15:04:15.378977 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.392834 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.405600 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.414076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.414112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.414124 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.414140 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.414153 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.430647 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.443728 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:15Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.517033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.517075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.517091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.517113 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.517130 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.551622 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.551835 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.551982 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.552180 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.551992 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:15 crc kubenswrapper[4786]: E0313 15:04:15.552359 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.620715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.620776 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.620799 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.620828 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.620850 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.724414 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.724483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.724499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.724520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.724538 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.826576 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.826615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.826627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.826641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.826654 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.929704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.929761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.929778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.929802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:15 crc kubenswrapper[4786]: I0313 15:04:15.929821 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:15Z","lastTransitionTime":"2026-03-13T15:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.019964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" event={"ID":"b5e37b9c-1965-4321-a9fc-6babbd05c395","Type":"ContainerStarted","Data":"dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.032839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.032883 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.032892 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.032908 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.032921 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.040014 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.072334 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.107957 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.122781 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.135314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.135355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.135369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.135384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.135395 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.137705 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.158966 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.169923 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.181657 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc 
kubenswrapper[4786]: I0313 15:04:16.198818 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.210100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.222549 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.237030 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.238211 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.238251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.238262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.238297 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.238309 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.249220 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.261979 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:16Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.341061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.341092 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.341101 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.341135 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.341146 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.446919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.446981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.446998 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.447023 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.447040 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.550021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.550075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.550092 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.550118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.550141 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.553914 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:16 crc kubenswrapper[4786]: E0313 15:04:16.554206 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.661531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.661576 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.661596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.661616 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.661629 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.764587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.764631 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.764682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.764704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.764721 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.867272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.867343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.867364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.867390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.867408 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.970313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.970506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.970515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.970527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:16 crc kubenswrapper[4786]: I0313 15:04:16.970536 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:16Z","lastTransitionTime":"2026-03-13T15:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.073476 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.073536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.073557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.073581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.073599 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.176468 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.176527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.176545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.176567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.176583 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.279450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.279501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.279518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.279540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.279557 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.382719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.382764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.382773 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.382786 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.382796 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.485503 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.485569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.485586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.485611 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.485633 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.551225 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:17 crc kubenswrapper[4786]: E0313 15:04:17.551342 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.553487 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:17 crc kubenswrapper[4786]: E0313 15:04:17.553541 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.553903 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:17 crc kubenswrapper[4786]: E0313 15:04:17.553955 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.561514 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.571832 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29" Mar 13 15:04:17 crc kubenswrapper[4786]: E0313 15:04:17.572015 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.572045 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.589362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.589416 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.589433 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.589457 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.589477 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.692596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.692655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.692664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.692681 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.692692 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.796114 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.796263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.796302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.796325 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.796349 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.899781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.899881 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.899909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.899939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:17 crc kubenswrapper[4786]: I0313 15:04:17.899963 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:17Z","lastTransitionTime":"2026-03-13T15:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.003641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.004089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.004150 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.004184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.004208 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.029729 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/0.log" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.033634 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba" exitCode=1 Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.033748 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.034802 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29" Mar 13 15:04:18 crc kubenswrapper[4786]: E0313 15:04:18.035141 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.035184 4786 scope.go:117] "RemoveContainer" containerID="f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.055335 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.076556 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.106025 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.109003 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.109026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.109036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.109053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.109063 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.128353 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.158230 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.174981 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc 
kubenswrapper[4786]: I0313 15:04:18.178174 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:18 crc kubenswrapper[4786]: E0313 15:04:18.178496 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:18 crc kubenswrapper[4786]: E0313 15:04:18.178584 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:26.178561368 +0000 UTC m=+96.341773219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.200189 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.212217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.212272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.212285 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.212304 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.212319 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.221117 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.239665 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.264546 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.293719 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.313117 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.316178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.316264 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.316306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.316327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.316341 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.333180 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.356395 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:17Z\\\",\\\"message\\\":\\\"tworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.306421 6418 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306474 6418 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306506 6418 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.306666 6418 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.307739 6418 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.307784 6418 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.307837 6418 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d9
47ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.371022 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.394647 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:18Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.418616 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.418645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.418657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.418674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.418688 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.528500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.528532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.528541 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.528555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.528568 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.551624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:18 crc kubenswrapper[4786]: E0313 15:04:18.552064 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.630934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.630981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.630993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.631012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.631024 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.733299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.733657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.733912 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.735007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.735141 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.838296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.838613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.838798 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.838973 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.839115 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.941652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.942041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.942259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.942423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:18 crc kubenswrapper[4786]: I0313 15:04:18.942734 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:18Z","lastTransitionTime":"2026-03-13T15:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.040688 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/0.log" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.044393 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.044894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.044924 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.044937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.044953 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.044967 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.052927 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.078247 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.104500 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.129311 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.147597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.147645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.147662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.147684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.147701 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.150601 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z 
is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.172112 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.190914 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.215561 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc 
kubenswrapper[4786]: I0313 15:04:19.226496 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.248385 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.249705 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.249783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.249809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.249840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.249911 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.267390 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.288753 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.304943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.320478 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.346113 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:17Z\\\",\\\"message\\\":\\\"tworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.306421 6418 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306474 6418 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306506 6418 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.306666 6418 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.307739 6418 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.307784 6418 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.307837 6418 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.353197 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.353246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.353258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.353275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.353289 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.368682 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.382612 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:19Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.456252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.456291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.456307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.456330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.456347 4786 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.551232 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.551288 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.551372 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:19 crc kubenswrapper[4786]: E0313 15:04:19.551584 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:19 crc kubenswrapper[4786]: E0313 15:04:19.551952 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:19 crc kubenswrapper[4786]: E0313 15:04:19.552014 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.558832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.558903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.558920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.558941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.558958 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.661910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.661968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.661992 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.662020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.662048 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.765152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.765202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.765216 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.765233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.765246 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.868145 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.868200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.868224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.868252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.868273 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.971315 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.971382 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.971404 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.971431 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:19 crc kubenswrapper[4786]: I0313 15:04:19.971452 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:19Z","lastTransitionTime":"2026-03-13T15:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.050556 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/1.log" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.051543 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/0.log" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.054822 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f" exitCode=1 Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.054874 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.054916 4786 scope.go:117] "RemoveContainer" containerID="f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.056026 4786 scope.go:117] "RemoveContainer" containerID="aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.056284 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.074280 4786 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.074321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.074333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.074351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.074364 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.079742 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.107347 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.120844 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.133385 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.160654 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:17Z\\\",\\\"message\\\":\\\"tworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.306421 6418 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306474 6418 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306506 6418 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.306666 6418 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.307739 6418 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.307784 6418 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.307837 6418 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:19Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138522 6754 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138561 6754 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.146763 6754 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 15:04:19.146793 6754 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0313 15:04:19.146809 6754 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 15:04:19.146824 6754 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 15:04:19.148570 6754 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 15:04:19.148628 6754 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 15:04:19.149486 6754 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 15:04:19.149673 6754 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 15:04:19.149708 6754 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 15:04:19.149751 6754 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 15:04:19.149769 6754 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 15:04:19.150353 6754 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/c
ni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.178076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc 
kubenswrapper[4786]: I0313 15:04:20.178128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.178145 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.178167 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.178184 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.180667 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.199355 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.225211 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.225712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.225809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.225950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.226035 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.225220 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.266811 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.273844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.273959 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.273981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.274011 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.274033 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.278848 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.294734 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.300302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.300454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.300552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.300651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.300736 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.301850 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:
04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.314932 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.316425 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee51
2f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.319725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.319902 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.320224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.320660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.320782 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.333080 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.337100 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.340984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.341045 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.341095 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.341126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.341151 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.345998 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc 
kubenswrapper[4786]: E0313 15:04:20.355750 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.355998 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.357466 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.357494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.357508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.357527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.357556 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.363703 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.378102 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.392360 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.460901 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.460954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.460971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.460996 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.461014 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.551537 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:20 crc kubenswrapper[4786]: E0313 15:04:20.551735 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.564056 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.564462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.564603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.564737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.564890 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.576774 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.594750 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.612632 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.631978 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac41458
0642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.653138 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.669247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.669604 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.669796 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.669178 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc 
kubenswrapper[4786]: I0313 15:04:20.670024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.670221 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.689252 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.712168 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.732059 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.752942 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e218
7bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.774123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.774177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.774196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.774220 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.774243 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.776142 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"cont
ainerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.789396 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.802913 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.830791 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f01fa0d57abee97edc0315cc8f252730cb609fe2962194c72496d1beec3055ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:17Z\\\",\\\"message\\\":\\\"tworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.306421 6418 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306474 6418 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.306506 6418 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.306666 6418 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 15:04:17.307739 6418 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 15:04:17.307784 6418 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:17.307837 6418 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:19Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138522 6754 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138561 6754 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.146763 6754 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 15:04:19.146793 6754 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI0313 15:04:19.146809 6754 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 15:04:19.146824 6754 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 15:04:19.148570 6754 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 15:04:19.148628 6754 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 15:04:19.149486 6754 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 15:04:19.149673 6754 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 15:04:19.149708 6754 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 15:04:19.149751 6754 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 15:04:19.149769 6754 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 15:04:19.150353 6754 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/c
ni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.848118 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.868486 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.876169 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.876217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.876236 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.876260 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.876281 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.978451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.978480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.978491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.978509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:20 crc kubenswrapper[4786]: I0313 15:04:20.978521 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:20Z","lastTransitionTime":"2026-03-13T15:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.060043 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/1.log" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.064077 4786 scope.go:117] "RemoveContainer" containerID="aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f" Mar 13 15:04:21 crc kubenswrapper[4786]: E0313 15:04:21.064275 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.080798 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.080943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.080973 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.081014 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.081043 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.086938 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988e
ecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.108404 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.127745 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.145192 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.181453 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:19Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138522 6754 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138561 6754 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.146763 6754 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 15:04:19.146793 6754 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 15:04:19.146809 6754 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 15:04:19.146824 6754 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 15:04:19.148570 6754 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 15:04:19.148628 6754 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 15:04:19.149486 6754 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 15:04:19.149673 6754 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 15:04:19.149708 6754 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 15:04:19.149751 6754 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 15:04:19.149769 6754 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 15:04:19.150353 6754 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.185577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.185625 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.185644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.185675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.185697 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.201426 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.221736 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.236351 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.248613 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.260521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.273107 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.285438 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.288248 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.288305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.288327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.288351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.288368 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.298144 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc 
kubenswrapper[4786]: I0313 15:04:21.313850 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.329710 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.344040 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.390645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.390681 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.390692 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.390709 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.390720 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.493349 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.493400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.493412 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.493429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.493441 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.551454 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.551497 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.551520 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:21 crc kubenswrapper[4786]: E0313 15:04:21.551667 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:21 crc kubenswrapper[4786]: E0313 15:04:21.551798 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:21 crc kubenswrapper[4786]: E0313 15:04:21.552023 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.596641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.596677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.596687 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.596703 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.596714 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.699120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.699184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.699208 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.699239 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.699258 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.802673 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.802715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.802727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.802741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.802752 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.911534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.911580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.911591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.911607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:21 crc kubenswrapper[4786]: I0313 15:04:21.911619 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:21Z","lastTransitionTime":"2026-03-13T15:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.015047 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.015122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.015147 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.015179 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.015201 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.117497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.117558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.117577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.117600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.117617 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.219591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.219641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.219660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.219684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.219701 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.323191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.323228 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.323237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.323252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.323264 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.425906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.425961 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.425978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.426001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.426017 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.529292 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.529343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.529360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.529386 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.529404 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.551717 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:22 crc kubenswrapper[4786]: E0313 15:04:22.551815 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.631688 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.631765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.631791 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.631822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.631849 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.734592 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.734627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.734639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.734654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.734665 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.836820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.837002 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.837074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.837143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.837224 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.940137 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.940182 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.940194 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.940211 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:22 crc kubenswrapper[4786]: I0313 15:04:22.940223 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:22Z","lastTransitionTime":"2026-03-13T15:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.043377 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.043425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.043436 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.043452 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.043461 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.146205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.146253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.146264 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.146282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.146295 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.249013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.249061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.249072 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.249086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.249097 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.336300 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336487 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 15:04:39.336461383 +0000 UTC m=+109.499673194 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.336539 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.336582 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.336604 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336732 4786 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336764 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:39.33675798 +0000 UTC m=+109.499969791 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336786 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336801 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336846 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336889 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336923 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:39.336890633 +0000 UTC m=+109.500102484 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.336964 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:39.336946785 +0000 UTC m=+109.500158706 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.350772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.350811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.350826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.350840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.350866 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.437634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.437893 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.437937 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.437960 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.438053 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:39.4380238 +0000 UTC m=+109.601235651 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.452969 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.453040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.453064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.453094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.453116 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.551179 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.551176 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.551664 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.551264 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.551828 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:23 crc kubenswrapper[4786]: E0313 15:04:23.552010 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.556997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.557081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.557108 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.557140 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.557163 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.660254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.660317 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.660338 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.660362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.660379 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.762600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.762636 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.762645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.762659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.762668 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.865024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.865077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.865096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.865119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.865136 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.967655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.967698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.967711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.967726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:23 crc kubenswrapper[4786]: I0313 15:04:23.967738 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:23Z","lastTransitionTime":"2026-03-13T15:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.069410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.069463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.069483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.069511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.069531 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.172629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.172718 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.172741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.172773 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.172793 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.275498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.275567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.275590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.275623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.275645 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.379694 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.379745 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.379755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.379777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.379789 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.482630 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.482683 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.482697 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.482716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.482731 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.551616 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:24 crc kubenswrapper[4786]: E0313 15:04:24.552042 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.585410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.585662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.586010 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.586235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.586447 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.689206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.689260 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.689284 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.689313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.689335 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.791955 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.792015 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.792024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.792041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.792051 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.894690 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.894737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.894753 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.894777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.894793 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.996955 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.996984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.996994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.997008 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:24 crc kubenswrapper[4786]: I0313 15:04:24.997017 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:24Z","lastTransitionTime":"2026-03-13T15:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.099645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.099684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.099697 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.099711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.099722 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.201923 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.201946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.201954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.201965 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.201973 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.307654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.307706 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.307718 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.307736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.307749 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.410411 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.410471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.410492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.410517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.410533 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.512948 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.513014 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.513033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.513058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.513077 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.551594 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.551680 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.551752 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:25 crc kubenswrapper[4786]: E0313 15:04:25.552145 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:25 crc kubenswrapper[4786]: E0313 15:04:25.552453 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:25 crc kubenswrapper[4786]: E0313 15:04:25.552515 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.568350 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.615513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.615575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.615595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.615619 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.615637 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.718586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.718671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.718695 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.718727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.718750 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.821286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.821320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.821329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.821341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.821350 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.924380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.924448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.924466 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.924490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:25 crc kubenswrapper[4786]: I0313 15:04:25.924516 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:25Z","lastTransitionTime":"2026-03-13T15:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.027165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.027229 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.027246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.027272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.027289 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.129713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.129808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.129827 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.129895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.129917 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.233055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.233162 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.233187 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.233217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.233238 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.267076 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:26 crc kubenswrapper[4786]: E0313 15:04:26.267347 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:26 crc kubenswrapper[4786]: E0313 15:04:26.267468 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:04:42.267433387 +0000 UTC m=+112.430645248 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.336564 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.336637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.336658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.336685 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.336706 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.439421 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.439476 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.439496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.439518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.439534 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.519194 4786 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.542322 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.542384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.542400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.542423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.542440 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.551699 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:26 crc kubenswrapper[4786]: E0313 15:04:26.551958 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.645678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.645730 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.645741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.645760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.645771 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.748636 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.748695 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.748716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.748738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.748755 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.852008 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.852107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.852132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.852162 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.852189 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.955510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.955573 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.955590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.955615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:26 crc kubenswrapper[4786]: I0313 15:04:26.955632 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:26Z","lastTransitionTime":"2026-03-13T15:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.058445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.058502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.058516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.058539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.058552 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.161571 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.161615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.161627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.161646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.161663 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.265228 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.265276 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.265289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.265305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.265318 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.368346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.368400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.368417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.368440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.368458 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.470213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.470281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.470296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.470321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.470337 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.551524 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.551598 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.551527 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:27 crc kubenswrapper[4786]: E0313 15:04:27.551727 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:27 crc kubenswrapper[4786]: E0313 15:04:27.551888 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:27 crc kubenswrapper[4786]: E0313 15:04:27.552120 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.574065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.574124 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.574146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.574169 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.574188 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.678265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.678481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.678513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.678590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.678616 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.781364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.781413 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.781426 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.781444 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.781457 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.884618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.884701 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.884716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.884735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.884745 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.987384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.987445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.987463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.987487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:27 crc kubenswrapper[4786]: I0313 15:04:27.987504 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:27Z","lastTransitionTime":"2026-03-13T15:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.090370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.090423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.090439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.090462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.090478 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.193762 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.193811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.193830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.193875 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.193894 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.296225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.296277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.296296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.296320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.296340 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.398837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.398926 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.398943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.398964 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.398982 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.502530 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.502610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.502638 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.502680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.502698 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.551239 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:28 crc kubenswrapper[4786]: E0313 15:04:28.551396 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.604944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.604975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.604984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.604995 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.605004 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.707222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.707251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.707259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.707272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.707281 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.810297 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.810361 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.810378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.810401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.810419 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.913135 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.913209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.913221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.913256 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:28 crc kubenswrapper[4786]: I0313 15:04:28.913270 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:28Z","lastTransitionTime":"2026-03-13T15:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.016563 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.016628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.016650 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.016678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.016706 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.120075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.120181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.120198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.120230 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.120256 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.224415 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.224479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.224498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.224523 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.224542 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.327643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.327699 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.327715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.327733 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.327748 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.432319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.432383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.432400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.432426 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.432444 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.536795 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.536899 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.536924 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.536953 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.536975 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.551455 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.551583 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:29 crc kubenswrapper[4786]: E0313 15:04:29.551770 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.551820 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:29 crc kubenswrapper[4786]: E0313 15:04:29.552016 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:29 crc kubenswrapper[4786]: E0313 15:04:29.552169 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.640495 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.640572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.640597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.640625 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.640647 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.743282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.743334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.743348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.743364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.743376 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.846358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.846429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.846454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.846483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.846508 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.949001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.949068 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.949089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.949115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:29 crc kubenswrapper[4786]: I0313 15:04:29.949135 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:29Z","lastTransitionTime":"2026-03-13T15:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.051957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.052016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.052051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.052076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.052095 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.155268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.155333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.155355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.155380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.155399 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.259069 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.259146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.259170 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.259201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.259224 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.362231 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.362290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.362308 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.362332 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.362355 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.386195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.386235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.386245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.386259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.386270 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.402169 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.406582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.406622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.406632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.406646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.406656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.418767 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.422429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.422490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.422515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.422543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.422565 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.435802 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.439558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.439587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.439598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.439645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.439657 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.454215 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.458307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.458422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.458446 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.458502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.458521 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.477427 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.477567 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.479731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.479788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.479811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.479842 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.479894 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.551469 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:30 crc kubenswrapper[4786]: E0313 15:04:30.551838 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.576014 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.585614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.585697 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.585721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.585751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.585779 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.589493 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.609981 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.624828 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.641991 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.656177 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc 
kubenswrapper[4786]: I0313 15:04:30.671454 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.688944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.689011 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.689028 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.689053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.689071 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.689842 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.704970 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.720053 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e218
7bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.737185 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.752959 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.766774 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.793491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.793522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.793533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.793549 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.793560 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.801346 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:19Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138522 6754 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138561 6754 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.146763 6754 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 15:04:19.146793 6754 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 15:04:19.146809 6754 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 15:04:19.146824 6754 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 15:04:19.148570 6754 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 15:04:19.148628 6754 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 15:04:19.149486 6754 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 15:04:19.149673 6754 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 15:04:19.149708 6754 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 15:04:19.149751 6754 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 15:04:19.149769 6754 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 15:04:19.150353 6754 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.843807 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.869050 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.882297 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:30Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.896174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.896213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.896232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.896251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.896267 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:30Z","lastTransitionTime":"2026-03-13T15:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.999901 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:30 crc kubenswrapper[4786]: I0313 15:04:30.999986 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.000010 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.000037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.000084 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.102429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.102511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.102528 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.102581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.102602 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.205667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.205719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.205740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.205771 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.205791 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.309302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.309417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.309439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.309494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.309512 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.413039 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.413102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.413121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.413145 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.413164 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.515949 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.516031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.516068 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.516098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.516124 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.551724 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.551781 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 15:04:31 crc kubenswrapper[4786]: E0313 15:04:31.551948 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.552001 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 15:04:31 crc kubenswrapper[4786]: E0313 15:04:31.552467 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 15:04:31 crc kubenswrapper[4786]: E0313 15:04:31.552742 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.552941 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29"
Mar 13 15:04:31 crc kubenswrapper[4786]: E0313 15:04:31.553513 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.618587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.618695 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.618721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.618748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.618769 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.725712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.725813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.725913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.725986 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.726007 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.829496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.829550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.829566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.829589 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.829607 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.932575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.932627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.932645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.932667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:31 crc kubenswrapper[4786]: I0313 15:04:31.932685 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:31Z","lastTransitionTime":"2026-03-13T15:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.035585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.035628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.035642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.035659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.035671 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.138613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.138678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.138697 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.138722 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.138741 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.241582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.241648 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.241672 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.241704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.241728 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.344755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.344823 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.344845 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.344910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.344936 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.448363 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.448420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.448439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.448467 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.448486 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.551491 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.552119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.552397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.552646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.552831 4786 scope.go:117] "RemoveContainer" containerID="aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.553106 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: E0313 15:04:32.552215 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.553315 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.656558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.656958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.657096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.657224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.657361 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.760908 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.760956 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.760975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.760996 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.761014 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.864271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.864322 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.864333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.864350 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.864361 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.967773 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.967827 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.967844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.967893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:32 crc kubenswrapper[4786]: I0313 15:04:32.967910 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:32Z","lastTransitionTime":"2026-03-13T15:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.070849 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.070953 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.070978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.071021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.071058 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.103678 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/1.log" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.107361 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.107796 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.123918 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314
ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.139189 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc 
kubenswrapper[4786]: I0313 15:04:33.155047 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.173527 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.174167 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.174219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.174237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.174259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.174276 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.195115 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.217786 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.231415 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.243443 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.254483 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.265636 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.280201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.280265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.280285 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.280314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.280330 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.284388 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.303332 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.316744 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.329779 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.348825 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:19Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138522 6754 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138561 6754 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.146763 6754 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 15:04:19.146793 6754 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 15:04:19.146809 6754 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 15:04:19.146824 6754 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 15:04:19.148570 6754 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 15:04:19.148628 6754 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 15:04:19.149486 6754 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 15:04:19.149673 6754 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 15:04:19.149708 6754 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 15:04:19.149751 6754 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 15:04:19.149769 6754 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 15:04:19.150353 6754 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.369014 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392
c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.383782 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.384032 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.384191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.384212 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.384239 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.384263 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.487193 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.487261 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.487288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.487318 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.487339 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.551645 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.551750 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.551829 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:33 crc kubenswrapper[4786]: E0313 15:04:33.551772 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:33 crc kubenswrapper[4786]: E0313 15:04:33.552028 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:33 crc kubenswrapper[4786]: E0313 15:04:33.552134 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.590254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.590291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.590299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.590313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.590322 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.693542 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.693595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.693613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.693645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.693665 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.796253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.796329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.796346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.796376 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.796398 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.899439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.899505 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.899529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.899558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:33 crc kubenswrapper[4786]: I0313 15:04:33.899584 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:33Z","lastTransitionTime":"2026-03-13T15:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.002241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.002278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.002288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.002321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.002333 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.105450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.105501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.105517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.105538 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.105554 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.113174 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/2.log" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.114133 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/1.log" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.118058 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592" exitCode=1 Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.118108 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.118161 4786 scope.go:117] "RemoveContainer" containerID="aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.119287 4786 scope.go:117] "RemoveContainer" containerID="0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592" Mar 13 15:04:34 crc kubenswrapper[4786]: E0313 15:04:34.119594 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.144107 4786 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.162461 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.179829 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc 
kubenswrapper[4786]: I0313 15:04:34.201004 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.208844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.208918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.208935 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.208957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.208974 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.222486 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.240959 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.261506 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.278491 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.296769 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.311797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.311886 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.311907 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.311931 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.311948 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.315533 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.331914 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.351602 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.374623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.408573 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e37
27d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.415690 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.415732 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.415744 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.415762 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.415775 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.431830 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.449776 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.479845 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aef5e19154ff1b9b957f7efa477d2d320dbc74762802d41700eb1aba4089f79f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:19Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138522 6754 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.138561 6754 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 15:04:19.146763 6754 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 15:04:19.146793 6754 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 15:04:19.146809 6754 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 15:04:19.146824 6754 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 15:04:19.148570 6754 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 15:04:19.148628 6754 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 15:04:19.149486 6754 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 15:04:19.149673 6754 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 15:04:19.149708 6754 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 15:04:19.149751 6754 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 15:04:19.149769 6754 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 15:04:19.150353 6754 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 
1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:34Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 
15:04:34.518997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.519045 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.519063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.519084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.519099 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.551893 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:34 crc kubenswrapper[4786]: E0313 15:04:34.552004 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.621529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.621583 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.621593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.621608 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.621619 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.724687 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.724739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.724750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.724763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.724772 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.828050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.828119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.828137 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.828161 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.828177 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.931331 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.931378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.931395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.931418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:34 crc kubenswrapper[4786]: I0313 15:04:34.931434 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:34Z","lastTransitionTime":"2026-03-13T15:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.034577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.034644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.034662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.034689 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.034705 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.124344 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/2.log" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.128089 4786 scope.go:117] "RemoveContainer" containerID="0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592" Mar 13 15:04:35 crc kubenswrapper[4786]: E0313 15:04:35.128237 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.138375 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.138416 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.138425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.138442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.138453 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.143957 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.155800 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.172111 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.186686 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.215134 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.231681 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.241042 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.241075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.241083 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.241097 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.241106 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.249184 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.281516 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.297362 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392
c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.312380 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.325789 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc 
kubenswrapper[4786]: I0313 15:04:35.341141 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.342785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.342816 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.342844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.342881 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.342893 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.360505 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.371577 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.389914 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.404824 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.416198 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:35Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.448921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.448981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.448995 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.449012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.449028 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.551008 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.551057 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.551027 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:35 crc kubenswrapper[4786]: E0313 15:04:35.551135 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:35 crc kubenswrapper[4786]: E0313 15:04:35.551299 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:35 crc kubenswrapper[4786]: E0313 15:04:35.551574 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.553449 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.553529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.553552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.553583 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.553604 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.656976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.657044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.657064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.657087 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.657105 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.759745 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.759808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.759827 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.759851 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.759899 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.862442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.862503 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.862520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.862545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.862562 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.965929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.966015 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.966040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.966070 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:35 crc kubenswrapper[4786]: I0313 15:04:35.966094 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:35Z","lastTransitionTime":"2026-03-13T15:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.069488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.069574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.069593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.069617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.069636 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.171803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.171909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.171935 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.171964 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.171981 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.274434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.274503 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.274526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.274555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.274574 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.377539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.377616 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.377642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.377671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.377693 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.481275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.481392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.481418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.481450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.481471 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.552072 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:36 crc kubenswrapper[4786]: E0313 15:04:36.552259 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.584566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.584627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.584644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.584667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.584685 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.687544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.687591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.687603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.687620 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.687632 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.790286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.790352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.790371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.790396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.790416 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.892894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.892952 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.892971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.892994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.893014 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.996460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.996527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.996544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.996568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:36 crc kubenswrapper[4786]: I0313 15:04:36.996587 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:36Z","lastTransitionTime":"2026-03-13T15:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.099460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.099513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.099529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.099552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.099569 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.202205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.202259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.202277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.202298 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.202317 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.305588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.305654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.305671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.305698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.305716 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.408952 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.409082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.409101 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.409126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.409143 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.512358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.512413 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.512429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.512454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.512471 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.551420 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.551471 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:37 crc kubenswrapper[4786]: E0313 15:04:37.551632 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.551686 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:37 crc kubenswrapper[4786]: E0313 15:04:37.551782 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:37 crc kubenswrapper[4786]: E0313 15:04:37.551976 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.615614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.615680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.615701 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.615729 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.615747 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.718349 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.718430 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.718456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.718485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.718508 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.821623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.821698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.821720 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.821750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.821774 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.925715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.925764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.925782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.925807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:37 crc kubenswrapper[4786]: I0313 15:04:37.925825 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:37Z","lastTransitionTime":"2026-03-13T15:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.029412 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.029466 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.029484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.029509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.029526 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.132846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.132941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.132959 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.132981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.132997 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.236393 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.236460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.236481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.236506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.236525 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.339485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.339544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.339562 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.339588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.339605 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.442248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.442334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.442354 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.442379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.442398 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.545640 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.545674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.545682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.545695 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.545705 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.552205 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:38 crc kubenswrapper[4786]: E0313 15:04:38.552419 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.649223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.649262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.649272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.649286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.649297 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.752140 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.752209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.752234 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.752263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.752287 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.855092 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.855153 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.855175 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.855204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.855226 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.959347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.959412 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.959434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.959463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:38 crc kubenswrapper[4786]: I0313 15:04:38.959485 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:38Z","lastTransitionTime":"2026-03-13T15:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.063524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.063584 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.063601 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.063624 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.063641 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.166458 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.166523 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.166546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.166575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.166600 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.269384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.269465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.269491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.269519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.269540 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.372881 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.372950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.372975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.373004 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.373027 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.423344 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.423492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.423521 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.423544 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423650 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:05:11.423611572 +0000 UTC m=+141.586823433 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423659 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423746 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:05:11.423725834 +0000 UTC m=+141.586937805 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423743 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423786 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423807 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.423916 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:05:11.423893129 +0000 UTC m=+141.587104990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.424464 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.424556 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:05:11.424537154 +0000 UTC m=+141.587749005 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.476299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.476344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.476361 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.476383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.476399 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.524995 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.525197 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.525231 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.525250 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.525329 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:05:11.525308212 +0000 UTC m=+141.688520053 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.551658 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.551780 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.552103 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.552360 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.552482 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:39 crc kubenswrapper[4786]: E0313 15:04:39.552619 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.580212 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.580285 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.580305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.580332 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.580354 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.683674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.683721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.683738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.683760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.683777 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.786691 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.786819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.786841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.786909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.786935 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.890502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.890556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.890572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.890595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.890612 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.993534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.993586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.993602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.993634 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:39 crc kubenswrapper[4786]: I0313 15:04:39.994552 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:39Z","lastTransitionTime":"2026-03-13T15:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.097903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.097951 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.097962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.097980 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.097992 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.200959 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.201021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.201037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.201060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.201074 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.303911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.303955 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.303965 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.303980 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.303991 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.406196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.406262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.406280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.406299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.406311 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.509569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.509629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.509643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.509664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.509679 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.552034 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.552400 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.580825 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.580907 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.580925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.580945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.580962 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.583312 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.601645 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.606058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.606128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.606148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.606173 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.606191 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.615003 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.630263 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.632281 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.635501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.635577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.635606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc 
kubenswrapper[4786]: I0313 15:04:40.635639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.635669 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.651715 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.656443 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.660922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.660975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.660993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.661017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.661035 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.674064 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.680195 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.685428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.685492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.685515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.685544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.685565 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.690322 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.707494 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: E0313 15:04:40.707904 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.709748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.709803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.709821 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.709847 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.709900 4786 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.710578 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":
\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.729212 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.745164 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.758134 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc 
kubenswrapper[4786]: I0313 15:04:40.775033 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.790567 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.807213 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.814273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.814334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.814348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.814373 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.814390 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.827735 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.850148 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.864584 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.880964 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:40Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.916747 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.916777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.916785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.916797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:40 crc kubenswrapper[4786]: I0313 15:04:40.916806 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:40Z","lastTransitionTime":"2026-03-13T15:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.020167 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.020253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.020278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.020309 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.020331 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.123983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.124045 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.124062 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.124086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.124105 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.227088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.227168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.227192 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.227226 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.227249 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.330396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.330430 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.330442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.330458 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.330469 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.433160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.433245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.433265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.433296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.433317 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.537120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.537177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.537198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.537225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.537245 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.552034 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.552127 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.552159 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:41 crc kubenswrapper[4786]: E0313 15:04:41.552271 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:41 crc kubenswrapper[4786]: E0313 15:04:41.552375 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:41 crc kubenswrapper[4786]: E0313 15:04:41.552481 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.641707 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.641781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.641803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.641834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.641910 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.745029 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.745076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.745095 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.745118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.745137 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.847712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.847782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.847802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.847826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.847842 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.950752 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.950808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.950829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.950888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:41 crc kubenswrapper[4786]: I0313 15:04:41.950913 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:41Z","lastTransitionTime":"2026-03-13T15:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.054075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.054128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.054146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.054171 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.054189 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.156818 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.156909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.156934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.156958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.156980 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.259805 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.259923 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.259949 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.259979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.259999 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.355972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:42 crc kubenswrapper[4786]: E0313 15:04:42.356153 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:42 crc kubenswrapper[4786]: E0313 15:04:42.356290 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:05:14.356267088 +0000 UTC m=+144.519478929 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.362913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.362993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.363013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.363040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.363058 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.466148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.466213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.466235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.466266 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.466287 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.552015 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:42 crc kubenswrapper[4786]: E0313 15:04:42.552214 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.568395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.568448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.568465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.568486 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.568502 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.671000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.671038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.671047 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.671060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.671069 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.773986 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.774041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.774058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.774079 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.774096 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.877295 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.877352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.877365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.877381 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.877393 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.980138 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.980216 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.980233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.980260 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:42 crc kubenswrapper[4786]: I0313 15:04:42.980283 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:42Z","lastTransitionTime":"2026-03-13T15:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.082530 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.082564 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.082574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.082586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.082595 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.185044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.185096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.185109 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.185130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.185142 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.288346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.288391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.288406 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.288425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.288437 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.392063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.392112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.392122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.392137 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.392148 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.494311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.494357 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.494371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.494391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.494404 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.552040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.552055 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:43 crc kubenswrapper[4786]: E0313 15:04:43.552630 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.552090 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:43 crc kubenswrapper[4786]: E0313 15:04:43.552783 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.553003 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29" Mar 13 15:04:43 crc kubenswrapper[4786]: E0313 15:04:43.552812 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.598102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.598381 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.598398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.598420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.598436 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.701213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.701278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.701298 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.701333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.701353 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.805489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.805566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.805587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.805616 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.805635 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.908563 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.908637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.908659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.908683 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:43 crc kubenswrapper[4786]: I0313 15:04:43.908702 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:43Z","lastTransitionTime":"2026-03-13T15:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.011399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.011434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.011446 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.011461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.011472 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.113925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.113996 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.114014 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.114036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.114052 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.169632 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.172013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.172573 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.196567 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a
000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd36
7c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.206655 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.216104 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.216151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.216170 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.216191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.216210 4786 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.218659 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.233413 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.255349 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.275074 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.291064 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.318389 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.318439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.318448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.318462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.318471 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.320100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.341443 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.355105 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.370917 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.386550 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.401971 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc 
kubenswrapper[4786]: I0313 15:04:44.416942 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.420761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.420822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.420839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.420890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.420915 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.433711 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.447569 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.465973 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:44Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.523852 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.523930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.523948 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.523971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.523988 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.551760 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:44 crc kubenswrapper[4786]: E0313 15:04:44.551890 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.626498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.626545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.626558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.626576 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.626589 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.729551 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.729623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.729646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.729674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.729697 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.832358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.832400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.832417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.832439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.832455 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.935112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.935155 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.935166 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.935184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:44 crc kubenswrapper[4786]: I0313 15:04:44.935198 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:44Z","lastTransitionTime":"2026-03-13T15:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.038584 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.038627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.038638 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.038660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.038672 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.142149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.142223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.142242 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.142271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.142290 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.245125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.245172 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.245185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.245203 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.245216 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.347637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.347718 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.347733 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.347762 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.347777 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.450515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.450578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.450595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.450618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.450635 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.551636 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.551700 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:45 crc kubenswrapper[4786]: E0313 15:04:45.551745 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.551642 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:45 crc kubenswrapper[4786]: E0313 15:04:45.551838 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:45 crc kubenswrapper[4786]: E0313 15:04:45.552039 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.553332 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.553397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.553422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.553452 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.553476 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.656676 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.656757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.656782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.656813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.656835 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.760390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.760441 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.760460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.760485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.760502 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.863144 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.863210 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.863227 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.863255 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.863275 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.967149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.967212 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.967230 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.967254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:45 crc kubenswrapper[4786]: I0313 15:04:45.967271 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:45Z","lastTransitionTime":"2026-03-13T15:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.070112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.070175 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.070191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.070217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.070235 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.173496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.173557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.173605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.173632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.173654 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.276730 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.276771 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.276781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.276797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.276807 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.378736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.378767 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.378776 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.378788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.378798 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.480349 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.480406 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.480422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.480444 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.480462 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.551548 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:46 crc kubenswrapper[4786]: E0313 15:04:46.552160 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.584150 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.584222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.584244 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.584270 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.584289 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.686668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.686735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.686754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.686776 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.686791 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.789595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.789651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.789667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.789689 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.789707 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.893123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.893920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.894063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.894155 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.894239 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.997534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.998027 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.998117 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.998194 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:46 crc kubenswrapper[4786]: I0313 15:04:46.998267 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:46Z","lastTransitionTime":"2026-03-13T15:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.101445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.101498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.101514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.101538 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.101557 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.204490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.204558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.204580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.204606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.204626 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.307339 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.307462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.307483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.307506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.307523 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.410901 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.410978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.410997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.411021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.411038 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.515017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.515064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.515081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.515103 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.515119 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.550997 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.551025 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.551092 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:47 crc kubenswrapper[4786]: E0313 15:04:47.551252 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:47 crc kubenswrapper[4786]: E0313 15:04:47.551579 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:47 crc kubenswrapper[4786]: E0313 15:04:47.551824 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.617788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.617852 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.617922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.617945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.617963 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.721725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.721789 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.721805 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.721829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.721850 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.824784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.824887 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.824944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.824981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.825083 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.927967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.928013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.928029 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.928049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:47 crc kubenswrapper[4786]: I0313 15:04:47.928063 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:47Z","lastTransitionTime":"2026-03-13T15:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.030680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.030735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.030752 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.030777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.030795 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.134343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.134427 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.134453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.134483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.134525 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.237981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.238060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.238080 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.238117 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.238151 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.341942 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.342012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.342034 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.342065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.342089 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.445775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.445840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.445873 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.445903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.445920 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.548772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.548826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.548838 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.548892 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.548907 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.551099 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:48 crc kubenswrapper[4786]: E0313 15:04:48.551189 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.552998 4786 scope.go:117] "RemoveContainer" containerID="0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592" Mar 13 15:04:48 crc kubenswrapper[4786]: E0313 15:04:48.553358 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.651669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.651712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.651723 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.651739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.651752 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.754180 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.754259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.754282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.754311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.754334 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.857743 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.857807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.857829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.857893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.857920 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.962144 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.962239 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.962267 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.962301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:48 crc kubenswrapper[4786]: I0313 15:04:48.962336 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:48Z","lastTransitionTime":"2026-03-13T15:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.065192 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.065259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.065283 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.065311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.065334 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.168401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.168474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.168496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.168524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.168548 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.271607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.271666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.271683 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.271709 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.271728 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.374950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.375009 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.375026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.375048 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.375065 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.478049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.478117 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.478134 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.478159 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.478178 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.551026 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.551043 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.551209 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:49 crc kubenswrapper[4786]: E0313 15:04:49.551457 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:49 crc kubenswrapper[4786]: E0313 15:04:49.551555 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:49 crc kubenswrapper[4786]: E0313 15:04:49.551673 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.581577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.581649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.581660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.581676 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.581686 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.684241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.684282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.684294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.684311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.684321 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.786774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.786822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.786835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.786878 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.786897 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.889474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.889513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.889525 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.889541 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.889551 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.992662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.992747 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.992762 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.992777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:49 crc kubenswrapper[4786]: I0313 15:04:49.992787 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:49Z","lastTransitionTime":"2026-03-13T15:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.095198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.095249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.095265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.095286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.095302 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.197621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.197680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.197698 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.197739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.197757 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.301018 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.301065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.301076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.301107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.301121 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.403797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.403940 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.403960 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.403987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.404005 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.504245 4786 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.550998 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.551163 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.567994 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.582143 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.598313 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.613920 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.627105 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.633945 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.640360 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.666692 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.688822 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.700054 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.715496 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.735324 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.745298 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.745356 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc 
kubenswrapper[4786]: I0313 15:04:50.745367 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.745403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.745415 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.748849 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.757161 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.761921 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.762341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.762370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.762378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.762392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.762403 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.776278 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.776603 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.780420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.780454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.780463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.780503 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.780517 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.791312 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d9805cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.791426 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.794764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.794803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.794811 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.794826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.794837 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.803314 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc 
kubenswrapper[4786]: E0313 15:04:50.807321 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.811630 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.811668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.811677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.811708 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.811723 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:04:50Z","lastTransitionTime":"2026-03-13T15:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:04:50 crc kubenswrapper[4786]: I0313 15:04:50.817623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.824943 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:50Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:50 crc kubenswrapper[4786]: E0313 15:04:50.825088 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:04:51 crc kubenswrapper[4786]: I0313 15:04:51.551833 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:51 crc kubenswrapper[4786]: I0313 15:04:51.551953 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:51 crc kubenswrapper[4786]: I0313 15:04:51.552078 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:51 crc kubenswrapper[4786]: E0313 15:04:51.552464 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:51 crc kubenswrapper[4786]: E0313 15:04:51.552561 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:51 crc kubenswrapper[4786]: E0313 15:04:51.552611 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:52 crc kubenswrapper[4786]: I0313 15:04:52.551578 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:52 crc kubenswrapper[4786]: E0313 15:04:52.551811 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:53 crc kubenswrapper[4786]: I0313 15:04:53.552098 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:53 crc kubenswrapper[4786]: I0313 15:04:53.552169 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:53 crc kubenswrapper[4786]: E0313 15:04:53.552367 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:53 crc kubenswrapper[4786]: I0313 15:04:53.552415 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:53 crc kubenswrapper[4786]: E0313 15:04:53.552615 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:53 crc kubenswrapper[4786]: E0313 15:04:53.552827 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:54 crc kubenswrapper[4786]: I0313 15:04:54.551670 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:54 crc kubenswrapper[4786]: E0313 15:04:54.551893 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.221135 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/0.log" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.221194 4786 generic.go:334] "Generic (PLEG): container finished" podID="930a5a92-be71-4866-aa6f-95a98647bc33" containerID="4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084" exitCode=1 Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.221224 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerDied","Data":"4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084"} Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.221616 4786 scope.go:117] "RemoveContainer" containerID="4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.240186 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.251449 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.264337 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.277002 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc 
kubenswrapper[4786]: I0313 15:04:55.294344 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.307185 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.321044 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.336524 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e218
7bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.354303 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.365374 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.376754 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.400468 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.420676 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.437635 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.449326 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.463726 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.474144 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:55Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.551663 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.551697 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:55 crc kubenswrapper[4786]: I0313 15:04:55.551761 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:55 crc kubenswrapper[4786]: E0313 15:04:55.551783 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:55 crc kubenswrapper[4786]: E0313 15:04:55.551978 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:55 crc kubenswrapper[4786]: E0313 15:04:55.552065 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:55 crc kubenswrapper[4786]: E0313 15:04:55.635194 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.228787 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/0.log" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.228931 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerStarted","Data":"de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383"} Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.253361 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.268457 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.288832 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.308173 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.327998 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.350066 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.360308 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.369846 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.390848 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.407643 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc 
kubenswrapper[4786]: I0313 15:04:56.455271 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.474614 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.493099 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.508272 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.525096 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.536395 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.547041 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.551949 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:56 crc kubenswrapper[4786]: E0313 15:04:56.552051 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.573058 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.588426 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.599117 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.614822 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.629979 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.641900 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.654810 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.667217 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.678541 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.690629 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc 
kubenswrapper[4786]: I0313 15:04:56.702573 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.711493 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.722309 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.735312 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.756521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.769068 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.780594 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:56 crc kubenswrapper[4786]: I0313 15:04:56.798665 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:56Z is after 2025-08-24T17:21:41Z" Mar 13 15:04:57 crc kubenswrapper[4786]: I0313 15:04:57.551178 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:57 crc kubenswrapper[4786]: I0313 15:04:57.551242 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:57 crc kubenswrapper[4786]: I0313 15:04:57.551183 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:57 crc kubenswrapper[4786]: E0313 15:04:57.551390 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:57 crc kubenswrapper[4786]: E0313 15:04:57.551488 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:57 crc kubenswrapper[4786]: E0313 15:04:57.551573 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:04:58 crc kubenswrapper[4786]: I0313 15:04:58.551162 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:04:58 crc kubenswrapper[4786]: E0313 15:04:58.551340 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:04:59 crc kubenswrapper[4786]: I0313 15:04:59.551960 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:04:59 crc kubenswrapper[4786]: I0313 15:04:59.551995 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:04:59 crc kubenswrapper[4786]: I0313 15:04:59.552104 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:04:59 crc kubenswrapper[4786]: E0313 15:04:59.552169 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:04:59 crc kubenswrapper[4786]: E0313 15:04:59.552304 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:04:59 crc kubenswrapper[4786]: E0313 15:04:59.552450 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.551075 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:00 crc kubenswrapper[4786]: E0313 15:05:00.551250 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.572285 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.586711 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.602738 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.623377 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: E0313 15:05:00.636741 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.645603 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.662882 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,
\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostro
ot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.678133 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.695160 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.709763 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc 
kubenswrapper[4786]: I0313 15:05:00.720322 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.730740 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.745005 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.757928 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.789416 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.804351 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.817952 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:00 crc kubenswrapper[4786]: I0313 15:05:00.837447 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:00Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.192804 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.192932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.192962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.192991 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.193015 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:01Z","lastTransitionTime":"2026-03-13T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.209809 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:01Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.213817 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.213941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.213958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.213981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.213997 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:01Z","lastTransitionTime":"2026-03-13T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.227623 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:01Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.232624 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.232655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.232664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.232678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.232687 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:01Z","lastTransitionTime":"2026-03-13T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.252154 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:01Z is after 2025-08-24T17:21:41Z"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.255623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.255651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.255660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.255674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.255684 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:01Z","lastTransitionTime":"2026-03-13T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.271909 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:01Z is after 2025-08-24T17:21:41Z"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.275467 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.275514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.275526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.275546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.275560 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:01Z","lastTransitionTime":"2026-03-13T15:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.290989 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:01Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.291130 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.551662 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.551694 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.552080 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:01 crc kubenswrapper[4786]: I0313 15:05:01.551688 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.551852 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:01 crc kubenswrapper[4786]: E0313 15:05:01.552220 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:02 crc kubenswrapper[4786]: I0313 15:05:02.551802 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:02 crc kubenswrapper[4786]: E0313 15:05:02.552079 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:02 crc kubenswrapper[4786]: I0313 15:05:02.568630 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 15:05:03 crc kubenswrapper[4786]: I0313 15:05:03.551471 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:03 crc kubenswrapper[4786]: I0313 15:05:03.551557 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:03 crc kubenswrapper[4786]: I0313 15:05:03.551602 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:03 crc kubenswrapper[4786]: E0313 15:05:03.551700 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:03 crc kubenswrapper[4786]: E0313 15:05:03.551806 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:03 crc kubenswrapper[4786]: E0313 15:05:03.552389 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:03 crc kubenswrapper[4786]: I0313 15:05:03.552811 4786 scope.go:117] "RemoveContainer" containerID="0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.255991 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/2.log" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.258520 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497"} Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.258986 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.271318 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.282986 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.300018 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.317507 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.335319 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.349256 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.365172 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc 
kubenswrapper[4786]: I0313 15:05:04.392930 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9d4a544-8099-42dc-96da-d731319b2200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca83c02c9c5b9d5492e1d45595c615398d17482f64c624ddfbcb42ca5c0f703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d58f4ad322fc52a0f282330b987459649fa571251a8dcbfceaad4346d4635\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9969ac4ffe3ab34121b0c9826d14df87bd9af2a61adaa8f384c60628f27e32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.405356 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.416833 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.432100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.450026 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.471663 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.486410 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.498462 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.517682 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built 
ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.536295 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.546512 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:04Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:04 crc kubenswrapper[4786]: I0313 15:05:04.551900 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:04 crc kubenswrapper[4786]: E0313 15:05:04.552034 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.263165 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/3.log" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.263852 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/2.log" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.266982 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497" exitCode=1 Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.267034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497"} Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.267084 4786 scope.go:117] "RemoveContainer" containerID="0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.267787 4786 scope.go:117] "RemoveContainer" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497" Mar 13 15:05:05 crc kubenswrapper[4786]: E0313 15:05:05.268033 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:05:05 crc 
kubenswrapper[4786]: I0313 15:05:05.284973 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.296623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.308211 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc 
kubenswrapper[4786]: I0313 15:05:05.330504 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.342881 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.354136 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.365248 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.379679 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d
5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.389897 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9d4a544-8099-42dc-96da-d731319b2200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca83c02c9c5b9d5492e1d45595c615398d17482f64c624ddfbcb42ca5c0f703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d58f4ad322fc52a0f282330b987459649fa571251a8dcbfceaad4346d4635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9969ac4ffe3ab34121b0c9826d14df87bd9af2a61adaa8f384c60628f27e32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60
ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.397653 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.405351 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.416050 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.432299 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.442100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.451583 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.467272 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0429892767f47374427361835a32ff5b48542b1ff270f3c3a3f3aee47e6f6592\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:33Z\\\",\\\"message\\\":\\\"nt:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 15:04:33.492295 6935 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI0313 15:04:33.492398 6935 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.776003ms\\\\nF0313 15:04:33.492418 6935 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:04:33Z is after 2025-08-24T17:21:41Z]\\\\nI0313 15:04:33.492234 6935 services_controller.go:451] Built ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:05:04Z\\\",\\\"message\\\":\\\"c openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0313 15:05:04.586466 7284 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0313 15:05:04.586480 7284 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586490 7284 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586499 7284 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0313 15:05:04.586505 7284 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0313 15:05:04.586511 7284 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586527 7284 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 15:05:04.586577 7284 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initiali\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5
ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.479287 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.488550 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:05Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.551441 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:05 crc kubenswrapper[4786]: E0313 15:05:05.551626 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.551983 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:05 crc kubenswrapper[4786]: I0313 15:05:05.552035 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:05 crc kubenswrapper[4786]: E0313 15:05:05.552120 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:05 crc kubenswrapper[4786]: E0313 15:05:05.552158 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:05 crc kubenswrapper[4786]: E0313 15:05:05.638192 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.271523 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/3.log" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.274837 4786 scope.go:117] "RemoveContainer" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497" Mar 13 15:05:06 crc kubenswrapper[4786]: E0313 15:05:06.275027 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.291258 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.303461 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.318341 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.332608 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.349387 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.366726 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.383550 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.400943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.416924 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc 
kubenswrapper[4786]: I0313 15:05:06.433908 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9d4a544-8099-42dc-96da-d731319b2200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca83c02c9c5b9d5492e1d45595c615398d17482f64c624ddfbcb42ca5c0f703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d58f4ad322fc52a0f282330b987459649fa571251a8dcbfceaad4346d4635\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9969ac4ffe3ab34121b0c9826d14df87bd9af2a61adaa8f384c60628f27e32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.446031 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.459005 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.476235 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.493566 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.521983 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.542206 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.552204 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:06 crc kubenswrapper[4786]: E0313 15:05:06.552405 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.560943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:06 crc kubenswrapper[4786]: I0313 15:05:06.582498 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:05:04Z\\\",\\\"message\\\":\\\"c openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0313 15:05:04.586466 7284 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0313 15:05:04.586480 7284 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586490 7284 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586499 7284 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0313 15:05:04.586505 7284 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0313 15:05:04.586511 7284 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586527 7284 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 15:05:04.586577 7284 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initiali\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:05:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:06Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:07 crc kubenswrapper[4786]: I0313 15:05:07.551030 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:07 crc kubenswrapper[4786]: I0313 15:05:07.551115 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:07 crc kubenswrapper[4786]: E0313 15:05:07.551141 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:07 crc kubenswrapper[4786]: I0313 15:05:07.551297 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:07 crc kubenswrapper[4786]: E0313 15:05:07.551510 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:07 crc kubenswrapper[4786]: E0313 15:05:07.551846 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:08 crc kubenswrapper[4786]: I0313 15:05:08.568670 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:08 crc kubenswrapper[4786]: E0313 15:05:08.568913 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:09 crc kubenswrapper[4786]: I0313 15:05:09.551942 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:09 crc kubenswrapper[4786]: I0313 15:05:09.552032 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:09 crc kubenswrapper[4786]: I0313 15:05:09.552079 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:09 crc kubenswrapper[4786]: E0313 15:05:09.552253 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:09 crc kubenswrapper[4786]: E0313 15:05:09.552359 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:09 crc kubenswrapper[4786]: E0313 15:05:09.552522 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.551488 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:10 crc kubenswrapper[4786]: E0313 15:05:10.551813 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.579631 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.597149 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.624311 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: E0313 15:05:10.638975 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.655088 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:05:04Z\\\",\\\"message\\\":\\\"c openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0313 15:05:04.586466 7284 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0313 15:05:04.586480 7284 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586490 7284 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586499 7284 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0313 15:05:04.586505 7284 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0313 15:05:04.586511 7284 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586527 7284 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 15:05:04.586577 7284 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initiali\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:05:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.671661 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.682153 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.692465 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc 
kubenswrapper[4786]: I0313 15:05:10.705148 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.718733 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.734987 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.749931 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.760058 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.772561 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.787583 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9d4a544-8099-42dc-96da-d731319b2200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca83c02c9c5b9d5492e1d45595c615398d17482f64c624ddfbcb42ca5c0f703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d58f4ad322fc52a0f282330b987459649fa571251a8dcbfceaad4346d4635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9969ac4ffe3ab34121b0c9826d14df87bd9af2a61adaa8f384c60628f27e32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.799985 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.810671 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.826982 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:10 crc kubenswrapper[4786]: I0313 15:05:10.844214 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:10Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.309556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.309617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.309635 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.309659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.309677 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:11Z","lastTransitionTime":"2026-03-13T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.328727 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.334241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.334283 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.334301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.334324 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.334339 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:11Z","lastTransitionTime":"2026-03-13T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.345759 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.350671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.350705 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.350714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.350729 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.350738 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:11Z","lastTransitionTime":"2026-03-13T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.365294 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.369674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.369725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.369744 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.369769 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.369786 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:11Z","lastTransitionTime":"2026-03-13T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.388989 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.395016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.395118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.395149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.395176 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.395319 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:11Z","lastTransitionTime":"2026-03-13T15:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.413209 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:11Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.413422 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.497080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.497194 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.497218 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.497236 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497283 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:15.497240001 +0000 UTC m=+205.660451852 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497368 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497466 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:06:15.497445285 +0000 UTC m=+205.660657166 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497381 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497676 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497689 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497721 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 15:06:15.497712451 +0000 UTC m=+205.660924482 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497302 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.497842 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 15:06:15.497834714 +0000 UTC m=+205.661046645 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.551600 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.551645 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.551680 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.551770 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.551849 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.551926 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:11 crc kubenswrapper[4786]: I0313 15:05:11.598422 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.598599 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.598617 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.598629 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:05:11 crc kubenswrapper[4786]: E0313 15:05:11.598680 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 15:06:15.598663835 +0000 UTC m=+205.761875656 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 15:05:12 crc kubenswrapper[4786]: I0313 15:05:12.551166 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:12 crc kubenswrapper[4786]: E0313 15:05:12.551275 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:13 crc kubenswrapper[4786]: I0313 15:05:13.551665 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:13 crc kubenswrapper[4786]: I0313 15:05:13.551665 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:13 crc kubenswrapper[4786]: I0313 15:05:13.551761 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:13 crc kubenswrapper[4786]: E0313 15:05:13.552964 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:13 crc kubenswrapper[4786]: E0313 15:05:13.553009 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:13 crc kubenswrapper[4786]: E0313 15:05:13.553286 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:14 crc kubenswrapper[4786]: I0313 15:05:14.429594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:14 crc kubenswrapper[4786]: E0313 15:05:14.429916 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:05:14 crc kubenswrapper[4786]: E0313 15:05:14.429996 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs podName:2ded4bfa-6d71-4c0f-982d-aee3c61c5612 nodeName:}" failed. No retries permitted until 2026-03-13 15:06:18.429975044 +0000 UTC m=+208.593186895 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs") pod "network-metrics-daemon-2v688" (UID: "2ded4bfa-6d71-4c0f-982d-aee3c61c5612") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 15:05:14 crc kubenswrapper[4786]: I0313 15:05:14.552082 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:14 crc kubenswrapper[4786]: E0313 15:05:14.553045 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:15 crc kubenswrapper[4786]: I0313 15:05:15.551108 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:15 crc kubenswrapper[4786]: I0313 15:05:15.551146 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:15 crc kubenswrapper[4786]: E0313 15:05:15.551242 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:15 crc kubenswrapper[4786]: I0313 15:05:15.551382 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:15 crc kubenswrapper[4786]: E0313 15:05:15.551622 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:15 crc kubenswrapper[4786]: E0313 15:05:15.551938 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:15 crc kubenswrapper[4786]: E0313 15:05:15.640934 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:16 crc kubenswrapper[4786]: I0313 15:05:16.551806 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:16 crc kubenswrapper[4786]: E0313 15:05:16.552320 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:17 crc kubenswrapper[4786]: I0313 15:05:17.551402 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:17 crc kubenswrapper[4786]: I0313 15:05:17.551494 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:17 crc kubenswrapper[4786]: E0313 15:05:17.551774 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:17 crc kubenswrapper[4786]: I0313 15:05:17.552155 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:17 crc kubenswrapper[4786]: E0313 15:05:17.552214 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:17 crc kubenswrapper[4786]: E0313 15:05:17.552442 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:17 crc kubenswrapper[4786]: I0313 15:05:17.552707 4786 scope.go:117] "RemoveContainer" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497" Mar 13 15:05:17 crc kubenswrapper[4786]: E0313 15:05:17.552998 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:05:18 crc kubenswrapper[4786]: I0313 15:05:18.551969 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:18 crc kubenswrapper[4786]: E0313 15:05:18.552190 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:19 crc kubenswrapper[4786]: I0313 15:05:19.551002 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:19 crc kubenswrapper[4786]: I0313 15:05:19.551058 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:19 crc kubenswrapper[4786]: E0313 15:05:19.551213 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:19 crc kubenswrapper[4786]: I0313 15:05:19.551333 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:19 crc kubenswrapper[4786]: E0313 15:05:19.551467 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:19 crc kubenswrapper[4786]: E0313 15:05:19.551579 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.551795 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:20 crc kubenswrapper[4786]: E0313 15:05:20.552012 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.584022 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe9874c-2240-4784-a0ea-2bf4414ea40e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ea1d859a367e393fd15b7b026f9d4347b647c05937c68a45249904062c744d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7a27ee5730b36176476ced7949e70f5105d31b708591718d3312f9659f4dbf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dea930a65f23a671f8e1b4fe01d7d473f39d1a4e71fada65a37c471c2fe15871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://adced05a051d58e3727d2bd1b32f554199e4521e7fc7dffdfc362bdbfe02bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac697f20f1746e23d1b04ff9d8155478cef5ac180f3448d48a2520470649956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39af78f2f11f656c56f8f2f9c9da0b210ed3642b684530a49f0cfc65328671b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f3b966aaa356c1a5b20e1bd5f5f769a8b77a32f5b2d1f5b83f524d0e61ffd8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88f423df7f30b17d85e83e2999d5c423b96c79859e4e92346bc5846e77c42fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-13T15:02:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.606154 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.628427 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: E0313 15:05:20.642018 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.665716 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:05:04Z\\\",\\\"message\\\":\\\"c openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0313 15:05:04.586466 7284 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0313 15:05:04.586480 7284 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586490 7284 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586499 7284 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0313 15:05:04.586505 7284 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0313 15:05:04.586511 7284 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0313 15:05:04.586527 7284 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 15:05:04.586577 7284 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initiali\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:05:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b1376f8db3f0c33bd
ff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7b6g9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.682325 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f2e4a0c-c8c1-449c-baed-b06c9c647246\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T15:04:02Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 15:04:02.151835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 15:04:02.152011 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 15:04:02.152686 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2798670859/tls.crt::/tmp/serving-cert-2798670859/tls.key\\\\\\\"\\\\nI0313 15:04:02.381487 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 15:04:02.385643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 15:04:02.385665 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 15:04:02.385686 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 15:04:02.385691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 15:04:02.390579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 15:04:02.390693 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390750 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 15:04:02.390801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 15:04:02.390887 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0313 15:04:02.390618 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 15:04:02.390954 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 15:04:02.391052 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 15:04:02.391552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://392c215d57bbc3ea8ed84f1f10d8640f40a
40aafb0b10b6d7e4b9f0db8795aa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.695238 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wp7vg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ab1ab68-cf57-443b-aa31-ac336c1d86ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://701a50e052464c54e9d73e217446cff3109a6ede4a3637b6b3b362814ae56c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-282hv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wp7vg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.711487 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5b7ebfe35d8b0538aa3c60cde5ef66a819a4a2a86dd2540ef914119b2fd4cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.729776 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.743178 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1266b50f167a12006589d2a163ad40bbd0d6a3c6ea7027bb7c062f2f166fdbc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.762409 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mvpcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"930a5a92-be71-4866-aa6f-95a98647bc33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T15:04:54Z\\\",\\\"message\\\":\\\"2026-03-13T15:04:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3\\\\n2026-03-13T15:04:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1b833de5-14b8-41c2-bdb4-8779d46f0ce3 to /host/opt/cni/bin/\\\\n2026-03-13T15:04:09Z [verbose] multus-daemon started\\\\n2026-03-13T15:04:09Z [verbose] Readiness Indicator file check\\\\n2026-03-13T15:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trc5m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mvpcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.775968 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b929603-1f9d-4b41-9bf8-528d7fd4ad56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d12b11b0d266e6fee512f1c57f0060d18904df0052b4ba0632a29122e88da540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256
b30ebd3d7c9b5254811af513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c44x4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqb49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.790294 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db8c811b-a36c-4923-8b13-47f48d9ba696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e314ad3ed67f9fa3cb73624135db97dce384918f65de58ec494960e5a90949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baec6daa94b3a627468bb88d251ddaf55d98
05cf2f2d04d0058ea91117e1cdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-brqx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-74m8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.839291 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v688" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmqjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v688\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc 
kubenswrapper[4786]: I0313 15:05:20.851366 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9d4a544-8099-42dc-96da-d731319b2200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ca83c02c9c5b9d5492e1d45595c615398d17482f64c624ddfbcb42ca5c0f703\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d58f4ad322fc52a0f282330b987459649fa571251a8dcbfceaad4346d4635\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9969ac4ffe3ab34121b0c9826d14df87bd9af2a61adaa8f384c60628f27e32a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d24acf373322ed57d5d52d60ba1a2fc5e9d1501e48d5df902a97d83b5d89aed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.864713 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ba669c-5e47-4778-a599-ffdfc81189ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7875611da5b28e109605cdf6405383d5f9b7d46bf186fde93df6afaddc382bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:02:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c4279cda7eee267f24cf3bf566ffe83bc913ce419e45c49740772bf0382cc09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:02:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:02:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.877896 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-99jzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5037a43-3be3-4ba3-bed7-1ef82690e33e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df742c6b34bc4c560368c3cb8ac73fd59462b232336833328c83b7c82112fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dzsnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-99jzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.896088 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9952fb982f3a25d9a5087bb90c66b8c4260fd7d8d1c6a5b043107ff293ef92e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8cd9bbc59e2187bb988eecf3d37d39ada163e606e87225d3ffc96ae399c3cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:20 crc kubenswrapper[4786]: I0313 15:05:20.910671 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pstw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5e37b9c-1965-4321-a9fc-6babbd05c395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T15:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcccde4eb846a000317d0f972f4c37e360f94b126313f1cf45574ea6f007448e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T15:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d10d349fe02f27757394f77c50bd70ff9478fb6608c3bc07caac16ff29c2817\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c5c849022a0559ec63ba8fe5d406c3077bfee6b9e4eb2dc9d29aff25d97d063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:10Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5f864d392d56e5565594d947976b6dbe790f1a256591298f92f20e847f1aa3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a47d5826bc82da8fad2b85d73b883e4b2c92551021d4f897741643408e85d87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14b3262837e7d03006d3fc5a5a3b4aa12bae934936d77cf997753c75d062a298\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6510b6f306c2f0e6307e03ecd6a744da0b177afdff9fd1bafe7cec32211adc35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T15:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T15:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gldhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T15:04:07Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pstw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:20Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.551311 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.551538 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.551690 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.551535 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.551985 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.552092 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.730911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.731185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.731194 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.731207 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.731215 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:21Z","lastTransitionTime":"2026-03-13T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.744833 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.747838 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.747890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.747903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.747919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.747931 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:21Z","lastTransitionTime":"2026-03-13T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.759073 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.762102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.762142 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.762152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.762166 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.762176 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:21Z","lastTransitionTime":"2026-03-13T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.775269 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.778591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.778614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.778625 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.778639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.778652 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:21Z","lastTransitionTime":"2026-03-13T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.791454 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.795223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.795294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.795319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.795347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:21 crc kubenswrapper[4786]: I0313 15:05:21.795368 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:21Z","lastTransitionTime":"2026-03-13T15:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.814141 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T15:05:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"69102ab2-c57d-44ef-8cae-daa07cf79399\\\",\\\"systemUUID\\\":\\\"f3ae1b88-a7c2-4500-a284-224d87cf19ab\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T15:05:21Z is after 2025-08-24T17:21:41Z" Mar 13 15:05:21 crc kubenswrapper[4786]: E0313 15:05:21.814270 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:05:22 crc kubenswrapper[4786]: I0313 15:05:22.551151 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:22 crc kubenswrapper[4786]: E0313 15:05:22.551365 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:23 crc kubenswrapper[4786]: I0313 15:05:23.550980 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:23 crc kubenswrapper[4786]: I0313 15:05:23.551036 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:23 crc kubenswrapper[4786]: E0313 15:05:23.551238 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:23 crc kubenswrapper[4786]: E0313 15:05:23.551496 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:23 crc kubenswrapper[4786]: I0313 15:05:23.551653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:23 crc kubenswrapper[4786]: E0313 15:05:23.551761 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:23 crc kubenswrapper[4786]: I0313 15:05:23.560952 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 15:05:24 crc kubenswrapper[4786]: I0313 15:05:24.551400 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:24 crc kubenswrapper[4786]: E0313 15:05:24.551797 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:25 crc kubenswrapper[4786]: I0313 15:05:25.551543 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:25 crc kubenswrapper[4786]: E0313 15:05:25.551718 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:25 crc kubenswrapper[4786]: I0313 15:05:25.552089 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:25 crc kubenswrapper[4786]: I0313 15:05:25.552102 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:25 crc kubenswrapper[4786]: E0313 15:05:25.552241 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:25 crc kubenswrapper[4786]: E0313 15:05:25.552360 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:25 crc kubenswrapper[4786]: E0313 15:05:25.644179 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:26 crc kubenswrapper[4786]: I0313 15:05:26.552203 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:26 crc kubenswrapper[4786]: E0313 15:05:26.552506 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:27 crc kubenswrapper[4786]: I0313 15:05:27.551096 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:27 crc kubenswrapper[4786]: I0313 15:05:27.551163 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:27 crc kubenswrapper[4786]: I0313 15:05:27.551096 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:27 crc kubenswrapper[4786]: E0313 15:05:27.551321 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:27 crc kubenswrapper[4786]: E0313 15:05:27.551415 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:27 crc kubenswrapper[4786]: E0313 15:05:27.551557 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:28 crc kubenswrapper[4786]: I0313 15:05:28.551631 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:28 crc kubenswrapper[4786]: E0313 15:05:28.551813 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:29 crc kubenswrapper[4786]: I0313 15:05:29.551108 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:29 crc kubenswrapper[4786]: E0313 15:05:29.551289 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:29 crc kubenswrapper[4786]: I0313 15:05:29.551590 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:29 crc kubenswrapper[4786]: E0313 15:05:29.551689 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:29 crc kubenswrapper[4786]: I0313 15:05:29.551921 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:29 crc kubenswrapper[4786]: E0313 15:05:29.552016 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.552020 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:30 crc kubenswrapper[4786]: E0313 15:05:30.552252 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.610692 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.610665506 podStartE2EDuration="1m13.610665506s" podCreationTimestamp="2026-03-13 15:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.59032966 +0000 UTC m=+160.753541511" watchObservedRunningTime="2026-03-13 15:05:30.610665506 +0000 UTC m=+160.773877317" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.610917 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=7.610912212 podStartE2EDuration="7.610912212s" podCreationTimestamp="2026-03-13 15:05:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.610895981 +0000 UTC m=+160.774107842" watchObservedRunningTime="2026-03-13 15:05:30.610912212 +0000 UTC m=+160.774124023" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.626827 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wp7vg" podStartSLOduration=94.626794088 podStartE2EDuration="1m34.626794088s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.626688485 +0000 UTC m=+160.789900296" watchObservedRunningTime="2026-03-13 15:05:30.626794088 +0000 UTC m=+160.790005939" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.644765 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-mvpcz" podStartSLOduration=94.64473884 podStartE2EDuration="1m34.64473884s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.644365052 +0000 UTC m=+160.807576903" watchObservedRunningTime="2026-03-13 15:05:30.64473884 +0000 UTC m=+160.807950681" Mar 13 15:05:30 crc kubenswrapper[4786]: E0313 15:05:30.645250 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.686490 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podStartSLOduration=94.686440225 podStartE2EDuration="1m34.686440225s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.663311287 +0000 UTC m=+160.826523138" watchObservedRunningTime="2026-03-13 15:05:30.686440225 +0000 UTC m=+160.849652056" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.699710 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-74m8s" podStartSLOduration=94.699692852 podStartE2EDuration="1m34.699692852s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.686913286 +0000 UTC m=+160.850125107" watchObservedRunningTime="2026-03-13 15:05:30.699692852 +0000 UTC m=+160.862904663" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.778117 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pstw7" podStartSLOduration=94.77809072 podStartE2EDuration="1m34.77809072s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.777401475 +0000 UTC m=+160.940613306" watchObservedRunningTime="2026-03-13 15:05:30.77809072 +0000 UTC m=+160.941302581" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.793653 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=28.793630438 podStartE2EDuration="28.793630438s" podCreationTimestamp="2026-03-13 15:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.792724338 +0000 UTC m=+160.955936179" watchObservedRunningTime="2026-03-13 15:05:30.793630438 +0000 UTC m=+160.956842289" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.817109 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=73.817091824 podStartE2EDuration="1m13.817091824s" podCreationTimestamp="2026-03-13 15:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.804467461 +0000 UTC m=+160.967679282" watchObservedRunningTime="2026-03-13 15:05:30.817091824 +0000 UTC m=+160.980303645" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.845789 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-99jzx" podStartSLOduration=94.845771197 podStartE2EDuration="1m34.845771197s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.81776538 +0000 UTC m=+160.980977201" watchObservedRunningTime="2026-03-13 15:05:30.845771197 +0000 UTC m=+161.008983018" Mar 13 15:05:30 crc kubenswrapper[4786]: I0313 15:05:30.952828 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=65.952809387 podStartE2EDuration="1m5.952809387s" podCreationTimestamp="2026-03-13 15:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:30.937419792 +0000 UTC m=+161.100631603" watchObservedRunningTime="2026-03-13 15:05:30.952809387 +0000 UTC m=+161.116021198" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.552150 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.552212 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:31 crc kubenswrapper[4786]: E0313 15:05:31.552381 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.552616 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:31 crc kubenswrapper[4786]: E0313 15:05:31.552730 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:31 crc kubenswrapper[4786]: E0313 15:05:31.553325 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.554459 4786 scope.go:117] "RemoveContainer" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497" Mar 13 15:05:31 crc kubenswrapper[4786]: E0313 15:05:31.554718 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7b6g9_openshift-ovn-kubernetes(0c6a64e5-e5ca-401a-9653-e0419f9f46c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.893645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.893723 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 15:05:31 
crc kubenswrapper[4786]: I0313 15:05:31.893742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.893765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.893782 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T15:05:31Z","lastTransitionTime":"2026-03-13T15:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.968515 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x"] Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.968872 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.971122 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.971309 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.972000 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 15:05:31 crc kubenswrapper[4786]: I0313 15:05:31.974556 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.131941 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6c7f8893-b451-4bd4-849c-95c09f2da2ff-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.132037 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c7f8893-b451-4bd4-849c-95c09f2da2ff-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.132066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6c7f8893-b451-4bd4-849c-95c09f2da2ff-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.132086 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c7f8893-b451-4bd4-849c-95c09f2da2ff-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.132124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6c7f8893-b451-4bd4-849c-95c09f2da2ff-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.233612 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6c7f8893-b451-4bd4-849c-95c09f2da2ff-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.233828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6c7f8893-b451-4bd4-849c-95c09f2da2ff-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.234218 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c7f8893-b451-4bd4-849c-95c09f2da2ff-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.234291 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c7f8893-b451-4bd4-849c-95c09f2da2ff-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.234335 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c7f8893-b451-4bd4-849c-95c09f2da2ff-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.234415 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6c7f8893-b451-4bd4-849c-95c09f2da2ff-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.234586 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/6c7f8893-b451-4bd4-849c-95c09f2da2ff-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.236058 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c7f8893-b451-4bd4-849c-95c09f2da2ff-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.243841 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c7f8893-b451-4bd4-849c-95c09f2da2ff-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.264999 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c7f8893-b451-4bd4-849c-95c09f2da2ff-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q8z2x\" (UID: \"6c7f8893-b451-4bd4-849c-95c09f2da2ff\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.296307 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" Mar 13 15:05:32 crc kubenswrapper[4786]: W0313 15:05:32.320147 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c7f8893_b451_4bd4_849c_95c09f2da2ff.slice/crio-8d1dd394a442fdae089c1e16035d78636673fe7b41b11eb315d06ddba17e6bc2 WatchSource:0}: Error finding container 8d1dd394a442fdae089c1e16035d78636673fe7b41b11eb315d06ddba17e6bc2: Status 404 returned error can't find the container with id 8d1dd394a442fdae089c1e16035d78636673fe7b41b11eb315d06ddba17e6bc2 Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.369199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" event={"ID":"6c7f8893-b451-4bd4-849c-95c09f2da2ff","Type":"ContainerStarted","Data":"8d1dd394a442fdae089c1e16035d78636673fe7b41b11eb315d06ddba17e6bc2"} Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.551552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:32 crc kubenswrapper[4786]: E0313 15:05:32.551841 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.584295 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 15:05:32 crc kubenswrapper[4786]: I0313 15:05:32.596444 4786 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 15:05:33 crc kubenswrapper[4786]: I0313 15:05:33.374277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" event={"ID":"6c7f8893-b451-4bd4-849c-95c09f2da2ff","Type":"ContainerStarted","Data":"4d4d07a108c2c4651a88cf95ca8843887407aa3b50ec7ac9bca616cc432b179a"} Mar 13 15:05:33 crc kubenswrapper[4786]: I0313 15:05:33.397009 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8z2x" podStartSLOduration=97.396979957 podStartE2EDuration="1m37.396979957s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:33.39578938 +0000 UTC m=+163.559001241" watchObservedRunningTime="2026-03-13 15:05:33.396979957 +0000 UTC m=+163.560191808" Mar 13 15:05:33 crc kubenswrapper[4786]: I0313 15:05:33.551959 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:33 crc kubenswrapper[4786]: I0313 15:05:33.552183 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:33 crc kubenswrapper[4786]: I0313 15:05:33.552248 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:33 crc kubenswrapper[4786]: E0313 15:05:33.552334 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:33 crc kubenswrapper[4786]: E0313 15:05:33.552491 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:33 crc kubenswrapper[4786]: E0313 15:05:33.552624 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:34 crc kubenswrapper[4786]: I0313 15:05:34.552170 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:34 crc kubenswrapper[4786]: E0313 15:05:34.552316 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:35 crc kubenswrapper[4786]: I0313 15:05:35.551542 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:35 crc kubenswrapper[4786]: I0313 15:05:35.551584 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:35 crc kubenswrapper[4786]: I0313 15:05:35.551557 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:35 crc kubenswrapper[4786]: E0313 15:05:35.551719 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:35 crc kubenswrapper[4786]: E0313 15:05:35.551791 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:35 crc kubenswrapper[4786]: E0313 15:05:35.552128 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:35 crc kubenswrapper[4786]: E0313 15:05:35.647130 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:36 crc kubenswrapper[4786]: I0313 15:05:36.552188 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:36 crc kubenswrapper[4786]: E0313 15:05:36.552789 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:37 crc kubenswrapper[4786]: I0313 15:05:37.551932 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:37 crc kubenswrapper[4786]: I0313 15:05:37.551990 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:37 crc kubenswrapper[4786]: E0313 15:05:37.552083 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:37 crc kubenswrapper[4786]: I0313 15:05:37.551990 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:37 crc kubenswrapper[4786]: E0313 15:05:37.552226 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:37 crc kubenswrapper[4786]: E0313 15:05:37.552312 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:38 crc kubenswrapper[4786]: I0313 15:05:38.551481 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:38 crc kubenswrapper[4786]: E0313 15:05:38.551703 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:39 crc kubenswrapper[4786]: I0313 15:05:39.552124 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:39 crc kubenswrapper[4786]: I0313 15:05:39.552181 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:39 crc kubenswrapper[4786]: I0313 15:05:39.552241 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:39 crc kubenswrapper[4786]: E0313 15:05:39.552336 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:39 crc kubenswrapper[4786]: E0313 15:05:39.552460 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:39 crc kubenswrapper[4786]: E0313 15:05:39.552573 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:40 crc kubenswrapper[4786]: I0313 15:05:40.551080 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:40 crc kubenswrapper[4786]: E0313 15:05:40.553080 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:40 crc kubenswrapper[4786]: E0313 15:05:40.647731 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.403826 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/1.log" Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.404891 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/0.log" Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.404959 4786 generic.go:334] "Generic (PLEG): container finished" podID="930a5a92-be71-4866-aa6f-95a98647bc33" containerID="de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383" exitCode=1 Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.405013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerDied","Data":"de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383"} Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.405053 4786 scope.go:117] "RemoveContainer" containerID="4ea0a116ef481a7b6c993d07b2ebd89138bbfb5ea9946780346ed7aae490d084" Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.405621 4786 scope.go:117] "RemoveContainer" containerID="de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383" Mar 13 15:05:41 crc kubenswrapper[4786]: E0313 15:05:41.406016 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mvpcz_openshift-multus(930a5a92-be71-4866-aa6f-95a98647bc33)\"" pod="openshift-multus/multus-mvpcz" podUID="930a5a92-be71-4866-aa6f-95a98647bc33" Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.551429 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.551529 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:41 crc kubenswrapper[4786]: E0313 15:05:41.551616 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:41 crc kubenswrapper[4786]: I0313 15:05:41.551699 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:41 crc kubenswrapper[4786]: E0313 15:05:41.551802 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:41 crc kubenswrapper[4786]: E0313 15:05:41.551979 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:42 crc kubenswrapper[4786]: I0313 15:05:42.411138 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/1.log" Mar 13 15:05:42 crc kubenswrapper[4786]: I0313 15:05:42.551765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:42 crc kubenswrapper[4786]: E0313 15:05:42.552021 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:43 crc kubenswrapper[4786]: I0313 15:05:43.551421 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:43 crc kubenswrapper[4786]: I0313 15:05:43.551493 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:43 crc kubenswrapper[4786]: I0313 15:05:43.551421 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:43 crc kubenswrapper[4786]: E0313 15:05:43.551676 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:43 crc kubenswrapper[4786]: E0313 15:05:43.551794 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:43 crc kubenswrapper[4786]: E0313 15:05:43.552047 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:44 crc kubenswrapper[4786]: I0313 15:05:44.552175 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:44 crc kubenswrapper[4786]: E0313 15:05:44.552470 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:45 crc kubenswrapper[4786]: I0313 15:05:45.551992 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:45 crc kubenswrapper[4786]: I0313 15:05:45.551995 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:45 crc kubenswrapper[4786]: I0313 15:05:45.552032 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:45 crc kubenswrapper[4786]: E0313 15:05:45.552363 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:45 crc kubenswrapper[4786]: E0313 15:05:45.552152 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:45 crc kubenswrapper[4786]: E0313 15:05:45.552465 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:45 crc kubenswrapper[4786]: E0313 15:05:45.649306 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:46 crc kubenswrapper[4786]: I0313 15:05:46.551834 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:46 crc kubenswrapper[4786]: E0313 15:05:46.552100 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:46 crc kubenswrapper[4786]: I0313 15:05:46.553203 4786 scope.go:117] "RemoveContainer" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497" Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.338749 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2v688"] Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.339176 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:47 crc kubenswrapper[4786]: E0313 15:05:47.339276 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.429898 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/3.log" Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.431658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerStarted","Data":"c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d"} Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.432560 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.466565 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podStartSLOduration=111.466544313 podStartE2EDuration="1m51.466544313s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:05:47.460302283 +0000 UTC m=+177.623514104" watchObservedRunningTime="2026-03-13 15:05:47.466544313 +0000 UTC m=+177.629756124" Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.551301 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:47 crc kubenswrapper[4786]: I0313 15:05:47.551341 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:47 crc kubenswrapper[4786]: E0313 15:05:47.551452 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:47 crc kubenswrapper[4786]: E0313 15:05:47.551547 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:48 crc kubenswrapper[4786]: I0313 15:05:48.551790 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:48 crc kubenswrapper[4786]: E0313 15:05:48.551977 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:49 crc kubenswrapper[4786]: I0313 15:05:49.551485 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:49 crc kubenswrapper[4786]: I0313 15:05:49.551506 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:49 crc kubenswrapper[4786]: I0313 15:05:49.551631 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:49 crc kubenswrapper[4786]: E0313 15:05:49.551760 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:49 crc kubenswrapper[4786]: E0313 15:05:49.551882 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:49 crc kubenswrapper[4786]: E0313 15:05:49.552004 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:50 crc kubenswrapper[4786]: I0313 15:05:50.551633 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:50 crc kubenswrapper[4786]: E0313 15:05:50.553493 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:50 crc kubenswrapper[4786]: E0313 15:05:50.649976 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:05:51 crc kubenswrapper[4786]: I0313 15:05:51.551562 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:51 crc kubenswrapper[4786]: E0313 15:05:51.552085 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:51 crc kubenswrapper[4786]: I0313 15:05:51.551613 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:51 crc kubenswrapper[4786]: E0313 15:05:51.552199 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:51 crc kubenswrapper[4786]: I0313 15:05:51.551562 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:51 crc kubenswrapper[4786]: E0313 15:05:51.552283 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:52 crc kubenswrapper[4786]: I0313 15:05:52.551081 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:52 crc kubenswrapper[4786]: E0313 15:05:52.551290 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:53 crc kubenswrapper[4786]: I0313 15:05:53.551468 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:53 crc kubenswrapper[4786]: I0313 15:05:53.551557 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:53 crc kubenswrapper[4786]: E0313 15:05:53.551647 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:53 crc kubenswrapper[4786]: I0313 15:05:53.551575 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:53 crc kubenswrapper[4786]: E0313 15:05:53.551746 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:53 crc kubenswrapper[4786]: E0313 15:05:53.551845 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:54 crc kubenswrapper[4786]: I0313 15:05:54.551905 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:54 crc kubenswrapper[4786]: E0313 15:05:54.552093 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:55 crc kubenswrapper[4786]: I0313 15:05:55.551437 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:55 crc kubenswrapper[4786]: E0313 15:05:55.551652 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:55 crc kubenswrapper[4786]: I0313 15:05:55.551893 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:55 crc kubenswrapper[4786]: E0313 15:05:55.552004 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:55 crc kubenswrapper[4786]: I0313 15:05:55.552061 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:55 crc kubenswrapper[4786]: E0313 15:05:55.552266 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:55 crc kubenswrapper[4786]: I0313 15:05:55.552463 4786 scope.go:117] "RemoveContainer" containerID="de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383" Mar 13 15:05:55 crc kubenswrapper[4786]: E0313 15:05:55.650673 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 15:05:56 crc kubenswrapper[4786]: I0313 15:05:56.461943 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/1.log" Mar 13 15:05:56 crc kubenswrapper[4786]: I0313 15:05:56.462010 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerStarted","Data":"459051b7219350f8c61d97d648b668c00d787af9c9750e356aae1c8c21b74b3a"} Mar 13 15:05:56 crc kubenswrapper[4786]: I0313 15:05:56.551369 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:56 crc kubenswrapper[4786]: E0313 15:05:56.551524 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:57 crc kubenswrapper[4786]: I0313 15:05:57.551808 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:57 crc kubenswrapper[4786]: I0313 15:05:57.551957 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:57 crc kubenswrapper[4786]: E0313 15:05:57.552081 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:57 crc kubenswrapper[4786]: I0313 15:05:57.551808 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:57 crc kubenswrapper[4786]: E0313 15:05:57.552293 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:57 crc kubenswrapper[4786]: E0313 15:05:57.552452 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:05:58 crc kubenswrapper[4786]: I0313 15:05:58.551823 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:05:58 crc kubenswrapper[4786]: E0313 15:05:58.552009 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:05:59 crc kubenswrapper[4786]: I0313 15:05:59.551732 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:05:59 crc kubenswrapper[4786]: E0313 15:05:59.551977 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v688" podUID="2ded4bfa-6d71-4c0f-982d-aee3c61c5612" Mar 13 15:05:59 crc kubenswrapper[4786]: I0313 15:05:59.552117 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:05:59 crc kubenswrapper[4786]: I0313 15:05:59.552251 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:05:59 crc kubenswrapper[4786]: E0313 15:05:59.552298 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 15:05:59 crc kubenswrapper[4786]: E0313 15:05:59.552570 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 15:06:00 crc kubenswrapper[4786]: I0313 15:06:00.551587 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:06:00 crc kubenswrapper[4786]: E0313 15:06:00.552808 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.551149 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.551160 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.551183 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.555297 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.555708 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.555800 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.555984 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.556186 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 15:06:01 crc kubenswrapper[4786]: I0313 15:06:01.556364 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.366191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.419064 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ffbml"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.419775 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.424326 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.425422 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.425439 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.431592 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w5kxr"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.432715 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.440262 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.440764 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.443391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.444404 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.446547 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.446588 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.447629 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.447889 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.448197 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.448447 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.449066 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.449387 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.450035 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.450290 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.450406 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.450597 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.450995 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.453453 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.453699 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.454367 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.454991 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.460035 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.460191 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.460406 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.461185 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.461399 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.461552 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.461691 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.462374 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.462558 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.462878 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.463321 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2tqr8"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.463972 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.464412 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.464759 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2bj4z"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.470272 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.474586 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c65p"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.475097 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.476013 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.476787 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.498074 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.498358 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.498740 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.499544 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.500451 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.502664 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.502938 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.503156 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.503492 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.504835 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.505007 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.506718 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.508685 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507166 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 
15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507214 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.505237 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.505425 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.506678 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.509622 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507274 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507333 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507411 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507433 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507473 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507511 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.510130 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507608 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507671 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.510373 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507720 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507754 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.507791 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.505177 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.510262 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.514931 4786 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.515440 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-b7krw"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.515774 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.515825 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.519912 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4rkfk"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.520564 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.521330 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2c944"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.521892 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.524675 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.525581 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5m4w2"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.526301 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.526803 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.526834 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.526900 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.527825 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.528516 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.528921 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7cf4m"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.529339 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.532093 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.532143 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.532340 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.532394 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.532458 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.533711 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.533996 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gm72l"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.534133 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 
15:06:02.534489 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.534651 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.534841 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.535042 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.535188 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.535347 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.536376 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.537348 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.537878 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.538170 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.538662 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.538795 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539140 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539154 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539216 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539327 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539367 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539388 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539388 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539570 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539581 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.539826 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541146 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541589 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541628 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541691 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541795 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541837 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541907 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.541946 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.542009 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.542037 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.542053 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.542119 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.542135 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.554964 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.555449 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.542015 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.557586 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.559518 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 
15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.559612 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.561337 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562474 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50609e3-d329-4c22-9be4-4100f122508d-serving-cert\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562523 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562573 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/e50609e3-d329-4c22-9be4-4100f122508d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa22da4-e413-4089-9e4a-ef7a8f324435-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562716 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2643aa-9180-475a-af73-8e7b311cc77c-serving-cert\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562745 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-etcd-client\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/233ce256-9724-4f50-a45b-a58aaaee2ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 
15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa22da4-e413-4089-9e4a-ef7a8f324435-config\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.562944 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/233ce256-9724-4f50-a45b-a58aaaee2ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563016 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/963ffcec-87b4-480a-81d1-e19ecb7edb12-audit-dir\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563094 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvp7q\" (UniqueName: \"kubernetes.io/projected/6aa22da4-e413-4089-9e4a-ef7a8f324435-kube-api-access-dvp7q\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-serving-cert\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhr4z\" (UniqueName: \"kubernetes.io/projected/233ce256-9724-4f50-a45b-a58aaaee2ec8-kube-api-access-nhr4z\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563615 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-encryption-config\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-audit-policies\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.563849 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-config\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.568712 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5cs\" (UniqueName: \"kubernetes.io/projected/e50609e3-d329-4c22-9be4-4100f122508d-kube-api-access-lj5cs\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.572589 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhb9d\" (UniqueName: \"kubernetes.io/projected/963ffcec-87b4-480a-81d1-e19ecb7edb12-kube-api-access-nhb9d\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.572662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-client-ca\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.572687 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.572733 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/233ce256-9724-4f50-a45b-a58aaaee2ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-hcn8x\" 
(UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.572756 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fz8c\" (UniqueName: \"kubernetes.io/projected/3a2643aa-9180-475a-af73-8e7b311cc77c-kube-api-access-2fz8c\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.574302 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-z752l"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.574921 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vxx8z"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.575562 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.576221 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.576487 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.576687 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.579206 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.581290 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.581956 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.581982 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.582788 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wcljz"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.583414 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.585754 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.586676 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.587144 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.587191 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bhts5"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.587595 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.588291 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.589286 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.589490 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.590578 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.591642 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.592462 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.592583 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556906-bz7fd"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.593139 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.593549 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t98jm"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.593941 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.594506 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.594904 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.597178 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.597097 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.601146 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w5kxr"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.601261 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.603535 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.603909 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.604655 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.613344 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-996mk"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.614431 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-996mk" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.614640 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.615628 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.616932 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.617939 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2c944"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.619068 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vxx8z"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.620102 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.621078 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.621326 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.622060 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p"] Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.623016 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.624894 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.625119 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2tqr8"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.626141 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.627282 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bhts5"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.628620 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2bj4z"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.630576 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ffbml"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.630617 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c65p"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.632247 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.634004 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.635326 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.636556 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.638367 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.641314 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gm72l"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.643057 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.643163 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.644552 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4h7qh"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.645175 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4h7qh"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.648371 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.649896 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4rkfk"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.651383 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-996mk"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.653405 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7cf4m"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.654272 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5m4w2"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.661563 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.664814 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.666237 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wcljz"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.667079 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-b7krw"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.669745 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.671362 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-bz7fd"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.672555 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t98jm"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.673955 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674420 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2643aa-9180-475a-af73-8e7b311cc77c-serving-cert\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674443 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-etcd-client\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674461 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/233ce256-9724-4f50-a45b-a58aaaee2ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674479 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa22da4-e413-4089-9e4a-ef7a8f324435-config\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/233ce256-9724-4f50-a45b-a58aaaee2ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/963ffcec-87b4-480a-81d1-e19ecb7edb12-audit-dir\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674526 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvp7q\" (UniqueName: \"kubernetes.io/projected/6aa22da4-e413-4089-9e4a-ef7a8f324435-kube-api-access-dvp7q\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-serving-cert\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674562 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhr4z\" (UniqueName: \"kubernetes.io/projected/233ce256-9724-4f50-a45b-a58aaaee2ec8-kube-api-access-nhr4z\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674578 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-encryption-config\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674595 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-audit-policies\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674612 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-config\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674627 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5cs\" (UniqueName: \"kubernetes.io/projected/e50609e3-d329-4c22-9be4-4100f122508d-kube-api-access-lj5cs\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674651 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhb9d\" (UniqueName: \"kubernetes.io/projected/963ffcec-87b4-480a-81d1-e19ecb7edb12-kube-api-access-nhb9d\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674667 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-client-ca\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674685 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/233ce256-9724-4f50-a45b-a58aaaee2ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674717 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fz8c\" (UniqueName: \"kubernetes.io/projected/3a2643aa-9180-475a-af73-8e7b311cc77c-kube-api-access-2fz8c\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674774 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50609e3-d329-4c22-9be4-4100f122508d-serving-cert\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674814 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50609e3-d329-4c22-9be4-4100f122508d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.674834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa22da4-e413-4089-9e4a-ef7a8f324435-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.675938 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.676755 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w5gbr"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.677354 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/233ce256-9724-4f50-a45b-a58aaaee2ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.677528 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w5gbr"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.677670 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-config\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.677928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa22da4-e413-4089-9e4a-ef7a8f324435-config\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.678035 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/963ffcec-87b4-480a-81d1-e19ecb7edb12-audit-dir\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.679335 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-znwnc"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.680497 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-znwnc"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.681058 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.681257 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/233ce256-9724-4f50-a45b-a58aaaee2ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.681587 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50609e3-d329-4c22-9be4-4100f122508d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.681817 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.682305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.682741 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/963ffcec-87b4-480a-81d1-e19ecb7edb12-audit-policies\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.682830 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w5gbr"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.683403 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-serving-cert\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.684083 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-znwnc"]
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.684922 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-etcd-client\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.685194 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.698364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa22da4-e413-4089-9e4a-ef7a8f324435-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: \"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.698756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/963ffcec-87b4-480a-81d1-e19ecb7edb12-encryption-config\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.700115 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50609e3-d329-4c22-9be4-4100f122508d-serving-cert\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.702549 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2643aa-9180-475a-af73-8e7b311cc77c-serving-cert\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.702578 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-client-ca\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.702565 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.733689 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.742398 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.762533 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.782011 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.801869 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.822094 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.842168 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.862827 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.887835 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.902030 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.922222 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.941718 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.964359 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 13 15:06:02 crc kubenswrapper[4786]: I0313 15:06:02.982156 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.002057 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.022824 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.042464 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.062606 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.082848 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.101722 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.123486 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.142555 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.162666 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.190465 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.202561 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.264020 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.282188 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-serving-cert\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.282525 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-oauth-serving-cert\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.282753 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d72361e-212d-4e7d-a4d3-b10141badfc3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9tj94\" (UID: \"5d72361e-212d-4e7d-a4d3-b10141badfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.282908 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.283426 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpvc\" (UniqueName: \"kubernetes.io/projected/0c6ae29c-e743-4193-bce1-22b4c5732f45-kube-api-access-4hpvc\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.283650 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a55f76fb-e225-458d-aab9-03e376b09de9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.283920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.284045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55f76fb-e225-458d-aab9-03e376b09de9-config\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.284256 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-service-ca\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.284380 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-client\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.284576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-certificates\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.284752 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skxgq\" (UniqueName: \"kubernetes.io/projected/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-kube-api-access-skxgq\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.284918 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-config\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285055 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-service-ca\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285226 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-trusted-ca-bundle\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285328 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285370 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-bound-sa-token\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzp9\" (UniqueName: \"kubernetes.io/projected/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-kube-api-access-7vzp9\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285436 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8qb\" (UniqueName: \"kubernetes.io/projected/5d72361e-212d-4e7d-a4d3-b10141badfc3-kube-api-access-gh8qb\") pod \"cluster-samples-operator-665b6dd947-9tj94\" (UID: \"5d72361e-212d-4e7d-a4d3-b10141badfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285466 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-oauth-config\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285505 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285566 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-tls\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285663 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-config\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr"
Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285708 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szt5s\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-kube-api-access-szt5s\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.285921 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:03.78590151 +0000 UTC m=+193.949113421 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.285910 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpj5\" (UniqueName: \"kubernetes.io/projected/7babe28d-a681-4e19-ba38-150d554380a2-kube-api-access-7fpj5\") pod \"downloads-7954f5f757-2bj4z\" (UID: \"7babe28d-a681-4e19-ba38-150d554380a2\") " pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.286064 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-trusted-ca\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.286241 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.286367 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-ca\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.286550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.286769 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a55f76fb-e225-458d-aab9-03e376b09de9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.286953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-serving-cert\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.302163 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.323321 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.343334 4786 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.363187 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.383594 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.387622 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.387838 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:03.887812995 +0000 UTC m=+194.051024836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.387946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-tls\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388016 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2rz\" (UniqueName: \"kubernetes.io/projected/3d726b70-e168-4d44-954a-c9a3c8b0db5c-kube-api-access-rr2rz\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388071 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-default-certificate\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388178 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-config\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388230 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8vph\" (UniqueName: \"kubernetes.io/projected/103f0ba8-4fee-41ce-bf68-9df3feddffc8-kube-api-access-h8vph\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388339 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpj5\" (UniqueName: \"kubernetes.io/projected/7babe28d-a681-4e19-ba38-150d554380a2-kube-api-access-7fpj5\") pod \"downloads-7954f5f757-2bj4z\" (UID: \"7babe28d-a681-4e19-ba38-150d554380a2\") " pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388394 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdwp5\" (UniqueName: \"kubernetes.io/projected/3c6e0341-e5cb-4912-b3fb-8caedc0d4e10-kube-api-access-fdwp5\") pod \"control-plane-machine-set-operator-78cbb6b69f-dtwcs\" (UID: \"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388445 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388505 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388558 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-ca\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: 
\"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-metrics-certs\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388794 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvkds\" (UniqueName: \"kubernetes.io/projected/1eda6a73-a8cf-406d-ab33-394ec1982f4a-kube-api-access-nvkds\") pod \"auto-csr-approver-29556906-bz7fd\" (UID: \"1eda6a73-a8cf-406d-ab33-394ec1982f4a\") " pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388847 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2w6\" (UniqueName: \"kubernetes.io/projected/29df0f63-b9e8-4694-b587-539d1fe80658-kube-api-access-hz2w6\") pod \"multus-admission-controller-857f4d67dd-vxx8z\" (UID: \"29df0f63-b9e8-4694-b587-539d1fe80658\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.388948 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-encryption-config\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389022 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e6058a-1d79-4bb5-a514-0c05c6185279-config\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f4582d-9706-4c08-852a-c6d6ac3694a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389121 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-serving-cert\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389175 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-oauth-serving-cert\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389227 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d72361e-212d-4e7d-a4d3-b10141badfc3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9tj94\" (UID: \"5d72361e-212d-4e7d-a4d3-b10141badfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" Mar 
13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-socket-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389374 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3e6058a-1d79-4bb5-a514-0c05c6185279-auth-proxy-config\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389422 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b3e6058a-1d79-4bb5-a514-0c05c6185279-machine-approver-tls\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vc4n\" (UniqueName: \"kubernetes.io/projected/8cf76d74-f1bd-446f-90fe-2006ac188804-kube-api-access-4vc4n\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389524 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a55f76fb-e225-458d-aab9-03e376b09de9-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f4582d-9706-4c08-852a-c6d6ac3694a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389624 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bl2\" (UniqueName: \"kubernetes.io/projected/4dfecfec-c904-49cb-86fe-83ad121c6a68-kube-api-access-f7bl2\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/174316ed-1085-4cbe-8ec9-406c566e914d-audit-dir\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389718 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 
15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389768 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fs9f\" (UniqueName: \"kubernetes.io/projected/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-kube-api-access-4fs9f\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389812 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01cb6b91-c961-4f3e-9d36-c839305c7a80-srv-cert\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389886 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389898 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55f76fb-e225-458d-aab9-03e376b09de9-config\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.389949 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d36f11ac-fb3e-4577-ba62-127115e6cc87-metrics-tls\") pod 
\"dns-operator-744455d44c-gm72l\" (UID: \"d36f11ac-fb3e-4577-ba62-127115e6cc87\") " pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.390007 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.390073 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.390127 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6jzd\" (UniqueName: \"kubernetes.io/projected/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-kube-api-access-g6jzd\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.390174 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j28q\" (UniqueName: \"kubernetes.io/projected/174316ed-1085-4cbe-8ec9-406c566e914d-kube-api-access-6j28q\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: 
I0313 15:06:03.391165 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-config\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391245 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-oauth-serving-cert\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391393 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/103f0ba8-4fee-41ce-bf68-9df3feddffc8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391429 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-csi-data-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391451 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391490 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-policies\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-ca\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-service-ca\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391801 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.391979 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-client\") pod \"etcd-operator-b45778765-w5kxr\" (UID: 
\"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6812c7a2-d36e-434d-90ca-079fc0a3390d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5kfqf\" (UID: \"6812c7a2-d36e-434d-90ca-079fc0a3390d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55f76fb-e225-458d-aab9-03e376b09de9-config\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392329 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-config\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392442 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-plugins-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392579 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-service-ca\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-service-ca\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392645 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f195d176-af0f-4048-a981-358f0240f6cd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392779 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-serving-cert\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392906 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.392967 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgns\" (UniqueName: \"kubernetes.io/projected/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-kube-api-access-jcgns\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-serving-cert\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393071 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-bound-sa-token\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393244 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7t4\" (UniqueName: \"kubernetes.io/projected/b3e6058a-1d79-4bb5-a514-0c05c6185279-kube-api-access-cl7t4\") pod 
\"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393295 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01cb6b91-c961-4f3e-9d36-c839305c7a80-profile-collector-cert\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-dir\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393386 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393438 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a55f76fb-e225-458d-aab9-03e376b09de9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393470 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgt6v\" (UniqueName: \"kubernetes.io/projected/111d054e-76a1-4783-aaa4-a05fd3250b1a-kube-api-access-jgt6v\") pod \"migrator-59844c95c7-rm4xd\" (UID: \"111d054e-76a1-4783-aaa4-a05fd3250b1a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393555 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-config\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393589 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqsl\" (UniqueName: \"kubernetes.io/projected/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-kube-api-access-gfqsl\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393637 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f195d176-af0f-4048-a981-358f0240f6cd-images\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-service-ca\") pod \"console-f9d7485db-2tqr8\" (UID: 
\"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f195d176-af0f-4048-a981-358f0240f6cd-proxy-tls\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.393694 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:03.893671166 +0000 UTC m=+194.056883077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393802 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4dfecfec-c904-49cb-86fe-83ad121c6a68-webhook-cert\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.393929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/827061b1-4659-4457-b74f-24d85f0bf010-config\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394014 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plzm\" (UniqueName: \"kubernetes.io/projected/6812c7a2-d36e-434d-90ca-079fc0a3390d-kube-api-access-7plzm\") pod \"package-server-manager-789f6589d5-5kfqf\" (UID: \"6812c7a2-d36e-434d-90ca-079fc0a3390d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394107 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkfd\" (UniqueName: \"kubernetes.io/projected/f195d176-af0f-4048-a981-358f0240f6cd-kube-api-access-lzkfd\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394444 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pxvf\" (UniqueName: \"kubernetes.io/projected/e077b561-62da-4d5b-b7a2-faf0e03f46b1-kube-api-access-2pxvf\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394477 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxxl\" (UniqueName: \"kubernetes.io/projected/874a0d99-444c-45f4-9bfa-ad8f5c469afc-kube-api-access-qsxxl\") pod \"service-ca-9c57cc56f-t98jm\" (UID: 
\"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394548 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29df0f63-b9e8-4694-b587-539d1fe80658-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vxx8z\" (UID: \"29df0f63-b9e8-4694-b587-539d1fe80658\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394597 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-oauth-config\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394614 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-registration-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394633 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/874a0d99-444c-45f4-9bfa-ad8f5c469afc-signing-key\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394647 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394664 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/174316ed-1085-4cbe-8ec9-406c566e914d-node-pullsecrets\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394681 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4dfecfec-c904-49cb-86fe-83ad121c6a68-tmpfs\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394740 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fdvk\" (UniqueName: \"kubernetes.io/projected/0878bdef-ff38-4bfa-a4e3-4d656afd474f-kube-api-access-7fdvk\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394760 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-etcd-serving-ca\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394778 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szt5s\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-kube-api-access-szt5s\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394793 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-config-volume\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4dfecfec-c904-49cb-86fe-83ad121c6a68-apiservice-cert\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394828 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-service-ca-bundle\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-trusted-ca\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394878 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394907 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a55f76fb-e225-458d-aab9-03e376b09de9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394923 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-serving-cert\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394942 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90352d66-9c07-441e-a71f-ee3281a66b5b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txnxb\" (UniqueName: \"kubernetes.io/projected/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-kube-api-access-txnxb\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.394988 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.395007 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-images\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.395031 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.395324 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-config\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.397468 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d72361e-212d-4e7d-a4d3-b10141badfc3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9tj94\" (UID: \"5d72361e-212d-4e7d-a4d3-b10141badfc3\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.398127 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-serving-cert\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.398312 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.398638 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/884731f6-3835-46f5-a86c-f4ae664d255d-node-bootstrap-token\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.398685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.398706 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-stats-auth\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.399882 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-config\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.399908 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7788c62-1972-43db-a6db-d0bce9c415a6-cert\") pod \"ingress-canary-w5gbr\" (UID: \"f7788c62-1972-43db-a6db-d0bce9c415a6\") " pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.399929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86cnz\" (UniqueName: \"kubernetes.io/projected/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-kube-api-access-86cnz\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.399824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 
15:06:03.399740 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-trusted-ca\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.399969 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/827061b1-4659-4457-b74f-24d85f0bf010-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400039 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpvc\" (UniqueName: \"kubernetes.io/projected/0c6ae29c-e743-4193-bce1-22b4c5732f45-kube-api-access-4hpvc\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjdx\" (UniqueName: \"kubernetes.io/projected/3303beb2-619f-4973-b3c9-1f75a6e4e88c-kube-api-access-lrjdx\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400110 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-config\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400194 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400220 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-image-import-ca\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400261 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103f0ba8-4fee-41ce-bf68-9df3feddffc8-proxy-tls\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400283 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-trusted-ca\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400327 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400287 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-tls\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400357 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400523 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c6e0341-e5cb-4912-b3fb-8caedc0d4e10-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-dtwcs\" (UID: \"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/884731f6-3835-46f5-a86c-f4ae664d255d-certs\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400635 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d726b70-e168-4d44-954a-c9a3c8b0db5c-metrics-tls\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400727 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7pv\" (UniqueName: \"kubernetes.io/projected/01cb6b91-c961-4f3e-9d36-c839305c7a80-kube-api-access-7p7pv\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400791 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.400882 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-certificates\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skxgq\" (UniqueName: \"kubernetes.io/projected/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-kube-api-access-skxgq\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401183 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401238 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/827061b1-4659-4457-b74f-24d85f0bf010-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-trusted-ca-bundle\") pod \"console-f9d7485db-2tqr8\" (UID: 
\"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-config\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401339 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-etcd-client\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401531 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-secret-volume\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-mountpoint-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-client-ca\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401750 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cf76d74-f1bd-446f-90fe-2006ac188804-service-ca-bundle\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401803 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d94l2\" (UniqueName: \"kubernetes.io/projected/50879169-a61b-431b-ada7-e06cca5b64bf-kube-api-access-d94l2\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.401830 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d726b70-e168-4d44-954a-c9a3c8b0db5c-config-volume\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.402989 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-certificates\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 
15:06:03.403290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-trusted-ca-bundle\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.403939 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90352d66-9c07-441e-a71f-ee3281a66b5b-srv-cert\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.403976 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtm5c\" (UniqueName: \"kubernetes.io/projected/f7788c62-1972-43db-a6db-d0bce9c415a6-kube-api-access-jtm5c\") pod \"ingress-canary-w5gbr\" (UID: \"f7788c62-1972-43db-a6db-d0bce9c415a6\") " pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.403996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/874a0d99-444c-45f4-9bfa-ad8f5c469afc-signing-cabundle\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404019 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-oauth-config\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 
15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-audit\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404326 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9l8j\" (UniqueName: \"kubernetes.io/projected/884731f6-3835-46f5-a86c-f4ae664d255d-kube-api-access-h9l8j\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzp9\" (UniqueName: \"kubernetes.io/projected/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-kube-api-access-7vzp9\") pod \"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm26s\" (UniqueName: \"kubernetes.io/projected/a0f4582d-9706-4c08-852a-c6d6ac3694a7-kube-api-access-mm26s\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404460 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8qb\" 
(UniqueName: \"kubernetes.io/projected/5d72361e-212d-4e7d-a4d3-b10141badfc3-kube-api-access-gh8qb\") pod \"cluster-samples-operator-665b6dd947-9tj94\" (UID: \"5d72361e-212d-4e7d-a4d3-b10141badfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404483 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h78vv\" (UniqueName: \"kubernetes.io/projected/90352d66-9c07-441e-a71f-ee3281a66b5b-kube-api-access-h78vv\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404507 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50879169-a61b-431b-ada7-e06cca5b64bf-serving-cert\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-serving-cert\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50879169-a61b-431b-ada7-e06cca5b64bf-config\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404596 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6p9p\" (UniqueName: \"kubernetes.io/projected/d36f11ac-fb3e-4577-ba62-127115e6cc87-kube-api-access-s6p9p\") pod \"dns-operator-744455d44c-gm72l\" (UID: \"d36f11ac-fb3e-4577-ba62-127115e6cc87\") " pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404619 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-config\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.404644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0878bdef-ff38-4bfa-a4e3-4d656afd474f-serving-cert\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.406197 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-etcd-client\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.406433 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 15:06:03 crc kubenswrapper[4786]: 
I0313 15:06:03.406551 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-serving-cert\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.412328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.424414 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.442890 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.463147 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.482355 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.502496 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.505944 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506125 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h78vv\" (UniqueName: \"kubernetes.io/projected/90352d66-9c07-441e-a71f-ee3281a66b5b-kube-api-access-h78vv\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.506177 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.006141308 +0000 UTC m=+194.169353159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506250 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50879169-a61b-431b-ada7-e06cca5b64bf-serving-cert\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506319 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-serving-cert\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50879169-a61b-431b-ada7-e06cca5b64bf-config\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506384 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6p9p\" (UniqueName: \"kubernetes.io/projected/d36f11ac-fb3e-4577-ba62-127115e6cc87-kube-api-access-s6p9p\") pod \"dns-operator-744455d44c-gm72l\" (UID: 
\"d36f11ac-fb3e-4577-ba62-127115e6cc87\") " pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506421 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-config\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506457 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0878bdef-ff38-4bfa-a4e3-4d656afd474f-serving-cert\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506514 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2rz\" (UniqueName: \"kubernetes.io/projected/3d726b70-e168-4d44-954a-c9a3c8b0db5c-kube-api-access-rr2rz\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506555 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-default-certificate\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8vph\" (UniqueName: \"kubernetes.io/projected/103f0ba8-4fee-41ce-bf68-9df3feddffc8-kube-api-access-h8vph\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506653 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506714 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdwp5\" (UniqueName: \"kubernetes.io/projected/3c6e0341-e5cb-4912-b3fb-8caedc0d4e10-kube-api-access-fdwp5\") pod \"control-plane-machine-set-operator-78cbb6b69f-dtwcs\" (UID: \"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506761 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506830 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-metrics-certs\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkds\" (UniqueName: \"kubernetes.io/projected/1eda6a73-a8cf-406d-ab33-394ec1982f4a-kube-api-access-nvkds\") pod \"auto-csr-approver-29556906-bz7fd\" (UID: \"1eda6a73-a8cf-406d-ab33-394ec1982f4a\") " pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2w6\" (UniqueName: \"kubernetes.io/projected/29df0f63-b9e8-4694-b587-539d1fe80658-kube-api-access-hz2w6\") pod \"multus-admission-controller-857f4d67dd-vxx8z\" (UID: \"29df0f63-b9e8-4694-b587-539d1fe80658\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.506981 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-encryption-config\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507016 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e6058a-1d79-4bb5-a514-0c05c6185279-config\") pod 
\"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f4582d-9706-4c08-852a-c6d6ac3694a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507084 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-socket-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3e6058a-1d79-4bb5-a514-0c05c6185279-auth-proxy-config\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b3e6058a-1d79-4bb5-a514-0c05c6185279-machine-approver-tls\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507169 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-config\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507178 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vc4n\" (UniqueName: \"kubernetes.io/projected/8cf76d74-f1bd-446f-90fe-2006ac188804-kube-api-access-4vc4n\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507213 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f4582d-9706-4c08-852a-c6d6ac3694a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507244 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bl2\" (UniqueName: \"kubernetes.io/projected/4dfecfec-c904-49cb-86fe-83ad121c6a68-kube-api-access-f7bl2\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507274 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/174316ed-1085-4cbe-8ec9-406c566e914d-audit-dir\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 
13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507306 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fs9f\" (UniqueName: \"kubernetes.io/projected/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-kube-api-access-4fs9f\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01cb6b91-c961-4f3e-9d36-c839305c7a80-srv-cert\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507383 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d36f11ac-fb3e-4577-ba62-127115e6cc87-metrics-tls\") pod \"dns-operator-744455d44c-gm72l\" (UID: \"d36f11ac-fb3e-4577-ba62-127115e6cc87\") " pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507426 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507517 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6jzd\" (UniqueName: \"kubernetes.io/projected/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-kube-api-access-g6jzd\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507548 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j28q\" (UniqueName: \"kubernetes.io/projected/174316ed-1085-4cbe-8ec9-406c566e914d-kube-api-access-6j28q\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507585 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/103f0ba8-4fee-41ce-bf68-9df3feddffc8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507661 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-csi-data-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507694 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507726 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-policies\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507760 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507806 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6812c7a2-d36e-434d-90ca-079fc0a3390d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5kfqf\" (UID: \"6812c7a2-d36e-434d-90ca-079fc0a3390d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507842 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-plugins-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc 
kubenswrapper[4786]: I0313 15:06:03.507917 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f195d176-af0f-4048-a981-358f0240f6cd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507949 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507980 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-serving-cert\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.507981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508020 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508072 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgns\" (UniqueName: \"kubernetes.io/projected/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-kube-api-access-jcgns\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-serving-cert\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508151 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7t4\" (UniqueName: \"kubernetes.io/projected/b3e6058a-1d79-4bb5-a514-0c05c6185279-kube-api-access-cl7t4\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508183 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01cb6b91-c961-4f3e-9d36-c839305c7a80-profile-collector-cert\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508214 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-dir\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508245 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508280 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgt6v\" (UniqueName: \"kubernetes.io/projected/111d054e-76a1-4783-aaa4-a05fd3250b1a-kube-api-access-jgt6v\") pod \"migrator-59844c95c7-rm4xd\" (UID: \"111d054e-76a1-4783-aaa4-a05fd3250b1a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508312 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-config\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508348 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqsl\" (UniqueName: \"kubernetes.io/projected/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-kube-api-access-gfqsl\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508380 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f195d176-af0f-4048-a981-358f0240f6cd-images\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f195d176-af0f-4048-a981-358f0240f6cd-proxy-tls\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508442 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4dfecfec-c904-49cb-86fe-83ad121c6a68-webhook-cert\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/827061b1-4659-4457-b74f-24d85f0bf010-config\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508507 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plzm\" (UniqueName: 
\"kubernetes.io/projected/6812c7a2-d36e-434d-90ca-079fc0a3390d-kube-api-access-7plzm\") pod \"package-server-manager-789f6589d5-5kfqf\" (UID: \"6812c7a2-d36e-434d-90ca-079fc0a3390d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkfd\" (UniqueName: \"kubernetes.io/projected/f195d176-af0f-4048-a981-358f0240f6cd-kube-api-access-lzkfd\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508573 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pxvf\" (UniqueName: \"kubernetes.io/projected/e077b561-62da-4d5b-b7a2-faf0e03f46b1-kube-api-access-2pxvf\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxxl\" (UniqueName: \"kubernetes.io/projected/874a0d99-444c-45f4-9bfa-ad8f5c469afc-kube-api-access-qsxxl\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508659 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc 
kubenswrapper[4786]: I0313 15:06:03.508704 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29df0f63-b9e8-4694-b587-539d1fe80658-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vxx8z\" (UID: \"29df0f63-b9e8-4694-b587-539d1fe80658\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508739 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-registration-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/874a0d99-444c-45f4-9bfa-ad8f5c469afc-signing-key\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: 
\"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508903 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/174316ed-1085-4cbe-8ec9-406c566e914d-node-pullsecrets\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508951 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4dfecfec-c904-49cb-86fe-83ad121c6a68-tmpfs\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.508985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fdvk\" (UniqueName: \"kubernetes.io/projected/0878bdef-ff38-4bfa-a4e3-4d656afd474f-kube-api-access-7fdvk\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509017 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-etcd-serving-ca\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509064 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-config-volume\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4dfecfec-c904-49cb-86fe-83ad121c6a68-apiservice-cert\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e6058a-1d79-4bb5-a514-0c05c6185279-config\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-service-ca-bundle\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 
15:06:03.509231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90352d66-9c07-441e-a71f-ee3281a66b5b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509265 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txnxb\" (UniqueName: \"kubernetes.io/projected/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-kube-api-access-txnxb\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509300 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509332 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-images\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509411 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/884731f6-3835-46f5-a86c-f4ae664d255d-node-bootstrap-token\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509448 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-stats-auth\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509515 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-config\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f7788c62-1972-43db-a6db-d0bce9c415a6-cert\") pod \"ingress-canary-w5gbr\" (UID: \"f7788c62-1972-43db-a6db-d0bce9c415a6\") " pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509578 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86cnz\" (UniqueName: \"kubernetes.io/projected/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-kube-api-access-86cnz\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/827061b1-4659-4457-b74f-24d85f0bf010-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/174316ed-1085-4cbe-8ec9-406c566e914d-audit-dir\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509660 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjdx\" (UniqueName: \"kubernetes.io/projected/3303beb2-619f-4973-b3c9-1f75a6e4e88c-kube-api-access-lrjdx\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509695 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.509016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/103f0ba8-4fee-41ce-bf68-9df3feddffc8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511345 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-etcd-serving-ca\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511413 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-config\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-csi-data-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511453 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511507 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-image-import-ca\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511533 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-encryption-config\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103f0ba8-4fee-41ce-bf68-9df3feddffc8-proxy-tls\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: 
\"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511573 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-trusted-ca\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511670 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511707 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c6e0341-e5cb-4912-b3fb-8caedc0d4e10-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dtwcs\" (UID: \"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511744 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/884731f6-3835-46f5-a86c-f4ae664d255d-certs\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511787 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d726b70-e168-4d44-954a-c9a3c8b0db5c-metrics-tls\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511836 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7pv\" (UniqueName: \"kubernetes.io/projected/01cb6b91-c961-4f3e-9d36-c839305c7a80-kube-api-access-7p7pv\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511915 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.511976 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512008 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/827061b1-4659-4457-b74f-24d85f0bf010-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-config\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512074 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-etcd-client\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-secret-volume\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512157 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-mountpoint-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512179 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-serving-cert\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512320 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-service-ca-bundle\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-client-ca\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512457 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cf76d74-f1bd-446f-90fe-2006ac188804-service-ca-bundle\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512511 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d94l2\" (UniqueName: \"kubernetes.io/projected/50879169-a61b-431b-ada7-e06cca5b64bf-kube-api-access-d94l2\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512551 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d726b70-e168-4d44-954a-c9a3c8b0db5c-config-volume\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.514462 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-registration-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.514586 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.514696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-mountpoint-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.516502 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.517587 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3e6058a-1d79-4bb5-a514-0c05c6185279-auth-proxy-config\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.518005 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.017976614 +0000 UTC m=+194.181188465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.518920 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-config\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.519029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-socket-dir\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.520054 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f195d176-af0f-4048-a981-358f0240f6cd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.521058 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-policies\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.522227 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.522239 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-serving-cert\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.522376 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.523024 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.524386 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e077b561-62da-4d5b-b7a2-faf0e03f46b1-plugins-dir\") pod \"csi-hostpathplugin-znwnc\" 
(UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.525434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.525831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0878bdef-ff38-4bfa-a4e3-4d656afd474f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.531399 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.531774 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90352d66-9c07-441e-a71f-ee3281a66b5b-srv-cert\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.512514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0878bdef-ff38-4bfa-a4e3-4d656afd474f-serving-cert\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.532227 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/174316ed-1085-4cbe-8ec9-406c566e914d-node-pullsecrets\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.532882 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4dfecfec-c904-49cb-86fe-83ad121c6a68-tmpfs\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.533745 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-dir\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.533917 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtm5c\" (UniqueName: \"kubernetes.io/projected/f7788c62-1972-43db-a6db-d0bce9c415a6-kube-api-access-jtm5c\") pod \"ingress-canary-w5gbr\" (UID: \"f7788c62-1972-43db-a6db-d0bce9c415a6\") " pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.533955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/874a0d99-444c-45f4-9bfa-ad8f5c469afc-signing-cabundle\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.533985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-audit\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.534019 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9l8j\" (UniqueName: \"kubernetes.io/projected/884731f6-3835-46f5-a86c-f4ae664d255d-kube-api-access-h9l8j\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.535299 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-config\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.535778 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f4582d-9706-4c08-852a-c6d6ac3694a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.534950 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-images\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.536440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.536461 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c6e0341-e5cb-4912-b3fb-8caedc0d4e10-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dtwcs\" (UID: \"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.536689 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/827061b1-4659-4457-b74f-24d85f0bf010-config\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.537986 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-audit\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.539067 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.539936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90352d66-9c07-441e-a71f-ee3281a66b5b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.540511 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f4582d-9706-4c08-852a-c6d6ac3694a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.542026 4786 request.go:700] Waited for 1.000083583s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.542753 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-config\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.542933 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm26s\" (UniqueName: \"kubernetes.io/projected/a0f4582d-9706-4c08-852a-c6d6ac3694a7-kube-api-access-mm26s\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.543450 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-trusted-ca\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.544167 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.544613 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-client-ca\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 
crc kubenswrapper[4786]: I0313 15:06:03.545391 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01cb6b91-c961-4f3e-9d36-c839305c7a80-srv-cert\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.545799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b3e6058a-1d79-4bb5-a514-0c05c6185279-machine-approver-tls\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.546090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.547745 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-secret-volume\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.548415 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.548576 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01cb6b91-c961-4f3e-9d36-c839305c7a80-profile-collector-cert\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.549012 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.549068 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.549192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-serving-cert\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.549763 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/174316ed-1085-4cbe-8ec9-406c566e914d-etcd-client\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.551138 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.552562 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/174316ed-1085-4cbe-8ec9-406c566e914d-image-import-ca\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.552774 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.554764 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-config\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.554790 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/827061b1-4659-4457-b74f-24d85f0bf010-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 
15:06:03.556370 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d36f11ac-fb3e-4577-ba62-127115e6cc87-metrics-tls\") pod \"dns-operator-744455d44c-gm72l\" (UID: \"d36f11ac-fb3e-4577-ba62-127115e6cc87\") " pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.558156 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.562718 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.583116 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.597079 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6812c7a2-d36e-434d-90ca-079fc0a3390d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5kfqf\" (UID: \"6812c7a2-d36e-434d-90ca-079fc0a3390d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.602369 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.622710 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.626398 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29df0f63-b9e8-4694-b587-539d1fe80658-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vxx8z\" (UID: \"29df0f63-b9e8-4694-b587-539d1fe80658\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.642402 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.645141 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.645305 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.145279068 +0000 UTC m=+194.308490889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.645891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.646391 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.146354332 +0000 UTC m=+194.309566193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.663133 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.672224 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-default-certificate\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.682192 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.688633 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-stats-auth\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.703334 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.711680 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8cf76d74-f1bd-446f-90fe-2006ac188804-metrics-certs\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.724464 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.742758 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.745615 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cf76d74-f1bd-446f-90fe-2006ac188804-service-ca-bundle\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.748058 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.748412 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.248377229 +0000 UTC m=+194.411589080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.749041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.749556 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.249456254 +0000 UTC m=+194.412668105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.762217 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.781822 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.790717 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90352d66-9c07-441e-a71f-ee3281a66b5b-srv-cert\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.802077 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.832144 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.843391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.844311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.850577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.851441 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.351420519 +0000 UTC m=+194.514632330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.852380 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.854024 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.855194 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.355162993 +0000 UTC m=+194.518374844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.861811 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.883246 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.902171 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.909498 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f195d176-af0f-4048-a981-358f0240f6cd-images\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.923396 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.942610 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.951490 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/4dfecfec-c904-49cb-86fe-83ad121c6a68-webhook-cert\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.954037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.954245 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.454212384 +0000 UTC m=+194.617424235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.954918 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.955340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4dfecfec-c904-49cb-86fe-83ad121c6a68-apiservice-cert\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:03 crc kubenswrapper[4786]: E0313 15:06:03.955531 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.455510673 +0000 UTC m=+194.618722494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.963893 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.983087 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 15:06:03 crc kubenswrapper[4786]: I0313 15:06:03.989959 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50879169-a61b-431b-ada7-e06cca5b64bf-serving-cert\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.002316 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.008056 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50879169-a61b-431b-ada7-e06cca5b64bf-config\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.023111 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.043264 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.056095 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.056404 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.556367155 +0000 UTC m=+194.719579006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.057179 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.057781 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.557761466 +0000 UTC m=+194.720973287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.063071 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.068008 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f195d176-af0f-4048-a981-358f0240f6cd-proxy-tls\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.082351 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.102602 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.123812 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.142836 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.159001 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.159130 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.659098278 +0000 UTC m=+194.822310129 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.159324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.159776 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.659759173 +0000 UTC m=+194.822971024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.161707 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.169400 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/874a0d99-444c-45f4-9bfa-ad8f5c469afc-signing-key\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.182881 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.184887 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/874a0d99-444c-45f4-9bfa-ad8f5c469afc-signing-cabundle\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.202678 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.224173 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 
15:06:04.243135 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.252929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-config-volume\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.260734 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.260952 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.760927571 +0000 UTC m=+194.924139422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.261400 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.261842 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.761823871 +0000 UTC m=+194.925035712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.262830 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.283633 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.292053 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/103f0ba8-4fee-41ce-bf68-9df3feddffc8-proxy-tls\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.302676 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.303906 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d726b70-e168-4d44-954a-c9a3c8b0db5c-config-volume\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.322177 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 15:06:04 crc 
kubenswrapper[4786]: I0313 15:06:04.342404 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.350108 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d726b70-e168-4d44-954a-c9a3c8b0db5c-metrics-tls\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.362912 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.362981 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.363147 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.863122592 +0000 UTC m=+195.026334423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.363989 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.364362 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.86435104 +0000 UTC m=+195.027562851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.382146 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.392780 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/884731f6-3835-46f5-a86c-f4ae664d255d-certs\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.402497 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.409845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/884731f6-3835-46f5-a86c-f4ae664d255d-node-bootstrap-token\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.435911 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvp7q\" (UniqueName: \"kubernetes.io/projected/6aa22da4-e413-4089-9e4a-ef7a8f324435-kube-api-access-dvp7q\") pod \"openshift-apiserver-operator-796bbdcf4f-x75ps\" (UID: 
\"6aa22da4-e413-4089-9e4a-ef7a8f324435\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.441548 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.464802 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.464992 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.964959225 +0000 UTC m=+195.128171026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.465127 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.465673 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:04.9656621 +0000 UTC m=+195.128873911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.481875 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.485244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7788c62-1972-43db-a6db-d0bce9c415a6-cert\") pod \"ingress-canary-w5gbr\" (UID: \"f7788c62-1972-43db-a6db-d0bce9c415a6\") " pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.487041 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/233ce256-9724-4f50-a45b-a58aaaee2ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.502548 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.522453 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.542785 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.560927 
4786 request.go:700] Waited for 1.879728914s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.562730 4786 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.566414 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.566583 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.066543152 +0000 UTC m=+195.229754963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.566680 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.567064 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.067053574 +0000 UTC m=+195.230265385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.582327 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.618315 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fz8c\" (UniqueName: \"kubernetes.io/projected/3a2643aa-9180-475a-af73-8e7b311cc77c-kube-api-access-2fz8c\") pod \"controller-manager-879f6c89f-7c65p\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.640542 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhr4z\" (UniqueName: \"kubernetes.io/projected/233ce256-9724-4f50-a45b-a58aaaee2ec8-kube-api-access-nhr4z\") pod \"ingress-operator-5b745b69d9-hcn8x\" (UID: \"233ce256-9724-4f50-a45b-a58aaaee2ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.662596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhb9d\" (UniqueName: \"kubernetes.io/projected/963ffcec-87b4-480a-81d1-e19ecb7edb12-kube-api-access-nhb9d\") pod \"apiserver-7bbb656c7d-bg6tw\" (UID: \"963ffcec-87b4-480a-81d1-e19ecb7edb12\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.667698 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.667913 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.167883404 +0000 UTC m=+195.331095215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.668405 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.669495 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.16947958 +0000 UTC m=+195.332691411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.678513 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5cs\" (UniqueName: \"kubernetes.io/projected/e50609e3-d329-4c22-9be4-4100f122508d-kube-api-access-lj5cs\") pod \"openshift-config-operator-7777fb866f-5cbrn\" (UID: \"e50609e3-d329-4c22-9be4-4100f122508d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.680311 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.707409 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.716987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpj5\" (UniqueName: \"kubernetes.io/projected/7babe28d-a681-4e19-ba38-150d554380a2-kube-api-access-7fpj5\") pod \"downloads-7954f5f757-2bj4z\" (UID: \"7babe28d-a681-4e19-ba38-150d554380a2\") " pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.725743 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.748212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-bound-sa-token\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.757360 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a55f76fb-e225-458d-aab9-03e376b09de9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6pqv7\" (UID: \"a55f76fb-e225-458d-aab9-03e376b09de9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.769843 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.770889 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.269993584 +0000 UTC m=+195.433205415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.771371 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.771673 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.271659611 +0000 UTC m=+195.434871422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.779096 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szt5s\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-kube-api-access-szt5s\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.798512 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpvc\" (UniqueName: \"kubernetes.io/projected/0c6ae29c-e743-4193-bce1-22b4c5732f45-kube-api-access-4hpvc\") pod \"console-f9d7485db-2tqr8\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") " pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.821737 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skxgq\" (UniqueName: \"kubernetes.io/projected/86fb29bf-af7d-48e1-b4d5-8fbef5c3387b-kube-api-access-skxgq\") pod \"etcd-operator-b45778765-w5kxr\" (UID: \"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.835493 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzp9\" (UniqueName: \"kubernetes.io/projected/4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d-kube-api-access-7vzp9\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-gk464\" (UID: \"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.859538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8qb\" (UniqueName: \"kubernetes.io/projected/5d72361e-212d-4e7d-a4d3-b10141badfc3-kube-api-access-gh8qb\") pod \"cluster-samples-operator-665b6dd947-9tj94\" (UID: \"5d72361e-212d-4e7d-a4d3-b10141badfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.871322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.872125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.872455 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.372432111 +0000 UTC m=+195.535643922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.872516 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.872828 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.372816309 +0000 UTC m=+195.536028120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.877268 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h78vv\" (UniqueName: \"kubernetes.io/projected/90352d66-9c07-441e-a71f-ee3281a66b5b-kube-api-access-h78vv\") pod \"olm-operator-6b444d44fb-vl4kp\" (UID: \"90352d66-9c07-441e-a71f-ee3281a66b5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.894910 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6p9p\" (UniqueName: \"kubernetes.io/projected/d36f11ac-fb3e-4577-ba62-127115e6cc87-kube-api-access-s6p9p\") pod \"dns-operator-744455d44c-gm72l\" (UID: \"d36f11ac-fb3e-4577-ba62-127115e6cc87\") " pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.924207 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c65p"] Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.924574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2rz\" (UniqueName: \"kubernetes.io/projected/3d726b70-e168-4d44-954a-c9a3c8b0db5c-kube-api-access-rr2rz\") pod \"dns-default-996mk\" (UID: \"3d726b70-e168-4d44-954a-c9a3c8b0db5c\") " pod="openshift-dns/dns-default-996mk" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.927321 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.927321 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.932791 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.933601 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.944694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdwp5\" (UniqueName: \"kubernetes.io/projected/3c6e0341-e5cb-4912-b3fb-8caedc0d4e10-kube-api-access-fdwp5\") pod \"control-plane-machine-set-operator-78cbb6b69f-dtwcs\" (UID: \"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.956590 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.960489 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8vph\" (UniqueName: \"kubernetes.io/projected/103f0ba8-4fee-41ce-bf68-9df3feddffc8-kube-api-access-h8vph\") pod \"machine-config-controller-84d6567774-fsdpw\" (UID: \"103f0ba8-4fee-41ce-bf68-9df3feddffc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.974438 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.974770 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.474739384 +0000 UTC m=+195.637951195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.975088 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:04 crc kubenswrapper[4786]: E0313 15:06:04.975395 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.475381789 +0000 UTC m=+195.638593600 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.983616 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b2be232-e8e1-4fa6-a574-7c5dfda4f386-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x7kx2\" (UID: \"1b2be232-e8e1-4fa6-a574-7c5dfda4f386\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.991927 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" Mar 13 15:06:04 crc kubenswrapper[4786]: I0313 15:06:04.997898 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkds\" (UniqueName: \"kubernetes.io/projected/1eda6a73-a8cf-406d-ab33-394ec1982f4a-kube-api-access-nvkds\") pod \"auto-csr-approver-29556906-bz7fd\" (UID: \"1eda6a73-a8cf-406d-ab33-394ec1982f4a\") " pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.000607 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.012220 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.014213 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.023623 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-996mk" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.039655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vc4n\" (UniqueName: \"kubernetes.io/projected/8cf76d74-f1bd-446f-90fe-2006ac188804-kube-api-access-4vc4n\") pod \"router-default-5444994796-z752l\" (UID: \"8cf76d74-f1bd-446f-90fe-2006ac188804\") " pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.068597 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w5kxr"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.068600 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgt6v\" (UniqueName: \"kubernetes.io/projected/111d054e-76a1-4783-aaa4-a05fd3250b1a-kube-api-access-jgt6v\") pod \"migrator-59844c95c7-rm4xd\" (UID: \"111d054e-76a1-4783-aaa4-a05fd3250b1a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.076573 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.076702 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.576684 +0000 UTC m=+195.739895811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.076946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.077253 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.577246493 +0000 UTC m=+195.740458304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.080540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6jzd\" (UniqueName: \"kubernetes.io/projected/55e064ca-7aad-4fdd-8270-b43a93b9ff3c-kube-api-access-g6jzd\") pod \"console-operator-58897d9998-7cf4m\" (UID: \"55e064ca-7aad-4fdd-8270-b43a93b9ff3c\") " pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.095368 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j28q\" (UniqueName: \"kubernetes.io/projected/174316ed-1085-4cbe-8ec9-406c566e914d-kube-api-access-6j28q\") pod \"apiserver-76f77b778f-5m4w2\" (UID: \"174316ed-1085-4cbe-8ec9-406c566e914d\") " pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.114839 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7plzm\" (UniqueName: \"kubernetes.io/projected/6812c7a2-d36e-434d-90ca-079fc0a3390d-kube-api-access-7plzm\") pod \"package-server-manager-789f6589d5-5kfqf\" (UID: \"6812c7a2-d36e-434d-90ca-079fc0a3390d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.128825 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.137148 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7t4\" (UniqueName: \"kubernetes.io/projected/b3e6058a-1d79-4bb5-a514-0c05c6185279-kube-api-access-cl7t4\") pod \"machine-approver-56656f9798-sdzbq\" (UID: \"b3e6058a-1d79-4bb5-a514-0c05c6185279\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.144127 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.144337 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.158778 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.172084 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.177832 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.178294 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.678279438 +0000 UTC m=+195.841491249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.180776 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.181244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkfd\" (UniqueName: \"kubernetes.io/projected/f195d176-af0f-4048-a981-358f0240f6cd-kube-api-access-lzkfd\") pod \"machine-config-operator-74547568cd-ph2dx\" (UID: \"f195d176-af0f-4048-a981-358f0240f6cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.194312 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bl2\" (UniqueName: \"kubernetes.io/projected/4dfecfec-c904-49cb-86fe-83ad121c6a68-kube-api-access-f7bl2\") pod \"packageserver-d55dfcdfc-mdvv7\" (UID: \"4dfecfec-c904-49cb-86fe-83ad121c6a68\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.196040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.211309 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.217705 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.218479 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgns\" (UniqueName: \"kubernetes.io/projected/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-kube-api-access-jcgns\") pod \"collect-profiles-29556900-d6kmv\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.236429 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqsl\" (UniqueName: \"kubernetes.io/projected/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-kube-api-access-gfqsl\") pod \"route-controller-manager-6576b87f9c-d5wsg\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.248349 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.262095 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pxvf\" (UniqueName: \"kubernetes.io/projected/e077b561-62da-4d5b-b7a2-faf0e03f46b1-kube-api-access-2pxvf\") pod \"csi-hostpathplugin-znwnc\" (UID: \"e077b561-62da-4d5b-b7a2-faf0e03f46b1\") " pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.263928 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.278167 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxxl\" (UniqueName: \"kubernetes.io/projected/874a0d99-444c-45f4-9bfa-ad8f5c469afc-kube-api-access-qsxxl\") pod \"service-ca-9c57cc56f-t98jm\" (UID: \"874a0d99-444c-45f4-9bfa-ad8f5c469afc\") " pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.278790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.279192 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.7791707 +0000 UTC m=+195.942382601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.289614 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.295230 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fdvk\" (UniqueName: \"kubernetes.io/projected/0878bdef-ff38-4bfa-a4e3-4d656afd474f-kube-api-access-7fdvk\") pod \"authentication-operator-69f744f599-4rkfk\" (UID: \"0878bdef-ff38-4bfa-a4e3-4d656afd474f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.296037 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.305229 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.315748 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.337105 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.338441 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86cnz\" (UniqueName: \"kubernetes.io/projected/b960566c-bcc9-41ff-9fbc-c132f0e4d6e5-kube-api-access-86cnz\") pod \"machine-api-operator-5694c8668f-2c944\" (UID: \"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.344599 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.353128 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.359482 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txnxb\" (UniqueName: \"kubernetes.io/projected/b2d9a397-cea7-4db5-a5d0-a657bf571e1f-kube-api-access-txnxb\") pod \"cluster-image-registry-operator-dc59b4c8b-tlwm2\" (UID: \"b2d9a397-cea7-4db5-a5d0-a657bf571e1f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.366592 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.372388 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.375425 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/827061b1-4659-4457-b74f-24d85f0bf010-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qcs2g\" (UID: \"827061b1-4659-4457-b74f-24d85f0bf010\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.379474 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.379923 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.879907569 +0000 UTC m=+196.043119380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.401973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjdx\" (UniqueName: \"kubernetes.io/projected/3303beb2-619f-4973-b3c9-1f75a6e4e88c-kube-api-access-lrjdx\") pod \"oauth-openshift-558db77b4-b7krw\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.420307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7pv\" (UniqueName: \"kubernetes.io/projected/01cb6b91-c961-4f3e-9d36-c839305c7a80-kube-api-access-7p7pv\") pod \"catalog-operator-68c6474976-bm59p\" (UID: \"01cb6b91-c961-4f3e-9d36-c839305c7a80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.434170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d94l2\" (UniqueName: \"kubernetes.io/projected/50879169-a61b-431b-ada7-e06cca5b64bf-kube-api-access-d94l2\") pod \"service-ca-operator-777779d784-bhts5\" (UID: \"50879169-a61b-431b-ada7-e06cca5b64bf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.438151 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.451448 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.454195 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm26s\" (UniqueName: \"kubernetes.io/projected/a0f4582d-9706-4c08-852a-c6d6ac3694a7-kube-api-access-mm26s\") pod \"kube-storage-version-migrator-operator-b67b599dd-7f2vq\" (UID: \"a0f4582d-9706-4c08-852a-c6d6ac3694a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.470541 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fs9f\" (UniqueName: \"kubernetes.io/projected/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-kube-api-access-4fs9f\") pod \"marketplace-operator-79b997595-wcljz\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.474074 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2w6\" (UniqueName: \"kubernetes.io/projected/29df0f63-b9e8-4694-b587-539d1fe80658-kube-api-access-hz2w6\") pod \"multus-admission-controller-857f4d67dd-vxx8z\" (UID: \"29df0f63-b9e8-4694-b587-539d1fe80658\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.481109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.481370 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:05.981359203 +0000 UTC m=+196.144571004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.484835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9l8j\" (UniqueName: \"kubernetes.io/projected/884731f6-3835-46f5-a86c-f4ae664d255d-kube-api-access-h9l8j\") pod \"machine-config-server-4h7qh\" (UID: \"884731f6-3835-46f5-a86c-f4ae664d255d\") " pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.489848 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.492687 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.501768 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtm5c\" (UniqueName: \"kubernetes.io/projected/f7788c62-1972-43db-a6db-d0bce9c415a6-kube-api-access-jtm5c\") pod \"ingress-canary-w5gbr\" (UID: \"f7788c62-1972-43db-a6db-d0bce9c415a6\") " pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.521466 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" event={"ID":"3a2643aa-9180-475a-af73-8e7b311cc77c","Type":"ContainerStarted","Data":"89ebedb800150b8d7a01cd7fbc6fbb9189593ba56a18165425a70e9461d0347e"} Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.523801 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" event={"ID":"963ffcec-87b4-480a-81d1-e19ecb7edb12","Type":"ContainerStarted","Data":"635ebf84f141897aa07021ec314eb28fec8988174a7422faa2e153116aada656"} Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.530574 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.542201 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.556802 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.581545 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.582560 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.582782 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.082756537 +0000 UTC m=+196.245968378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.629942 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4h7qh" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.632079 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.637905 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w5gbr" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.658540 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.683923 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.684220 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.184208601 +0000 UTC m=+196.347420412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.693196 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.758531 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.784838 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.785218 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.285203476 +0000 UTC m=+196.448415287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.791482 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.804249 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.827759 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.863738 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.868274 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn"] Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.886909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.887395 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.387373246 +0000 UTC m=+196.550585057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.990131 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.991319 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.491276726 +0000 UTC m=+196.654488547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:05 crc kubenswrapper[4786]: I0313 15:06:05.991459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:05 crc kubenswrapper[4786]: E0313 15:06:05.991825 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.491809639 +0000 UTC m=+196.655021450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: W0313 15:06:06.013993 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e6058a_1d79_4bb5_a514_0c05c6185279.slice/crio-0332993e3c5df46a75477dfade35c69e64e91a886b1b984525f830a05c9b871e WatchSource:0}: Error finding container 0332993e3c5df46a75477dfade35c69e64e91a886b1b984525f830a05c9b871e: Status 404 returned error can't find the container with id 0332993e3c5df46a75477dfade35c69e64e91a886b1b984525f830a05c9b871e Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.094673 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.095055 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.595037859 +0000 UTC m=+196.758249670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.160979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2"] Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.202035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.202338 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.702324058 +0000 UTC m=+196.865535869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.303037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.303235 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.803210175 +0000 UTC m=+196.966421986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.303313 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.303734 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.803726279 +0000 UTC m=+196.966938090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.404211 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.404981 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:06.904960744 +0000 UTC m=+197.068172555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.505969 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.506271 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.006259422 +0000 UTC m=+197.169471233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.532092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" event={"ID":"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d","Type":"ContainerStarted","Data":"b1062af893d266b0452bb80fc37d00d97a8427628fe5ced77a472260b43d9d04"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.534041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" event={"ID":"b3e6058a-1d79-4bb5-a514-0c05c6185279","Type":"ContainerStarted","Data":"0332993e3c5df46a75477dfade35c69e64e91a886b1b984525f830a05c9b871e"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.537484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" event={"ID":"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b","Type":"ContainerStarted","Data":"3fb31745ca923033edc624bcd810fe2d5052efdd405b02dd2c9ba86a03268c6d"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.539531 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" event={"ID":"6aa22da4-e413-4089-9e4a-ef7a8f324435","Type":"ContainerStarted","Data":"e5c39c753ed65ea351fedc827fca177383581db2c3881110d8de136adec02b58"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.539571 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" event={"ID":"6aa22da4-e413-4089-9e4a-ef7a8f324435","Type":"ContainerStarted","Data":"9b8b870e4d556823f8c2dffab7cac560b744129ce5416959370bac6e409f8d1e"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.542632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4h7qh" event={"ID":"884731f6-3835-46f5-a86c-f4ae664d255d","Type":"ContainerStarted","Data":"c8215c40513d9e5e367f62e2a75a56ea4e7d92c97c854a126d8b0d8d1e49f262"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.543311 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" event={"ID":"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10","Type":"ContainerStarted","Data":"a7c53095ec81c2cf10d2fffe0ce5fe9d582543a5abc918fdd42e84b81364ab58"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.544519 4786 generic.go:334] "Generic (PLEG): container finished" podID="963ffcec-87b4-480a-81d1-e19ecb7edb12" containerID="62e31f0fbe0ce7432bea348113d745f817b02591575ffceb2fbc3b2fef86ea14" exitCode=0 Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.544561 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" event={"ID":"963ffcec-87b4-480a-81d1-e19ecb7edb12","Type":"ContainerDied","Data":"62e31f0fbe0ce7432bea348113d745f817b02591575ffceb2fbc3b2fef86ea14"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.547006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" event={"ID":"b2d9a397-cea7-4db5-a5d0-a657bf571e1f","Type":"ContainerStarted","Data":"a54e5781ef95b5d09b4be4229cd18bff8fc970adf17ac76d46963800ac75c705"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.606809 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.607877 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.107847427 +0000 UTC m=+197.271059238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622728 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622782 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" event={"ID":"5d72361e-212d-4e7d-a4d3-b10141badfc3","Type":"ContainerStarted","Data":"060c04f25bfa9ea91e3344a7b7230598f2578e1fe8969e7ff0a27d7df1241fee"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622804 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" event={"ID":"233ce256-9724-4f50-a45b-a58aaaee2ec8","Type":"ContainerStarted","Data":"1c67f4f74ff147fa12fcd36daf319f8fed1e82f836310810798b616572cdeafa"} Mar 13 
15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622817 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" event={"ID":"e50609e3-d329-4c22-9be4-4100f122508d","Type":"ContainerStarted","Data":"4f7d312aa98d12def82cc42ca8475af0a0bc3135aa622cee05031ff4d28292f8"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" event={"ID":"3a2643aa-9180-475a-af73-8e7b311cc77c","Type":"ContainerStarted","Data":"068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622840 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" event={"ID":"90352d66-9c07-441e-a71f-ee3281a66b5b","Type":"ContainerStarted","Data":"903ef74c16d36be883163e535170fc06fa6005a2678911e34933e1d90c461ffb"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622866 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" event={"ID":"1b2be232-e8e1-4fa6-a574-7c5dfda4f386","Type":"ContainerStarted","Data":"2d17d6c2393b6a63df1a4370ac1878e74bcd82d5596b25634aea0e7256d79ff7"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622881 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" event={"ID":"a55f76fb-e225-458d-aab9-03e376b09de9","Type":"ContainerStarted","Data":"e2d3cf911c4dfb56dad2a7b7fa485b41a569bcea06cbdac73db8047d1982ebdf"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.622893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z752l" 
event={"ID":"8cf76d74-f1bd-446f-90fe-2006ac188804","Type":"ContainerStarted","Data":"3b948ba0bd4bcff3cd9eb75ff66385b814e91c66ef000f374d1ddbb5f0b3e2aa"} Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.701985 4786 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7c65p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.702317 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.712755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.713045 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.21303318 +0000 UTC m=+197.376244991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.813800 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.814491 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.314470381 +0000 UTC m=+197.477682192 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:06 crc kubenswrapper[4786]: I0313 15:06:06.915522 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:06 crc kubenswrapper[4786]: E0313 15:06:06.915902 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.415866661 +0000 UTC m=+197.579078472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.017469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.017629 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.51760491 +0000 UTC m=+197.680816721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.018271 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.018556 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.518543236 +0000 UTC m=+197.681755047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.019601 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t98jm"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.023122 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.033796 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p"] Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.035147 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874a0d99_444c_45f4_9bfa_ad8f5c469afc.slice/crio-c064f401e5954fee87411755150ec0b4bc1ca670f7dc3b40718885ce1a8d3fad WatchSource:0}: Error finding container c064f401e5954fee87411755150ec0b4bc1ca670f7dc3b40718885ce1a8d3fad: Status 404 returned error can't find the container with id c064f401e5954fee87411755150ec0b4bc1ca670f7dc3b40718885ce1a8d3fad Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.060940 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2tqr8"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.088201 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-7cf4m"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 
15:06:07.101156 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.120044 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.120490 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.620471721 +0000 UTC m=+197.783683532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.127507 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.137294 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4rkfk"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.154014 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" podStartSLOduration=131.153997 podStartE2EDuration="2m11.153997s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.152258783 +0000 UTC m=+197.315470604" watchObservedRunningTime="2026-03-13 15:06:07.153997 +0000 UTC m=+197.317208811" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.191228 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-996mk"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.200664 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.212762 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.217657 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2bj4z"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.221176 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.221606 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.721593153 +0000 UTC m=+197.884804964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.224550 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.229897 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5m4w2"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.238256 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw"] Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.239329 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e064ca_7aad_4fdd_8270_b43a93b9ff3c.slice/crio-419171bb11b051bfb28c24cabb0379741100e00fac6f0985f5f26213f35c13a3 WatchSource:0}: Error finding container 419171bb11b051bfb28c24cabb0379741100e00fac6f0985f5f26213f35c13a3: Status 404 returned error can't find the container with id 419171bb11b051bfb28c24cabb0379741100e00fac6f0985f5f26213f35c13a3 Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.244380 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-b7krw"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.245580 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-x75ps" podStartSLOduration=131.245556203 
podStartE2EDuration="2m11.245556203s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.233002023 +0000 UTC m=+197.396213834" watchObservedRunningTime="2026-03-13 15:06:07.245556203 +0000 UTC m=+197.408768014" Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.257501 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf195d176_af0f_4048_a981_358f0240f6cd.slice/crio-d0b3ceaa7b0d9845fa88ec854e9b48e6f30cded762e3ebaea0636ab0b4781ee9 WatchSource:0}: Error finding container d0b3ceaa7b0d9845fa88ec854e9b48e6f30cded762e3ebaea0636ab0b4781ee9: Status 404 returned error can't find the container with id d0b3ceaa7b0d9845fa88ec854e9b48e6f30cded762e3ebaea0636ab0b4781ee9 Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.260946 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0878bdef_ff38_4bfa_a4e3_4d656afd474f.slice/crio-f1734235102262e1e338b3b0ee705d4b44a3512c5e4dec75544cdc336930c80c WatchSource:0}: Error finding container f1734235102262e1e338b3b0ee705d4b44a3512c5e4dec75544cdc336930c80c: Status 404 returned error can't find the container with id f1734235102262e1e338b3b0ee705d4b44a3512c5e4dec75544cdc336930c80c Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.265452 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d726b70_e168_4d44_954a_c9a3c8b0db5c.slice/crio-ee42d2c8162cb43108666a1cd52836e43f0c689e0aa25a2947b1f6ee5c3bd0ed WatchSource:0}: Error finding container ee42d2c8162cb43108666a1cd52836e43f0c689e0aa25a2947b1f6ee5c3bd0ed: Status 404 returned error can't find the container with id ee42d2c8162cb43108666a1cd52836e43f0c689e0aa25a2947b1f6ee5c3bd0ed Mar 13 15:06:07 crc 
kubenswrapper[4786]: W0313 15:06:07.268067 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dfecfec_c904_49cb_86fe_83ad121c6a68.slice/crio-ea59e4c9a10eb5173888d064cb39002e218c3f1d800ebb61d8d9019c1b403aa0 WatchSource:0}: Error finding container ea59e4c9a10eb5173888d064cb39002e218c3f1d800ebb61d8d9019c1b403aa0: Status 404 returned error can't find the container with id ea59e4c9a10eb5173888d064cb39002e218c3f1d800ebb61d8d9019c1b403aa0 Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.268191 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.270459 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd"] Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.275245 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6812c7a2_d36e_434d_90ca_079fc0a3390d.slice/crio-ee2251db122f7d11ef883d8ab15b26dd60b9c588d411f0b231cf89f0219fab4e WatchSource:0}: Error finding container ee2251db122f7d11ef883d8ab15b26dd60b9c588d411f0b231cf89f0219fab4e: Status 404 returned error can't find the container with id ee2251db122f7d11ef883d8ab15b26dd60b9c588d411f0b231cf89f0219fab4e Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.276073 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7babe28d_a681_4e19_ba38_150d554380a2.slice/crio-9b9bce9a2a66b22474bc40345e18f704b7d1a457479afefd726c116365e3be7f WatchSource:0}: Error finding container 9b9bce9a2a66b22474bc40345e18f704b7d1a457479afefd726c116365e3be7f: Status 404 returned error can't find the container with id 9b9bce9a2a66b22474bc40345e18f704b7d1a457479afefd726c116365e3be7f Mar 13 
15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.324051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.324336 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.824321679 +0000 UTC m=+197.987533490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.338904 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ed915e_9dfb_4a63_87c6_f21160cdca5f.slice/crio-52a9f1544e8f915b70f5d279a06fe2143d55d88119db614689d9ac85cc144d2c WatchSource:0}: Error finding container 52a9f1544e8f915b70f5d279a06fe2143d55d88119db614689d9ac85cc144d2c: Status 404 returned error can't find the container with id 52a9f1544e8f915b70f5d279a06fe2143d55d88119db614689d9ac85cc144d2c Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.365649 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2c944"] Mar 13 15:06:07 crc kubenswrapper[4786]: 
I0313 15:06:07.370569 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bhts5"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.373987 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w5gbr"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.378226 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wcljz"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.399554 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-znwnc"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.425297 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.425572 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:07.925559935 +0000 UTC m=+198.088771746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.444399 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2927b9_2e8a_4a34_90e4_932c1f6115c3.slice/crio-65d28073b6acd7b97848fb63e3b4f41f8c6f63c8591254bc37ecc5bf20e5f0a2 WatchSource:0}: Error finding container 65d28073b6acd7b97848fb63e3b4f41f8c6f63c8591254bc37ecc5bf20e5f0a2: Status 404 returned error can't find the container with id 65d28073b6acd7b97848fb63e3b4f41f8c6f63c8591254bc37ecc5bf20e5f0a2 Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.445306 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50879169_a61b_431b_ada7_e06cca5b64bf.slice/crio-6fe6934ded9087da8486bc12ea9c28d0f033be8f1144e4355de2610f84cab2ba WatchSource:0}: Error finding container 6fe6934ded9087da8486bc12ea9c28d0f033be8f1144e4355de2610f84cab2ba: Status 404 returned error can't find the container with id 6fe6934ded9087da8486bc12ea9c28d0f033be8f1144e4355de2610f84cab2ba Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.448702 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb960566c_bcc9_41ff_9fbc_c132f0e4d6e5.slice/crio-466cec4fc04888cdcbb4d5e3309dc5ef242486784054f1ca48d2d4e447783a3e WatchSource:0}: Error finding container 466cec4fc04888cdcbb4d5e3309dc5ef242486784054f1ca48d2d4e447783a3e: Status 404 returned error can't find the container 
with id 466cec4fc04888cdcbb4d5e3309dc5ef242486784054f1ca48d2d4e447783a3e Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.468212 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7788c62_1972_43db_a6db_d0bce9c415a6.slice/crio-ab99588b534c60005e5cb27587e4963bae1a483d745d79d97279342a33289afe WatchSource:0}: Error finding container ab99588b534c60005e5cb27587e4963bae1a483d745d79d97279342a33289afe: Status 404 returned error can't find the container with id ab99588b534c60005e5cb27587e4963bae1a483d745d79d97279342a33289afe Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.499222 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vxx8z"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.516928 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-bz7fd"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.528398 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.530815 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.029669719 +0000 UTC m=+198.192881530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.530999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.531316 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.031305613 +0000 UTC m=+198.194517424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: W0313 15:06:07.566092 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eda6a73_a8cf_406d_ab33_394ec1982f4a.slice/crio-3dcb19c3a1321ecb5f357d620e857b85ba43d8b1489fb0a88ed50ba265e8284f WatchSource:0}: Error finding container 3dcb19c3a1321ecb5f357d620e857b85ba43d8b1489fb0a88ed50ba265e8284f: Status 404 returned error can't find the container with id 3dcb19c3a1321ecb5f357d620e857b85ba43d8b1489fb0a88ed50ba265e8284f Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.568344 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gm72l"] Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.591452 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" event={"ID":"b2d9a397-cea7-4db5-a5d0-a657bf571e1f","Type":"ContainerStarted","Data":"bbfa89b5b21abc90ad05b121a66c58dce258190c636aab397ce8c17d705d9eba"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.594594 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.595099 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" 
event={"ID":"827061b1-4659-4457-b74f-24d85f0bf010","Type":"ContainerStarted","Data":"e82cbf023cb0590a5714dedb8f891386f7061174103c2fb300cfe74878a43e23"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.596530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" event={"ID":"a0f4582d-9706-4c08-852a-c6d6ac3694a7","Type":"ContainerStarted","Data":"766db5fc6f9c39ff1e9eb296a0a00936fd072b52bea7aa134b0354e7f624c77e"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.598826 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4h7qh" event={"ID":"884731f6-3835-46f5-a86c-f4ae664d255d","Type":"ContainerStarted","Data":"9d2f8ca98f93a84b1e89b8d59fd44ed8c5695cf40b9a26173a5cee88275951fb"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.608637 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" event={"ID":"963ffcec-87b4-480a-81d1-e19ecb7edb12","Type":"ContainerStarted","Data":"f64dd6acb9aa07c7e5371ebd0900b15f7419957bece303ce755dcbcefba15ab4"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.634916 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-tlwm2" podStartSLOduration=131.634896863 podStartE2EDuration="2m11.634896863s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.611019285 +0000 UTC m=+197.774231086" watchObservedRunningTime="2026-03-13 15:06:07.634896863 +0000 UTC m=+197.798108674" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.636124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" event={"ID":"01cb6b91-c961-4f3e-9d36-c839305c7a80","Type":"ContainerStarted","Data":"5ee75df072eb5d7649bdb2da0dd0df443dfd1459620ef30978d7c6642d6ccd9d"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.636237 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.636648 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.13663458 +0000 UTC m=+198.299846391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.636881 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4h7qh" podStartSLOduration=5.636872977 podStartE2EDuration="5.636872977s" podCreationTimestamp="2026-03-13 15:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.636301881 +0000 UTC m=+197.799513692" watchObservedRunningTime="2026-03-13 15:06:07.636872977 +0000 UTC m=+197.800084788" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.654294 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" event={"ID":"86fb29bf-af7d-48e1-b4d5-8fbef5c3387b","Type":"ContainerStarted","Data":"87eefb2082312c1da8a8336c1b622c4e9be86b8b79f8705125cdce3f15a19e90"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.659618 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" event={"ID":"103f0ba8-4fee-41ce-bf68-9df3feddffc8","Type":"ContainerStarted","Data":"4a97eb57158b42661698e97c1ac3202425a1fe15868afe22eb52ec0d8a3467b5"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.665253 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" 
event={"ID":"4dfecfec-c904-49cb-86fe-83ad121c6a68","Type":"ContainerStarted","Data":"ea59e4c9a10eb5173888d064cb39002e218c3f1d800ebb61d8d9019c1b403aa0"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.667247 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" event={"ID":"a55f76fb-e225-458d-aab9-03e376b09de9","Type":"ContainerStarted","Data":"97ded90d490c2348bc422408b1ea7128b8a849114f09186f7bbc2867c9fa2cf5"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.669791 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" event={"ID":"cc2927b9-2e8a-4a34-90e4-932c1f6115c3","Type":"ContainerStarted","Data":"65d28073b6acd7b97848fb63e3b4f41f8c6f63c8591254bc37ecc5bf20e5f0a2"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.676049 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" event={"ID":"3303beb2-619f-4973-b3c9-1f75a6e4e88c","Type":"ContainerStarted","Data":"45e9c57e28a98790760a0850f017d8374e2fcb5bf2bce35f53f0537ba307abab"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.685937 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" event={"ID":"e50609e3-d329-4c22-9be4-4100f122508d","Type":"ContainerStarted","Data":"8b4a858a4ed166152e68dfb62f406a5269aac8fa7d9aa3097f0a64054fc6fca3"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.689345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" event={"ID":"3c6e0341-e5cb-4912-b3fb-8caedc0d4e10","Type":"ContainerStarted","Data":"6de39beffb33e9a6d385b3bbe84d7cea1f4eb0ecc9d99e021c2fd34a52965fde"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.691891 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" event={"ID":"4dbaaf1d-f5e8-487f-b5b7-e9c57029d49d","Type":"ContainerStarted","Data":"e1f52cebad6c7c47bb28e8d4d9c2537ba8963905195df5c0e2579cdc9165c4d9"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.694222 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" event={"ID":"5d72361e-212d-4e7d-a4d3-b10141badfc3","Type":"ContainerStarted","Data":"e3f1bd3f31b5bc372ed6d9fa63a5824c13d63860056f78b519b96d749e1e4959"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.696210 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6pqv7" podStartSLOduration=131.696198136 podStartE2EDuration="2m11.696198136s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.695674962 +0000 UTC m=+197.858886793" watchObservedRunningTime="2026-03-13 15:06:07.696198136 +0000 UTC m=+197.859409947" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.697598 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-w5kxr" podStartSLOduration=131.697590594 podStartE2EDuration="2m11.697590594s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.674425015 +0000 UTC m=+197.837636826" watchObservedRunningTime="2026-03-13 15:06:07.697590594 +0000 UTC m=+197.860802405" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.705554 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z752l" 
event={"ID":"8cf76d74-f1bd-446f-90fe-2006ac188804","Type":"ContainerStarted","Data":"03c72429d10dce4be7bced7eb0cc125f489b11debe9056d4dd2be99cd806592e"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.709446 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dtwcs" podStartSLOduration=131.709432505 podStartE2EDuration="2m11.709432505s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.709390874 +0000 UTC m=+197.872602685" watchObservedRunningTime="2026-03-13 15:06:07.709432505 +0000 UTC m=+197.872644316" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.709578 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2tqr8" event={"ID":"0c6ae29c-e743-4193-bce1-22b4c5732f45","Type":"ContainerStarted","Data":"e3992521c77cdd988b51c4b4bae2b756da30ca49d8714baed94fdade6bd8c8c9"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.711706 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" event={"ID":"f195d176-af0f-4048-a981-358f0240f6cd","Type":"ContainerStarted","Data":"d0b3ceaa7b0d9845fa88ec854e9b48e6f30cded762e3ebaea0636ab0b4781ee9"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.716186 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-996mk" event={"ID":"3d726b70-e168-4d44-954a-c9a3c8b0db5c","Type":"ContainerStarted","Data":"ee42d2c8162cb43108666a1cd52836e43f0c689e0aa25a2947b1f6ee5c3bd0ed"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.733566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" 
event={"ID":"b8ed915e-9dfb-4a63-87c6-f21160cdca5f","Type":"ContainerStarted","Data":"52a9f1544e8f915b70f5d279a06fe2143d55d88119db614689d9ac85cc144d2c"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.741541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.742097 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.24208549 +0000 UTC m=+198.405297301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.766899 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2bj4z" event={"ID":"7babe28d-a681-4e19-ba38-150d554380a2","Type":"ContainerStarted","Data":"9b9bce9a2a66b22474bc40345e18f704b7d1a457479afefd726c116365e3be7f"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.767356 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gk464" 
podStartSLOduration=131.767333485 podStartE2EDuration="2m11.767333485s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.741548756 +0000 UTC m=+197.904760567" watchObservedRunningTime="2026-03-13 15:06:07.767333485 +0000 UTC m=+197.930545296" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.772243 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" event={"ID":"174316ed-1085-4cbe-8ec9-406c566e914d","Type":"ContainerStarted","Data":"8119ccadb4ceaab14503629b5630aa200b2cec16dbe9e1a1378d1e0baf8f7ae7"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.792122 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" event={"ID":"6812c7a2-d36e-434d-90ca-079fc0a3390d","Type":"ContainerStarted","Data":"ee2251db122f7d11ef883d8ab15b26dd60b9c588d411f0b231cf89f0219fab4e"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.799768 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-z752l" podStartSLOduration=131.799745554 podStartE2EDuration="2m11.799745554s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.796190688 +0000 UTC m=+197.959402489" watchObservedRunningTime="2026-03-13 15:06:07.799745554 +0000 UTC m=+197.962957385" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.806040 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" event={"ID":"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af","Type":"ContainerStarted","Data":"a49f4d674975b1f1f2938e058fd57566f106ed1e94e146d1bb9dc5509194bd85"} 
Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.824095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" event={"ID":"1b2be232-e8e1-4fa6-a574-7c5dfda4f386","Type":"ContainerStarted","Data":"7e92bc85872cebad8b82302cc433e1b57da17e12d14c2a0fd9b27a597081657c"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.851019 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.853280 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.353253976 +0000 UTC m=+198.516465787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.858131 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x7kx2" podStartSLOduration=131.858097707 podStartE2EDuration="2m11.858097707s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.850355027 +0000 UTC m=+198.013566838" watchObservedRunningTime="2026-03-13 15:06:07.858097707 +0000 UTC m=+198.021309518" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.924446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" event={"ID":"29df0f63-b9e8-4694-b587-539d1fe80658","Type":"ContainerStarted","Data":"42d4ae7f0c5e850ee53c79b583bd23266626c36cd46d168d40b5ad5daabe57c8"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.928665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" event={"ID":"50879169-a61b-431b-ada7-e06cca5b64bf","Type":"ContainerStarted","Data":"6fe6934ded9087da8486bc12ea9c28d0f033be8f1144e4355de2610f84cab2ba"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.946762 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" 
event={"ID":"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5","Type":"ContainerStarted","Data":"466cec4fc04888cdcbb4d5e3309dc5ef242486784054f1ca48d2d4e447783a3e"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.949001 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" event={"ID":"874a0d99-444c-45f4-9bfa-ad8f5c469afc","Type":"ContainerStarted","Data":"c064f401e5954fee87411755150ec0b4bc1ca670f7dc3b40718885ce1a8d3fad"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.951182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" event={"ID":"0878bdef-ff38-4bfa-a4e3-4d656afd474f","Type":"ContainerStarted","Data":"f1734235102262e1e338b3b0ee705d4b44a3512c5e4dec75544cdc336930c80c"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.954810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:07 crc kubenswrapper[4786]: E0313 15:06:07.955456 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.455409806 +0000 UTC m=+198.618621807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.960159 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.966597 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.968424 4786 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vl4kp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.968474 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" podUID="90352d66-9c07-441e-a71f-ee3281a66b5b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.969402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" event={"ID":"111d054e-76a1-4783-aaa4-a05fd3250b1a","Type":"ContainerStarted","Data":"f14b6acdd2882f003feab6a75f2b7333197b9e8b298eca434c2f972e8a00ecba"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 
15:06:07.970435 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" podStartSLOduration=131.970424253 podStartE2EDuration="2m11.970424253s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:07.969493678 +0000 UTC m=+198.132705479" watchObservedRunningTime="2026-03-13 15:06:07.970424253 +0000 UTC m=+198.133636064" Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.993727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" event={"ID":"b3e6058a-1d79-4bb5-a514-0c05c6185279","Type":"ContainerStarted","Data":"c2fe89c8ad2931f73f491cd3c5f72dd1ee7b6923370a9ae1c3b776481c9c322b"} Mar 13 15:06:07 crc kubenswrapper[4786]: I0313 15:06:07.995318 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w5gbr" event={"ID":"f7788c62-1972-43db-a6db-d0bce9c415a6","Type":"ContainerStarted","Data":"ab99588b534c60005e5cb27587e4963bae1a483d745d79d97279342a33289afe"} Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.003390 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" event={"ID":"55e064ca-7aad-4fdd-8270-b43a93b9ff3c","Type":"ContainerStarted","Data":"419171bb11b051bfb28c24cabb0379741100e00fac6f0985f5f26213f35c13a3"} Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.003960 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.005434 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" 
event={"ID":"233ce256-9724-4f50-a45b-a58aaaee2ec8","Type":"ContainerStarted","Data":"e1053ff47dbea957602d22f2710755f3ffbddebd528136a9d88f2700d9cb1cde"} Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.016569 4786 patch_prober.go:28] interesting pod/console-operator-58897d9998-7cf4m container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.016813 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" podUID="55e064ca-7aad-4fdd-8270-b43a93b9ff3c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.022995 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" event={"ID":"e077b561-62da-4d5b-b7a2-faf0e03f46b1","Type":"ContainerStarted","Data":"4dfaa4f2ab7341fc4c40844a448d1a6583d59aeeff4b070fce08a6da70fa5584"} Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.024090 4786 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7c65p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.024168 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 13 15:06:08 crc 
kubenswrapper[4786]: I0313 15:06:08.046036 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" podStartSLOduration=132.045967461 podStartE2EDuration="2m12.045967461s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:08.030676787 +0000 UTC m=+198.193888598" watchObservedRunningTime="2026-03-13 15:06:08.045967461 +0000 UTC m=+198.209179272" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.057448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.059132 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.559116728 +0000 UTC m=+198.722328539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.088927 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" podStartSLOduration=132.088830724 podStartE2EDuration="2m12.088830724s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:08.050745811 +0000 UTC m=+198.213957622" watchObservedRunningTime="2026-03-13 15:06:08.088830724 +0000 UTC m=+198.252042535" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.089975 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" podStartSLOduration=132.089964895 podStartE2EDuration="2m12.089964895s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:08.081840924 +0000 UTC m=+198.245052745" watchObservedRunningTime="2026-03-13 15:06:08.089964895 +0000 UTC m=+198.253176716" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.159834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: 
\"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.162128 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.662114122 +0000 UTC m=+198.825325933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.218512 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.226404 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.226472 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.261269 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.261735 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.761715293 +0000 UTC m=+198.924927104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.364044 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.364443 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.864426959 +0000 UTC m=+199.027638770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.464843 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.465808 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:08.965770338 +0000 UTC m=+199.128982289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.567239 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.567528 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.067517287 +0000 UTC m=+199.230729088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.678696 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.679165 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.179150965 +0000 UTC m=+199.342362776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.780333 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.780628 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.280616227 +0000 UTC m=+199.443828038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.881489 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.881815 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.381788131 +0000 UTC m=+199.544999942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.881872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.882222 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.382211442 +0000 UTC m=+199.545423253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:08 crc kubenswrapper[4786]: I0313 15:06:08.983090 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:08 crc kubenswrapper[4786]: E0313 15:06:08.983634 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.483619123 +0000 UTC m=+199.646830934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.085032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.085340 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.585329121 +0000 UTC m=+199.748540932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.085402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" event={"ID":"3303beb2-619f-4973-b3c9-1f75a6e4e88c","Type":"ContainerStarted","Data":"ea936a8c2bcfc05af8d5e7564240645979c9f5072dfd95e5596acbc73a5d44c5"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.085745 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.095086 4786 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-b7krw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" start-of-body= Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.095122 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" podUID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.107089 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" 
event={"ID":"55e064ca-7aad-4fdd-8270-b43a93b9ff3c","Type":"ContainerStarted","Data":"9957ec10c3c750a6811cb5fde51882c7fa71627b49d8a1771a501f3bc73c6ef6"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.108111 4786 patch_prober.go:28] interesting pod/console-operator-58897d9998-7cf4m container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.108156 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" podUID="55e064ca-7aad-4fdd-8270-b43a93b9ff3c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.118921 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" podStartSLOduration=133.118900282 podStartE2EDuration="2m13.118900282s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.113323391 +0000 UTC m=+199.276535202" watchObservedRunningTime="2026-03-13 15:06:09.118900282 +0000 UTC m=+199.282112083" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.133699 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2bj4z" event={"ID":"7babe28d-a681-4e19-ba38-150d554380a2","Type":"ContainerStarted","Data":"3ac1697f04da33c445397642f94f0daa203aed7c13a934038b849ec2aa2d98cb"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.134701 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 
13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.140251 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" event={"ID":"1eda6a73-a8cf-406d-ab33-394ec1982f4a","Type":"ContainerStarted","Data":"3dcb19c3a1321ecb5f357d620e857b85ba43d8b1489fb0a88ed50ba265e8284f"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.148353 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.148404 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.188315 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.188435 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.688417337 +0000 UTC m=+199.851629148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.188676 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.206021 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.706001164 +0000 UTC m=+199.869212975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.234286 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:09 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:09 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:09 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.234667 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.234587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" event={"ID":"827061b1-4659-4457-b74f-24d85f0bf010","Type":"ContainerStarted","Data":"eba0182a2c4163559dfb7a05220ce3febbd5ce95038f4815a6276a354ef39032"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.251527 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" event={"ID":"d36f11ac-fb3e-4577-ba62-127115e6cc87","Type":"ContainerStarted","Data":"548b375fa2cd516cd380c74d11bedc80bde33a52316f441f14b1ffced0e2887f"} Mar 13 15:06:09 
crc kubenswrapper[4786]: I0313 15:06:09.265248 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qcs2g" podStartSLOduration=133.265231511 podStartE2EDuration="2m13.265231511s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.265098497 +0000 UTC m=+199.428310308" watchObservedRunningTime="2026-03-13 15:06:09.265231511 +0000 UTC m=+199.428443322" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.265329 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2bj4z" podStartSLOduration=133.265325183 podStartE2EDuration="2m13.265325183s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.165917387 +0000 UTC m=+199.329129198" watchObservedRunningTime="2026-03-13 15:06:09.265325183 +0000 UTC m=+199.428536994" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.293155 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.294101 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.794085293 +0000 UTC m=+199.957297104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.321898 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" event={"ID":"01cb6b91-c961-4f3e-9d36-c839305c7a80","Type":"ContainerStarted","Data":"f0588c1d516be127445aeb1fa754f36a7cd307a8d9ba4ea11752f522513e6a28"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.322539 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.325425 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" event={"ID":"f195d176-af0f-4048-a981-358f0240f6cd","Type":"ContainerStarted","Data":"a38bbdd09692069accfb4977392e3342d68526051f512b2dc2daabe22eb6847d"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.338723 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-996mk" event={"ID":"3d726b70-e168-4d44-954a-c9a3c8b0db5c","Type":"ContainerStarted","Data":"3eaff887a8d0629a36df064e05aeb4d209696642a0929ece0810b0ab70f839c2"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.362809 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" event={"ID":"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5","Type":"ContainerStarted","Data":"b78fb4033ded706bd31c8dfd2f8297e81301ca3aeeb43d2ce76954d06a3a8b0e"} 
Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.369566 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.371002 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w5gbr" event={"ID":"f7788c62-1972-43db-a6db-d0bce9c415a6","Type":"ContainerStarted","Data":"999bc349d7a949955f76ac28fcb7fb73ef2b56ffc010ea88a010dddda54c2140"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.380203 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" event={"ID":"4dfecfec-c904-49cb-86fe-83ad121c6a68","Type":"ContainerStarted","Data":"d4202e82e7bcba95dbb495cd30400e6d2ef8ebae480ef9857b6245cdbedcca9a"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.381007 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.394781 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.396810 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:09.896795549 +0000 UTC m=+200.060007360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.409613 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bm59p" podStartSLOduration=133.409594986 podStartE2EDuration="2m13.409594986s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.369227041 +0000 UTC m=+199.532438852" watchObservedRunningTime="2026-03-13 15:06:09.409594986 +0000 UTC m=+199.572806797" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.411318 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" podStartSLOduration=133.411310133 podStartE2EDuration="2m13.411310133s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.406680977 +0000 UTC m=+199.569892788" watchObservedRunningTime="2026-03-13 15:06:09.411310133 +0000 UTC m=+199.574521944" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.420374 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" event={"ID":"50879169-a61b-431b-ada7-e06cca5b64bf","Type":"ContainerStarted","Data":"4f6b63af970c6d52569334b5ca19cd1cead4cf61327bc79cbd1ae4528b490100"} 
Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.452260 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" event={"ID":"90352d66-9c07-441e-a71f-ee3281a66b5b","Type":"ContainerStarted","Data":"6523419c32251ff20cd74625ed73d8b3afcb9c61abd04b14602e87583efa02e5"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.482008 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" event={"ID":"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af","Type":"ContainerStarted","Data":"2cd44f730f0e36d056d8017b35bdaab621c0324faa0b9ec39d5d8fe4ae805e60"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.496588 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" podStartSLOduration=133.496560865 podStartE2EDuration="2m13.496560865s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.451617296 +0000 UTC m=+199.614829117" watchObservedRunningTime="2026-03-13 15:06:09.496560865 +0000 UTC m=+199.659772676" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.497771 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bhts5" podStartSLOduration=133.497765187 podStartE2EDuration="2m13.497765187s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.495701262 +0000 UTC m=+199.658913083" watchObservedRunningTime="2026-03-13 15:06:09.497765187 +0000 UTC m=+199.660976998" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.500194 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4rkfk" event={"ID":"0878bdef-ff38-4bfa-a4e3-4d656afd474f","Type":"ContainerStarted","Data":"f1f7d0083d735b0c0c7ec3c01fa67ce27ab25ac719ad7b000d23a6b6df5ca1df"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.500600 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vl4kp" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.501235 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.505754 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.005721943 +0000 UTC m=+200.168933914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.506105 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.508696 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.008684394 +0000 UTC m=+200.171896205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.518178 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" event={"ID":"b8ed915e-9dfb-4a63-87c6-f21160cdca5f","Type":"ContainerStarted","Data":"e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.519052 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.521617 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" event={"ID":"cc2927b9-2e8a-4a34-90e4-932c1f6115c3","Type":"ContainerStarted","Data":"13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.522465 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.524553 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" event={"ID":"a0f4582d-9706-4c08-852a-c6d6ac3694a7","Type":"ContainerStarted","Data":"1a11f741c639fbc2eada57fd25067a00ce6120f89ad2ece7a83a8ede41439752"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.536541 
4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" podStartSLOduration=133.536520449 podStartE2EDuration="2m13.536520449s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.536321833 +0000 UTC m=+199.699533644" watchObservedRunningTime="2026-03-13 15:06:09.536520449 +0000 UTC m=+199.699732250" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.540365 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" event={"ID":"5d72361e-212d-4e7d-a4d3-b10141badfc3","Type":"ContainerStarted","Data":"827a7ef5b0422a5427ca1a41d998ff671c9b3a5c62d04bc94caebb8658c9bebd"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.548068 4786 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-d5wsg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.548133 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" podUID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.548172 4786 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wcljz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: 
connect: connection refused" start-of-body= Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.548235 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.591137 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" event={"ID":"111d054e-76a1-4783-aaa4-a05fd3250b1a","Type":"ContainerStarted","Data":"d3e33bd76f3ed1cc02302540d7d56975d135a21aa7a031ab817f2e0b7f7a9ac7"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.608503 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.610030 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.110010792 +0000 UTC m=+200.273222603 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.616194 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w5gbr" podStartSLOduration=7.616166519 podStartE2EDuration="7.616166519s" podCreationTimestamp="2026-03-13 15:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.559246115 +0000 UTC m=+199.722457926" watchObservedRunningTime="2026-03-13 15:06:09.616166519 +0000 UTC m=+199.779378330" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.630482 4786 generic.go:334] "Generic (PLEG): container finished" podID="e50609e3-d329-4c22-9be4-4100f122508d" containerID="8b4a858a4ed166152e68dfb62f406a5269aac8fa7d9aa3097f0a64054fc6fca3" exitCode=0 Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.631492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" event={"ID":"e50609e3-d329-4c22-9be4-4100f122508d","Type":"ContainerDied","Data":"8b4a858a4ed166152e68dfb62f406a5269aac8fa7d9aa3097f0a64054fc6fca3"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.701337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2tqr8" event={"ID":"0c6ae29c-e743-4193-bce1-22b4c5732f45","Type":"ContainerStarted","Data":"a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 
15:06:09.719766 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.720395 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.220377815 +0000 UTC m=+200.383589626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.740421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t98jm" event={"ID":"874a0d99-444c-45f4-9bfa-ad8f5c469afc","Type":"ContainerStarted","Data":"9d9c72e72905b974baa66ace0161eb07240d5882d6ca37f667049a54338aaa69"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.768428 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" event={"ID":"103f0ba8-4fee-41ce-bf68-9df3feddffc8","Type":"ContainerStarted","Data":"44458e010e1d0ca5c4020ee323bc84da1c6243f6883ded8f4ee74d8bda60869d"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.789355 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9tj94" podStartSLOduration=133.789334386 podStartE2EDuration="2m13.789334386s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.637292342 +0000 UTC m=+199.800504153" watchObservedRunningTime="2026-03-13 15:06:09.789334386 +0000 UTC m=+199.952546197" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.789799 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" podStartSLOduration=133.789795098 podStartE2EDuration="2m13.789795098s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.770489604 +0000 UTC m=+199.933701425" watchObservedRunningTime="2026-03-13 15:06:09.789795098 +0000 UTC m=+199.953006909" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.820964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" event={"ID":"233ce256-9724-4f50-a45b-a58aaaee2ec8","Type":"ContainerStarted","Data":"b57eef303bdc6cfefc7ee8d14430ba763cf00501ea44e522c11a0c2249ad12c2"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.821377 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.822642 4786 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.322626839 +0000 UTC m=+200.485838650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.854038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" event={"ID":"6812c7a2-d36e-434d-90ca-079fc0a3390d","Type":"ContainerStarted","Data":"25d94c9dca3c5087f60330ad24981d06d344fb088a1d8f5ecb56e24e1e7cf8e1"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.854762 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.874303 4786 generic.go:334] "Generic (PLEG): container finished" podID="174316ed-1085-4cbe-8ec9-406c566e914d" containerID="38cda586aab8e4f4eb3d97763832eeac7a2bd74d05a1b643fcbc6ae925c4d1d9" exitCode=0 Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.874579 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" event={"ID":"174316ed-1085-4cbe-8ec9-406c566e914d","Type":"ContainerDied","Data":"38cda586aab8e4f4eb3d97763832eeac7a2bd74d05a1b643fcbc6ae925c4d1d9"} Mar 13 15:06:09 crc kubenswrapper[4786]: I0313 15:06:09.923627 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:09 crc kubenswrapper[4786]: E0313 15:06:09.926620 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.426606609 +0000 UTC m=+200.589818420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.022347 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" podStartSLOduration=134.022324515 podStartE2EDuration="2m14.022324515s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:09.965451812 +0000 UTC m=+200.128663623" watchObservedRunningTime="2026-03-13 15:06:10.022324515 +0000 UTC m=+200.185536326" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.022626 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" 
podStartSLOduration=134.022622953 podStartE2EDuration="2m14.022622953s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.018159212 +0000 UTC m=+200.181371023" watchObservedRunningTime="2026-03-13 15:06:10.022622953 +0000 UTC m=+200.185834764" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.032561 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.034489 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.534469464 +0000 UTC m=+200.697681275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.035057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.039111 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.53910017 +0000 UTC m=+200.702311981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.071729 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" podStartSLOduration=134.071705644 podStartE2EDuration="2m14.071705644s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.062114594 +0000 UTC m=+200.225326425" watchObservedRunningTime="2026-03-13 15:06:10.071705644 +0000 UTC m=+200.234917445" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.128434 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7f2vq" podStartSLOduration=134.128414262 podStartE2EDuration="2m14.128414262s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.092332314 +0000 UTC m=+200.255544125" watchObservedRunningTime="2026-03-13 15:06:10.128414262 +0000 UTC m=+200.291626073" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.141399 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.162468 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.662422755 +0000 UTC m=+200.825634566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.231242 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:10 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:10 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:10 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.231301 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.255652 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.255946 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.755934451 +0000 UTC m=+200.919146262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.358915 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.359233 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.859219982 +0000 UTC m=+201.022431793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.368019 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" podStartSLOduration=134.368001361 podStartE2EDuration="2m14.368001361s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.35840782 +0000 UTC m=+200.521619631" watchObservedRunningTime="2026-03-13 15:06:10.368001361 +0000 UTC m=+200.531213172" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.384964 4786 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-mdvv7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.385033 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" podUID="4dfecfec-c904-49cb-86fe-83ad121c6a68" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.463149 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.463467 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:10.96345333 +0000 UTC m=+201.126665141 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.467534 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hcn8x" podStartSLOduration=134.46751606 podStartE2EDuration="2m14.46751606s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.466322297 +0000 UTC m=+200.629534108" watchObservedRunningTime="2026-03-13 15:06:10.46751606 +0000 UTC m=+200.630727871" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.468418 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2tqr8" podStartSLOduration=134.468412164 
podStartE2EDuration="2m14.468412164s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.416764283 +0000 UTC m=+200.579976084" watchObservedRunningTime="2026-03-13 15:06:10.468412164 +0000 UTC m=+200.631623975" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.566387 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.566606 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.066583617 +0000 UTC m=+201.229795418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.566818 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.567109 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.067098071 +0000 UTC m=+201.230309882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.595738 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" podStartSLOduration=134.595721777 podStartE2EDuration="2m14.595721777s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.552896195 +0000 UTC m=+200.716108006" watchObservedRunningTime="2026-03-13 15:06:10.595721777 +0000 UTC m=+200.758933588" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.595837 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" podStartSLOduration=134.59583363 podStartE2EDuration="2m14.59583363s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:10.595055069 +0000 UTC m=+200.758266880" watchObservedRunningTime="2026-03-13 15:06:10.59583363 +0000 UTC m=+200.759045441" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.668347 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.668688 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.168669326 +0000 UTC m=+201.331881137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.699825 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2dkvg"] Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.700788 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.726305 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.740387 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dkvg"] Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.781571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-utilities\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.781615 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j679r\" (UniqueName: \"kubernetes.io/projected/44d91c62-557f-40c8-a725-33ff965bee1b-kube-api-access-j679r\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.781642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-catalog-content\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.781731 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.782071 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.282056791 +0000 UTC m=+201.445268602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.885062 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kbj4r"] Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.885923 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.886844 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.887045 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.387012268 +0000 UTC m=+201.550224079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.887090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.887141 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-utilities\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.887162 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j679r\" (UniqueName: \"kubernetes.io/projected/44d91c62-557f-40c8-a725-33ff965bee1b-kube-api-access-j679r\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.887180 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-catalog-content\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.887604 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-catalog-content\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.887823 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.387814309 +0000 UTC m=+201.551026120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.888160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-utilities\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.896702 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.916830 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbj4r"] Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.928804 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j679r\" (UniqueName: \"kubernetes.io/projected/44d91c62-557f-40c8-a725-33ff965bee1b-kube-api-access-j679r\") pod \"certified-operators-2dkvg\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.952968 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-fsdpw" event={"ID":"103f0ba8-4fee-41ce-bf68-9df3feddffc8","Type":"ContainerStarted","Data":"1ef138e0e673dc066b2361f741c0b38a923b7dc85e8fa70c297bc932231615fc"} Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 
15:06:10.989837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.990010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-utilities\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.990044 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np77p\" (UniqueName: \"kubernetes.io/projected/4bcce315-5828-4a7c-870f-6dd6518af3dd-kube-api-access-np77p\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:10 crc kubenswrapper[4786]: I0313 15:06:10.990114 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-catalog-content\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:10 crc kubenswrapper[4786]: E0313 15:06:10.990235 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.490223197 +0000 UTC m=+201.653435008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.003639 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" event={"ID":"e50609e3-d329-4c22-9be4-4100f122508d","Type":"ContainerStarted","Data":"6945b3f872de3b128b16a1a082d8e7b2a7054d247e746231bc11c6c902b4e2ee"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.004290 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.007947 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" event={"ID":"e077b561-62da-4d5b-b7a2-faf0e03f46b1","Type":"ContainerStarted","Data":"13cdb3646da7ae974f59e7ce1c75f9b309995b098f6c1e4ebf5e376c56deb42a"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.016982 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" event={"ID":"b3e6058a-1d79-4bb5-a514-0c05c6185279","Type":"ContainerStarted","Data":"6f4f7e2264a2b6bd82fd52171847d2971e5b17704b5a04e22f97b925da1ff50f"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.036631 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ph2dx" 
event={"ID":"f195d176-af0f-4048-a981-358f0240f6cd","Type":"ContainerStarted","Data":"ab32f7136f278c5fd864df98e7fc1ea3eb9402440a294c52c629a85bb49884df"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.042282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" event={"ID":"29df0f63-b9e8-4694-b587-539d1fe80658","Type":"ContainerStarted","Data":"add4393dcbd85e7d372b0e6d3371d4bab8dd30c97ec18956404baad547e83346"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.042333 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" event={"ID":"29df0f63-b9e8-4694-b587-539d1fe80658","Type":"ContainerStarted","Data":"132835a8f840aec42756e8f0d8e9de4c160e422bc01d8d626aacee9308720515"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.059983 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.090933 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-92qtc"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.091814 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.092960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-utilities\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.093021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np77p\" (UniqueName: \"kubernetes.io/projected/4bcce315-5828-4a7c-870f-6dd6518af3dd-kube-api-access-np77p\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.093041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.093123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-catalog-content\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.094315 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-utilities\") pod \"community-operators-kbj4r\" 
(UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.094505 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.594491175 +0000 UTC m=+201.757702986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.095316 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sdzbq" podStartSLOduration=135.095295407 podStartE2EDuration="2m15.095295407s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:11.091613517 +0000 UTC m=+201.254825328" watchObservedRunningTime="2026-03-13 15:06:11.095295407 +0000 UTC m=+201.258507218" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.100494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-catalog-content\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.128361 4786 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-92qtc"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.139120 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" podStartSLOduration=135.139103675 podStartE2EDuration="2m15.139103675s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:11.138591321 +0000 UTC m=+201.301803132" watchObservedRunningTime="2026-03-13 15:06:11.139103675 +0000 UTC m=+201.302315486" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.141840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np77p\" (UniqueName: \"kubernetes.io/projected/4bcce315-5828-4a7c-870f-6dd6518af3dd-kube-api-access-np77p\") pod \"community-operators-kbj4r\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") " pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.175420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-996mk" event={"ID":"3d726b70-e168-4d44-954a-c9a3c8b0db5c","Type":"ContainerStarted","Data":"336bce82f760de19033ff3444a8146f73d8a0985c4e308a4645b83e8565fa72a"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.175703 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-996mk" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.194302 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vxx8z" podStartSLOduration=135.194283831 podStartE2EDuration="2m15.194283831s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-13 15:06:11.194103207 +0000 UTC m=+201.357315018" watchObservedRunningTime="2026-03-13 15:06:11.194283831 +0000 UTC m=+201.357495642" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.194649 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.194809 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.694791725 +0000 UTC m=+201.858003536 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.194913 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-catalog-content\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.194964 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.195121 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pmd\" (UniqueName: \"kubernetes.io/projected/0115a774-8b25-4e1d-9d6f-c4202035efa9-kube-api-access-w4pmd\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.195220 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-utilities\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.196086 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.69607692 +0000 UTC m=+201.859288731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.197255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" event={"ID":"174316ed-1085-4cbe-8ec9-406c566e914d","Type":"ContainerStarted","Data":"9131108b4dc77379501283bb8f872ca57ba2d7050ed94b8fba8bfaae9e512c3e"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.244733 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:11 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:11 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:11 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.244788 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.246079 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.251929 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-996mk" podStartSLOduration=9.251907424 podStartE2EDuration="9.251907424s" podCreationTimestamp="2026-03-13 15:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:11.245302405 +0000 UTC m=+201.408514216" watchObservedRunningTime="2026-03-13 15:06:11.251907424 +0000 UTC m=+201.415119235" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.260253 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" event={"ID":"d36f11ac-fb3e-4577-ba62-127115e6cc87","Type":"ContainerStarted","Data":"3693919548fdaafc67ffdc49addee19d67cdf9df35cd6e5c70283ea38563f136"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.260297 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" event={"ID":"d36f11ac-fb3e-4577-ba62-127115e6cc87","Type":"ContainerStarted","Data":"2ad00fbcfc6e04bf38ede0fa1e7d34c6c8be2eb8d69088596e771853ae0121c0"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.288097 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k626p"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.293646 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" event={"ID":"6812c7a2-d36e-434d-90ca-079fc0a3390d","Type":"ContainerStarted","Data":"f86d168b8979b2ee9b353dc7c5df44df87d8397a1972ddc54a51d7e9445a3400"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.293784 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.297530 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.297758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-utilities\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.297902 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-catalog-content\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.297952 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pmd\" (UniqueName: \"kubernetes.io/projected/0115a774-8b25-4e1d-9d6f-c4202035efa9-kube-api-access-w4pmd\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.298364 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 15:06:11.798344994 +0000 UTC m=+201.961556805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.299717 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-utilities\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.308087 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-catalog-content\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.317461 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k626p"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.329567 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pmd\" (UniqueName: \"kubernetes.io/projected/0115a774-8b25-4e1d-9d6f-c4202035efa9-kube-api-access-w4pmd\") pod \"certified-operators-92qtc\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") " pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.342124 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rm4xd" event={"ID":"111d054e-76a1-4783-aaa4-a05fd3250b1a","Type":"ContainerStarted","Data":"bc7a0fa7c44618b9272c038afc1370c0dc81566f339772126321b739f3c28d5a"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.349538 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c65p"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.349749 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerName="controller-manager" containerID="cri-o://068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa" gracePeriod=30 Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.350112 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gm72l" podStartSLOduration=135.350097078 podStartE2EDuration="2m15.350097078s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:11.34169253 +0000 UTC m=+201.504904341" watchObservedRunningTime="2026-03-13 15:06:11.350097078 +0000 UTC m=+201.513308889" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.369292 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.373549 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.394175 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-2c944" event={"ID":"b960566c-bcc9-41ff-9fbc-c132f0e4d6e5","Type":"ContainerStarted","Data":"6220dbd0a03967a0e6c7370dad4ee15b59bfff3c8ba68ce61afce3a64c831e2e"} Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.403333 4786 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wcljz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.403716 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.411041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-utilities\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.411134 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlvdz\" (UniqueName: \"kubernetes.io/projected/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-kube-api-access-mlvdz\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.411169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-catalog-content\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.411214 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.418868 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.418913 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.422722 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:11.922706277 +0000 UTC m=+202.085918088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.423174 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.437698 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mdvv7" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.447160 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.451066 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-7cf4m" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.516066 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.516417 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-utilities\") pod \"community-operators-k626p\" (UID: 
\"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.516734 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlvdz\" (UniqueName: \"kubernetes.io/projected/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-kube-api-access-mlvdz\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.516810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-catalog-content\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.517808 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.017783056 +0000 UTC m=+202.180994927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.533331 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-utilities\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.542186 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-catalog-content\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.550627 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58864: no serving certificate available for the kubelet" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.607928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlvdz\" (UniqueName: \"kubernetes.io/projected/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-kube-api-access-mlvdz\") pod \"community-operators-k626p\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") " pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.624664 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.627092 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.127072319 +0000 UTC m=+202.290284130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.634849 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k626p" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.645939 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58876: no serving certificate available for the kubelet" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.728829 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.729140 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.229113036 +0000 UTC m=+202.392324847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.729711 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.730142 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.230125644 +0000 UTC m=+202.393337455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.748682 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58884: no serving certificate available for the kubelet" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.793301 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dkvg"] Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.831482 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.844059 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.344020623 +0000 UTC m=+202.507232434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.844306 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58894: no serving certificate available for the kubelet" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.872443 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58896: no serving certificate available for the kubelet" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.914522 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.939552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:11 crc kubenswrapper[4786]: E0313 15:06:11.945769 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.439838962 +0000 UTC m=+202.603050773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:11 crc kubenswrapper[4786]: I0313 15:06:11.964136 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58912: no serving certificate available for the kubelet" Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.040706 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.041193 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.5411759 +0000 UTC m=+202.704387711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.041322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.043652 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.543641767 +0000 UTC m=+202.706853578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.142366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.142686 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.642673013 +0000 UTC m=+202.805884824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.166908 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbj4r"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.172648 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-92qtc"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.209137 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58924: no serving certificate available for the kubelet"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.222555 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 15:06:12 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld
Mar 13 15:06:12 crc kubenswrapper[4786]: [+]process-running ok
Mar 13 15:06:12 crc kubenswrapper[4786]: healthz check failed
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.222604 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.228701 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.249655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.250045 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.750033605 +0000 UTC m=+202.913245416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: W0313 15:06:12.258882 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0115a774_8b25_4e1d_9d6f_c4202035efa9.slice/crio-90b78259b57dbe97ef65ec7e9ad8da36b01873f67aaad642637353d3bd4b4c7e WatchSource:0}: Error finding container 90b78259b57dbe97ef65ec7e9ad8da36b01873f67aaad642637353d3bd4b4c7e: Status 404 returned error can't find the container with id 90b78259b57dbe97ef65ec7e9ad8da36b01873f67aaad642637353d3bd4b4c7e
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.267311 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k626p"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.351635 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.351998 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.85197213 +0000 UTC m=+203.015183941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.352031 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-client-ca\") pod \"3a2643aa-9180-475a-af73-8e7b311cc77c\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.352072 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-proxy-ca-bundles\") pod \"3a2643aa-9180-475a-af73-8e7b311cc77c\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.352108 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-config\") pod \"3a2643aa-9180-475a-af73-8e7b311cc77c\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.352128 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2643aa-9180-475a-af73-8e7b311cc77c-serving-cert\") pod \"3a2643aa-9180-475a-af73-8e7b311cc77c\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.352146 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fz8c\" (UniqueName: \"kubernetes.io/projected/3a2643aa-9180-475a-af73-8e7b311cc77c-kube-api-access-2fz8c\") pod \"3a2643aa-9180-475a-af73-8e7b311cc77c\" (UID: \"3a2643aa-9180-475a-af73-8e7b311cc77c\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.352459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.352720 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.85271322 +0000 UTC m=+203.015925031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.354171 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3a2643aa-9180-475a-af73-8e7b311cc77c" (UID: "3a2643aa-9180-475a-af73-8e7b311cc77c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.354182 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-client-ca" (OuterVolumeSpecName: "client-ca") pod "3a2643aa-9180-475a-af73-8e7b311cc77c" (UID: "3a2643aa-9180-475a-af73-8e7b311cc77c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.354834 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-config" (OuterVolumeSpecName: "config") pod "3a2643aa-9180-475a-af73-8e7b311cc77c" (UID: "3a2643aa-9180-475a-af73-8e7b311cc77c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.356667 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b95c49568-qf5zm"]
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.356843 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerName="controller-manager"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.358276 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerName="controller-manager"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.358453 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerName="controller-manager"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.358771 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.376302 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b95c49568-qf5zm"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.390847 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2643aa-9180-475a-af73-8e7b311cc77c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3a2643aa-9180-475a-af73-8e7b311cc77c" (UID: "3a2643aa-9180-475a-af73-8e7b311cc77c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.393128 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2643aa-9180-475a-af73-8e7b311cc77c-kube-api-access-2fz8c" (OuterVolumeSpecName: "kube-api-access-2fz8c") pod "3a2643aa-9180-475a-af73-8e7b311cc77c" (UID: "3a2643aa-9180-475a-af73-8e7b311cc77c"). InnerVolumeSpecName "kube-api-access-2fz8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.453743 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-serving-cert\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-config\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454559 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6ng\" (UniqueName: \"kubernetes.io/projected/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-kube-api-access-tv6ng\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-client-ca\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-proxy-ca-bundles\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454671 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454686 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454701 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2643aa-9180-475a-af73-8e7b311cc77c-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454714 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a2643aa-9180-475a-af73-8e7b311cc77c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.454724 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fz8c\" (UniqueName: \"kubernetes.io/projected/3a2643aa-9180-475a-af73-8e7b311cc77c-kube-api-access-2fz8c\") on node \"crc\" DevicePath \"\""
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.454804 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:12.954786399 +0000 UTC m=+203.117998210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.481944 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k626p" event={"ID":"4672e3ad-80ad-4d20-89b6-b6d11c9eb508","Type":"ContainerStarted","Data":"dcd5af43f7bfbadd48be6ef5bdbb7204cd196dfdb4feda95bf67c2f08cc26199"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.484668 4786 generic.go:334] "Generic (PLEG): container finished" podID="44d91c62-557f-40c8-a725-33ff965bee1b" containerID="d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19" exitCode=0
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.484711 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dkvg" event={"ID":"44d91c62-557f-40c8-a725-33ff965bee1b","Type":"ContainerDied","Data":"d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.484729 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dkvg" event={"ID":"44d91c62-557f-40c8-a725-33ff965bee1b","Type":"ContainerStarted","Data":"2bd7f7a5aad35099e9f2c1130f4e44be0dfb3310f20ab430c9447fec995687c5"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.521892 4786 generic.go:334] "Generic (PLEG): container finished" podID="3a2643aa-9180-475a-af73-8e7b311cc77c" containerID="068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa" exitCode=0
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.521988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" event={"ID":"3a2643aa-9180-475a-af73-8e7b311cc77c","Type":"ContainerDied","Data":"068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.522019 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p" event={"ID":"3a2643aa-9180-475a-af73-8e7b311cc77c","Type":"ContainerDied","Data":"89ebedb800150b8d7a01cd7fbc6fbb9189593ba56a18165425a70e9461d0347e"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.522038 4786 scope.go:117] "RemoveContainer" containerID="068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.522171 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7c65p"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.555651 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-serving-cert\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.555720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6ng\" (UniqueName: \"kubernetes.io/projected/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-kube-api-access-tv6ng\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.555746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-config\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.555776 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-client-ca\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.555836 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-proxy-ca-bundles\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.555940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.556249 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.05623545 +0000 UTC m=+203.219447261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.557189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-client-ca\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.559098 4786 generic.go:334] "Generic (PLEG): container finished" podID="3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" containerID="2cd44f730f0e36d056d8017b35bdaab621c0324faa0b9ec39d5d8fe4ae805e60" exitCode=0
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.560668 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-proxy-ca-bundles\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.563255 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-config\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.566957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" event={"ID":"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af","Type":"ContainerDied","Data":"2cd44f730f0e36d056d8017b35bdaab621c0324faa0b9ec39d5d8fe4ae805e60"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.567614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-serving-cert\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.572834 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbj4r" event={"ID":"4bcce315-5828-4a7c-870f-6dd6518af3dd","Type":"ContainerStarted","Data":"ff0abccfe597a5524f1e8a31ec63c7e6f82d9fe08573d94cdee3cb4d42a99ac2"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.576415 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58928: no serving certificate available for the kubelet"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.578703 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c65p"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.581779 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7c65p"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.585162 4786 scope.go:117] "RemoveContainer" containerID="068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.585848 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92qtc" event={"ID":"0115a774-8b25-4e1d-9d6f-c4202035efa9","Type":"ContainerStarted","Data":"90b78259b57dbe97ef65ec7e9ad8da36b01873f67aaad642637353d3bd4b4c7e"}
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.585934 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa\": container with ID starting with 068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa not found: ID does not exist" containerID="068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.585959 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa"} err="failed to get container status \"068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa\": rpc error: code = NotFound desc = could not find container \"068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa\": container with ID starting with 068cc6499c2fa74256a450c8befa418e54db36d76e2a763f6898989b744d25fa not found: ID does not exist"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.592580 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6ng\" (UniqueName: \"kubernetes.io/projected/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-kube-api-access-tv6ng\") pod \"controller-manager-b95c49568-qf5zm\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.599051 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" event={"ID":"174316ed-1085-4cbe-8ec9-406c566e914d","Type":"ContainerStarted","Data":"3c77070c5c9d3cb01a7fa77934d5b0afd0e482c739c59575d1b65e00c4542433"}
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.608432 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.608506 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.621557 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" podUID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" containerName="route-controller-manager" containerID="cri-o://e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc" gracePeriod=30
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.621785 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.658510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.660115 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.160090457 +0000 UTC m=+203.323302308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.685922 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5v9g8"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.686328 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" podStartSLOduration=136.686305148 podStartE2EDuration="2m16.686305148s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:12.661407303 +0000 UTC m=+202.824619114" watchObservedRunningTime="2026-03-13 15:06:12.686305148 +0000 UTC m=+202.849516959"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.687008 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.691222 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v9g8"]
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.698189 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.704203 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.778484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-utilities\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.778541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.778584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdlwr\" (UniqueName: \"kubernetes.io/projected/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-kube-api-access-xdlwr\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.778628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-catalog-content\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.778919 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.27890869 +0000 UTC m=+203.442120501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.882307 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.882816 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-catalog-content\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.882882 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-utilities\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.882940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdlwr\" (UniqueName: \"kubernetes.io/projected/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-kube-api-access-xdlwr\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.883440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-catalog-content\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.883540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-utilities\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.883542 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.383523307 +0000 UTC m=+203.546735118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.909713 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdlwr\" (UniqueName: \"kubernetes.io/projected/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-kube-api-access-xdlwr\") pod \"redhat-marketplace-5v9g8\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:06:12 crc kubenswrapper[4786]: I0313 15:06:12.986240 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml"
Mar 13 15:06:12 crc kubenswrapper[4786]: E0313 15:06:12.986570 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.486557582 +0000 UTC m=+203.649769393 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.038182 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.039052 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.041018 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.041223 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.048363 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5lnt2"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.049696 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.052399 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v9g8" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.056080 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.074326 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5lnt2"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089500 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.089673 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.589652028 +0000 UTC m=+203.752863839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089710 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-utilities\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089732 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jgt7\" (UniqueName: \"kubernetes.io/projected/38981c4c-48a9-497e-86b7-c3852574cae2-kube-api-access-8jgt7\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089748 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7179ad3-2e69-4391-9fd7-642af219e9e4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089812 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-catalog-content\") pod 
\"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089871 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.089904 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7179ad3-2e69-4391-9fd7-642af219e9e4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.090304 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.590288965 +0000 UTC m=+203.753500776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.113687 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b95c49568-qf5zm"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.123384 4786 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 15:06:13 crc kubenswrapper[4786]: W0313 15:06:13.131536 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404f5a5d_c2e6_4a9e_898e_b87c1fb67005.slice/crio-70b7c1bad42595fac718399ba951c883a73332f5c6c3c9361125647bc45b5df2 WatchSource:0}: Error finding container 70b7c1bad42595fac718399ba951c883a73332f5c6c3c9361125647bc45b5df2: Status 404 returned error can't find the container with id 70b7c1bad42595fac718399ba951c883a73332f5c6c3c9361125647bc45b5df2 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.162051 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.190587 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.190745 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.690719339 +0000 UTC m=+203.853931150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.190833 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7179ad3-2e69-4391-9fd7-642af219e9e4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.190947 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-utilities\") pod 
\"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.190972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jgt7\" (UniqueName: \"kubernetes.io/projected/38981c4c-48a9-497e-86b7-c3852574cae2-kube-api-access-8jgt7\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.190993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7179ad3-2e69-4391-9fd7-642af219e9e4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.191055 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-catalog-content\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.191084 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.191372 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.691360386 +0000 UTC m=+203.854572197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.191517 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7179ad3-2e69-4391-9fd7-642af219e9e4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.192477 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-catalog-content\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.193550 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-utilities\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.211579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8jgt7\" (UniqueName: \"kubernetes.io/projected/38981c4c-48a9-497e-86b7-c3852574cae2-kube-api-access-8jgt7\") pod \"redhat-marketplace-5lnt2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") " pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.215377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7179ad3-2e69-4391-9fd7-642af219e9e4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.232015 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:13 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:13 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:13 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.232273 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.291547 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-config\") pod \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.291643 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.291665 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-client-ca\") pod \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.291693 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-serving-cert\") pod \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.291773 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqsl\" (UniqueName: \"kubernetes.io/projected/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-kube-api-access-gfqsl\") pod \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\" (UID: \"b8ed915e-9dfb-4a63-87c6-f21160cdca5f\") " Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.292934 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-config" (OuterVolumeSpecName: "config") pod "b8ed915e-9dfb-4a63-87c6-f21160cdca5f" (UID: "b8ed915e-9dfb-4a63-87c6-f21160cdca5f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.292994 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.792981833 +0000 UTC m=+203.956193644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.293357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b8ed915e-9dfb-4a63-87c6-f21160cdca5f" (UID: "b8ed915e-9dfb-4a63-87c6-f21160cdca5f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.298216 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b8ed915e-9dfb-4a63-87c6-f21160cdca5f" (UID: "b8ed915e-9dfb-4a63-87c6-f21160cdca5f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.301161 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-kube-api-access-gfqsl" (OuterVolumeSpecName: "kube-api-access-gfqsl") pod "b8ed915e-9dfb-4a63-87c6-f21160cdca5f" (UID: "b8ed915e-9dfb-4a63-87c6-f21160cdca5f"). InnerVolumeSpecName "kube-api-access-gfqsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.308563 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58940: no serving certificate available for the kubelet" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.393357 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.393450 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqsl\" (UniqueName: \"kubernetes.io/projected/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-kube-api-access-gfqsl\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.393467 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.393478 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 
15:06:13.393487 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ed915e-9dfb-4a63-87c6-f21160cdca5f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.393765 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.893750766 +0000 UTC m=+204.056962577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.439251 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:13 crc kubenswrapper[4786]: W0313 15:06:13.441841 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e627f8c_63e5_4b85_9fff_0c205c96d0a4.slice/crio-084e8b5790addce5afdc6d136f56b6ba57ce0935e95f206e1dfd7e89245d3ee0 WatchSource:0}: Error finding container 084e8b5790addce5afdc6d136f56b6ba57ce0935e95f206e1dfd7e89245d3ee0: Status 404 returned error can't find the container with id 084e8b5790addce5afdc6d136f56b6ba57ce0935e95f206e1dfd7e89245d3ee0 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.447929 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v9g8"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.460156 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5lnt2" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.495588 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.496108 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:13.996089011 +0000 UTC m=+204.159300822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.596739 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.597056 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 15:06:14.097035049 +0000 UTC m=+204.260246860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ffbml" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.616741 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" event={"ID":"404f5a5d-c2e6-4a9e-898e-b87c1fb67005","Type":"ContainerStarted","Data":"d4c1f3a56b65c1e0f67131a60f570295e596acf415b11ea2a6caafa8145f673e"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.616780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" event={"ID":"404f5a5d-c2e6-4a9e-898e-b87c1fb67005","Type":"ContainerStarted","Data":"70b7c1bad42595fac718399ba951c883a73332f5c6c3c9361125647bc45b5df2"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.617260 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.626032 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v9g8" event={"ID":"8e627f8c-63e5-4b85-9fff-0c205c96d0a4","Type":"ContainerStarted","Data":"084e8b5790addce5afdc6d136f56b6ba57ce0935e95f206e1dfd7e89245d3ee0"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.648665 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.649044 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerID="0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69" exitCode=0 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.649117 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbj4r" event={"ID":"4bcce315-5828-4a7c-870f-6dd6518af3dd","Type":"ContainerDied","Data":"0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.663455 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" podStartSLOduration=1.66343675 podStartE2EDuration="1.66343675s" podCreationTimestamp="2026-03-13 15:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:13.659564085 +0000 UTC m=+203.822775896" watchObservedRunningTime="2026-03-13 15:06:13.66343675 +0000 UTC m=+203.826648551" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.670246 4786 generic.go:334] "Generic (PLEG): container finished" podID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerID="1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453" exitCode=0 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.670304 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92qtc" event={"ID":"0115a774-8b25-4e1d-9d6f-c4202035efa9","Type":"ContainerDied","Data":"1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.675626 4786 generic.go:334] "Generic (PLEG): container finished" podID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" containerID="e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc" exitCode=0 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.675700 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" event={"ID":"b8ed915e-9dfb-4a63-87c6-f21160cdca5f","Type":"ContainerDied","Data":"e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.675732 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" event={"ID":"b8ed915e-9dfb-4a63-87c6-f21160cdca5f","Type":"ContainerDied","Data":"52a9f1544e8f915b70f5d279a06fe2143d55d88119db614689d9ac85cc144d2c"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.675752 4786 scope.go:117] "RemoveContainer" containerID="e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.675917 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.681433 4786 generic.go:334] "Generic (PLEG): container finished" podID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerID="d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116" exitCode=0 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.681499 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k626p" event={"ID":"4672e3ad-80ad-4d20-89b6-b6d11c9eb508","Type":"ContainerDied","Data":"d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.697312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:13 crc 
kubenswrapper[4786]: E0313 15:06:13.698563 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 15:06:14.198541693 +0000 UTC m=+204.361753504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.723191 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" event={"ID":"e077b561-62da-4d5b-b7a2-faf0e03f46b1","Type":"ContainerStarted","Data":"0fb5c106507df46ea00af64d13fcd58f74edeb033ca191c39e2fde04ee351ec1"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.723228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" event={"ID":"e077b561-62da-4d5b-b7a2-faf0e03f46b1","Type":"ContainerStarted","Data":"ecc953d1ca96ad4072d58dc8ef248d1a52be9d3ea4e91249aaad9f8d53722a07"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.723238 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" event={"ID":"e077b561-62da-4d5b-b7a2-faf0e03f46b1","Type":"ContainerStarted","Data":"ebda98fb2215586c4d958055fe217c083dd39e15ec9aea489dbee1133996ee60"} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.726913 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5lnt2"] Mar 13 15:06:13 crc 
kubenswrapper[4786]: I0313 15:06:13.728654 4786 scope.go:117] "RemoveContainer" containerID="e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc" Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.731161 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc\": container with ID starting with e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc not found: ID does not exist" containerID="e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.731204 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc"} err="failed to get container status \"e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc\": rpc error: code = NotFound desc = could not find container \"e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc\": container with ID starting with e9cfb119824c2b3c1e551d5330a9a8968a4826a45830a5cda83255b0f7dc9fcc not found: ID does not exist" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.735593 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.738288 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-d5wsg"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.751591 4786 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T15:06:13.123408873Z","Handler":null,"Name":""} Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.769274 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5cbrn" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.771624 4786 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.771644 4786 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.799149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.819201 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.819252 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.840720 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-znwnc" podStartSLOduration=11.840690558 podStartE2EDuration="11.840690558s" podCreationTimestamp="2026-03-13 15:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:13.832416113 +0000 UTC m=+203.995627924" watchObservedRunningTime="2026-03-13 15:06:13.840690558 +0000 UTC m=+204.003902369" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.858608 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q84vs"] Mar 13 15:06:13 crc kubenswrapper[4786]: E0313 15:06:13.858814 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" containerName="route-controller-manager" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.858825 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" containerName="route-controller-manager" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.858943 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" containerName="route-controller-manager" Mar 
13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.859596 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.868302 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.873026 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q84vs"] Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.897828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ffbml\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.905947 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.906228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-catalog-content\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.906265 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-utilities\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.906287 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmwc\" (UniqueName: \"kubernetes.io/projected/73d30051-975a-4329-8d7b-32d297b35218-kube-api-access-wbmwc\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:13 crc kubenswrapper[4786]: I0313 15:06:13.918293 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.008898 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmwc\" (UniqueName: \"kubernetes.io/projected/73d30051-975a-4329-8d7b-32d297b35218-kube-api-access-wbmwc\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.009017 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-catalog-content\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.009047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-utilities\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.009420 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-utilities\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.009844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-catalog-content\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc 
kubenswrapper[4786]: I0313 15:06:14.036465 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.053840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmwc\" (UniqueName: \"kubernetes.io/projected/73d30051-975a-4329-8d7b-32d297b35218-kube-api-access-wbmwc\") pod \"redhat-operators-q84vs\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc kubenswrapper[4786]: W0313 15:06:14.057725 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf7179ad3_2e69_4391_9fd7_642af219e9e4.slice/crio-00057843a6014b08d3332803484ce67e2b1669f45c9f096b07f400f6b26d4364 WatchSource:0}: Error finding container 00057843a6014b08d3332803484ce67e2b1669f45c9f096b07f400f6b26d4364: Status 404 returned error can't find the container with id 00057843a6014b08d3332803484ce67e2b1669f45c9f096b07f400f6b26d4364 Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.113280 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.148751 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.210455 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.212687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-config-volume\") pod \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.212740 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcgns\" (UniqueName: \"kubernetes.io/projected/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-kube-api-access-jcgns\") pod \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.212790 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-secret-volume\") pod \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\" (UID: \"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af\") " Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.213392 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-config-volume" (OuterVolumeSpecName: "config-volume") pod "3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" (UID: "3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.219385 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-kube-api-access-jcgns" (OuterVolumeSpecName: "kube-api-access-jcgns") pod "3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" (UID: "3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af"). InnerVolumeSpecName "kube-api-access-jcgns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.219442 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" (UID: "3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.222933 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:14 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:14 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:14 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.222994 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.234172 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn"] Mar 13 15:06:14 crc kubenswrapper[4786]: E0313 15:06:14.234645 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" containerName="collect-profiles" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.234666 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" containerName="collect-profiles" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.236940 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" containerName="collect-profiles" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.240000 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.253619 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.255377 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.256921 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.257087 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.257164 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.257345 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.265402 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.268736 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ggggh"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.270179 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.314839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-catalog-content\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.314955 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mpnr\" (UniqueName: \"kubernetes.io/projected/b09caa91-70b5-4a6a-998b-7a30e489e356-kube-api-access-9mpnr\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.314992 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-client-ca\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315049 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-config\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315103 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6ptld\" (UniqueName: \"kubernetes.io/projected/b5282ab3-536a-405e-93af-c9c16130ec87-kube-api-access-6ptld\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315134 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-utilities\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315177 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09caa91-70b5-4a6a-998b-7a30e489e356-serving-cert\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315353 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcgns\" (UniqueName: \"kubernetes.io/projected/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-kube-api-access-jcgns\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315411 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.315434 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 
15:06:14.344465 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggggh"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.409926 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.410552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.413266 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.413555 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mpnr\" (UniqueName: \"kubernetes.io/projected/b09caa91-70b5-4a6a-998b-7a30e489e356-kube-api-access-9mpnr\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-client-ca\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-config\") pod 
\"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416375 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptld\" (UniqueName: \"kubernetes.io/projected/b5282ab3-536a-405e-93af-c9c16130ec87-kube-api-access-6ptld\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416391 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-utilities\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416419 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09caa91-70b5-4a6a-998b-7a30e489e356-serving-cert\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416464 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-catalog-content\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.416967 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-catalog-content\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.417895 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-client-ca\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.418773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-config\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.420078 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-utilities\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.426442 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09caa91-70b5-4a6a-998b-7a30e489e356-serving-cert\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.431665 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.468784 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mpnr\" (UniqueName: \"kubernetes.io/projected/b09caa91-70b5-4a6a-998b-7a30e489e356-kube-api-access-9mpnr\") pod \"route-controller-manager-79dbc6cbcd-sz7nn\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.469311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptld\" (UniqueName: \"kubernetes.io/projected/b5282ab3-536a-405e-93af-c9c16130ec87-kube-api-access-6ptld\") pod \"redhat-operators-ggggh\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") " pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.518715 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa2ebab-1c69-40c2-8668-986951876ab3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.518884 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa2ebab-1c69-40c2-8668-986951876ab3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.590445 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2643aa-9180-475a-af73-8e7b311cc77c" path="/var/lib/kubelet/pods/3a2643aa-9180-475a-af73-8e7b311cc77c/volumes" Mar 13 15:06:14 crc 
kubenswrapper[4786]: I0313 15:06:14.591445 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.592125 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ed915e-9dfb-4a63-87c6-f21160cdca5f" path="/var/lib/kubelet/pods/b8ed915e-9dfb-4a63-87c6-f21160cdca5f/volumes" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.619952 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa2ebab-1c69-40c2-8668-986951876ab3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.620428 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa2ebab-1c69-40c2-8668-986951876ab3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.620512 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa2ebab-1c69-40c2-8668-986951876ab3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.643339 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa2ebab-1c69-40c2-8668-986951876ab3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.666169 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58956: no serving certificate available for the kubelet" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.680756 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.680805 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q84vs"] Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.681067 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.698821 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.704706 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.731880 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.746078 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.751567 4786 generic.go:334] "Generic (PLEG): container finished" podID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerID="6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e" exitCode=0 Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.751632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v9g8" event={"ID":"8e627f8c-63e5-4b85-9fff-0c205c96d0a4","Type":"ContainerDied","Data":"6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e"} Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.755762 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" event={"ID":"3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af","Type":"ContainerDied","Data":"a49f4d674975b1f1f2938e058fd57566f106ed1e94e146d1bb9dc5509194bd85"} Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.755792 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a49f4d674975b1f1f2938e058fd57566f106ed1e94e146d1bb9dc5509194bd85" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.755864 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.762947 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f7179ad3-2e69-4391-9fd7-642af219e9e4","Type":"ContainerStarted","Data":"e1a2db621629fbf299a6949c82d56b3286f0a0eecef7b6741808b0a575b5071f"} Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.762999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f7179ad3-2e69-4391-9fd7-642af219e9e4","Type":"ContainerStarted","Data":"00057843a6014b08d3332803484ce67e2b1669f45c9f096b07f400f6b26d4364"} Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.776074 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerStarted","Data":"8f53d36f53e486976ee0f94789c2aa607485d031ef1a18d023804d78eade2df4"} Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.778190 4786 generic.go:334] "Generic (PLEG): container finished" podID="38981c4c-48a9-497e-86b7-c3852574cae2" containerID="e6f949a05f768c96e5fe100ef65d0a13680aede8f46ec13f3bfd1f1336954776" exitCode=0 Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.778343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5lnt2" event={"ID":"38981c4c-48a9-497e-86b7-c3852574cae2","Type":"ContainerDied","Data":"e6f949a05f768c96e5fe100ef65d0a13680aede8f46ec13f3bfd1f1336954776"} Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.778373 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5lnt2" event={"ID":"38981c4c-48a9-497e-86b7-c3852574cae2","Type":"ContainerStarted","Data":"fcd107a86c9ef1b8d3575aaff317a54888c86f593afa803c1f38d337f50c723c"} Mar 13 15:06:14 
crc kubenswrapper[4786]: I0313 15:06:14.788798 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bg6tw" Mar 13 15:06:14 crc kubenswrapper[4786]: I0313 15:06:14.803326 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ffbml"] Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.001016 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.001304 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.002926 4786 patch_prober.go:28] interesting pod/console-f9d7485db-2tqr8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.002969 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2tqr8" podUID="0c6ae29c-e743-4193-bce1-22b4c5732f45" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.015190 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.015225 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.015327 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.015371 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.096607 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggggh"] Mar 13 15:06:15 crc kubenswrapper[4786]: W0313 15:06:15.114840 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5282ab3_536a_405e_93af_c9c16130ec87.slice/crio-3c9798dd8adcb6bf614170bb1a84f4cb4749a884055a5a23f8766bfbde8e18d2 WatchSource:0}: Error finding container 3c9798dd8adcb6bf614170bb1a84f4cb4749a884055a5a23f8766bfbde8e18d2: Status 404 returned error can't find the container with id 3c9798dd8adcb6bf614170bb1a84f4cb4749a884055a5a23f8766bfbde8e18d2 Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.143766 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.144710 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.144784 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.155244 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:15 crc kubenswrapper[4786]: W0313 15:06:15.193151 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbaa2ebab_1c69_40c2_8668_986951876ab3.slice/crio-e0fba8d490d9ac866ab11b1029c620100f8467703b3dac0625652bc1bee0de49 WatchSource:0}: Error finding container e0fba8d490d9ac866ab11b1029c620100f8467703b3dac0625652bc1bee0de49: Status 404 returned error can't find the container with id e0fba8d490d9ac866ab11b1029c620100f8467703b3dac0625652bc1bee0de49 Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.219067 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.221220 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:15 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:15 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:15 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.221262 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.256128 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn"] Mar 
13 15:06:15 crc kubenswrapper[4786]: W0313 15:06:15.274036 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb09caa91_70b5_4a6a_998b_7a30e489e356.slice/crio-3d8a9c108548c548c84b5c9511fd2c707f07ef12760475e24a58a255b545f3c2 WatchSource:0}: Error finding container 3d8a9c108548c548c84b5c9511fd2c707f07ef12760475e24a58a255b545f3c2: Status 404 returned error can't find the container with id 3d8a9c108548c548c84b5c9511fd2c707f07ef12760475e24a58a255b545f3c2 Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.563167 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.563297 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.563320 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.569424 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.569902 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.576038 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.664179 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.668533 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.678422 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.696549 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.704644 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.791778 4786 generic.go:334] "Generic (PLEG): container finished" podID="73d30051-975a-4329-8d7b-32d297b35218" containerID="2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3" exitCode=0 Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.791848 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerDied","Data":"2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.795733 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5282ab3-536a-405e-93af-c9c16130ec87" containerID="4cfddcb622682d8d31cd8af7407c391ba1c9f1c1ccd3df90a45ae38632f2cc1e" exitCode=0 Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.795902 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggggh" event={"ID":"b5282ab3-536a-405e-93af-c9c16130ec87","Type":"ContainerDied","Data":"4cfddcb622682d8d31cd8af7407c391ba1c9f1c1ccd3df90a45ae38632f2cc1e"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.795932 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ggggh" event={"ID":"b5282ab3-536a-405e-93af-c9c16130ec87","Type":"ContainerStarted","Data":"3c9798dd8adcb6bf614170bb1a84f4cb4749a884055a5a23f8766bfbde8e18d2"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.801957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"baa2ebab-1c69-40c2-8668-986951876ab3","Type":"ContainerStarted","Data":"e0fba8d490d9ac866ab11b1029c620100f8467703b3dac0625652bc1bee0de49"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.808770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" event={"ID":"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d","Type":"ContainerStarted","Data":"6c55db000e74d08a764730984d25dc63942d8af978bb0cf22fa3a7e1ccb4a2db"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.808808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" event={"ID":"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d","Type":"ContainerStarted","Data":"ce275b3c95baf9262e6476605c5f81f37f89bfa1626d3251bf4c13e02f056887"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.809461 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.831178 4786 generic.go:334] "Generic (PLEG): container finished" podID="f7179ad3-2e69-4391-9fd7-642af219e9e4" containerID="e1a2db621629fbf299a6949c82d56b3286f0a0eecef7b6741808b0a575b5071f" exitCode=0 Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.831321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f7179ad3-2e69-4391-9fd7-642af219e9e4","Type":"ContainerDied","Data":"e1a2db621629fbf299a6949c82d56b3286f0a0eecef7b6741808b0a575b5071f"} Mar 13 15:06:15 crc 
kubenswrapper[4786]: I0313 15:06:15.831539 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" podStartSLOduration=139.831525963 podStartE2EDuration="2m19.831525963s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:15.829301173 +0000 UTC m=+205.992512984" watchObservedRunningTime="2026-03-13 15:06:15.831525963 +0000 UTC m=+205.994737775" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.845197 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" event={"ID":"b09caa91-70b5-4a6a-998b-7a30e489e356","Type":"ContainerStarted","Data":"17f6c19866872e50c2e9a899bc0fb1abd0eb4cbd4a95a72f7f74c961a9932d9e"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.845235 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" event={"ID":"b09caa91-70b5-4a6a-998b-7a30e489e356","Type":"ContainerStarted","Data":"3d8a9c108548c548c84b5c9511fd2c707f07ef12760475e24a58a255b545f3c2"} Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.845249 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.854042 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5m4w2" Mar 13 15:06:15 crc kubenswrapper[4786]: I0313 15:06:15.907056 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" podStartSLOduration=3.907030921 podStartE2EDuration="3.907030921s" 
podCreationTimestamp="2026-03-13 15:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:15.881623582 +0000 UTC m=+206.044835393" watchObservedRunningTime="2026-03-13 15:06:15.907030921 +0000 UTC m=+206.070242732" Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.069120 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:16 crc kubenswrapper[4786]: W0313 15:06:16.119048 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-47db8467ef6d228c743332ad3b2144b17daa31cdc296a97204da9cd4f5d91859 WatchSource:0}: Error finding container 47db8467ef6d228c743332ad3b2144b17daa31cdc296a97204da9cd4f5d91859: Status 404 returned error can't find the container with id 47db8467ef6d228c743332ad3b2144b17daa31cdc296a97204da9cd4f5d91859 Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.230405 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:16 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:16 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:16 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.232286 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.884462 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e92d3ba2c77853e323b28d717f88a57586e65b54dc91ba12fed22bc9988de252"} Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.884521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"476f1034ca0ec1c94d182635dd152bfb9ea8f4dbaccfb7d848d42c6c353ec78b"} Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.908767 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6b5509e1acb1429841262a182c93afc9b73c85417de5aa87a549bd607cdaee69"} Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.908812 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"68c85e9541b1cde4f97b5965e26ffdc7dcde1d0bc051ab099927f0738a39803c"} Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.914667 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c04f8dba76bf12cea93f99e2aa652d286f3135611e51f46b1df512d8457531e0"} Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.914702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"47db8467ef6d228c743332ad3b2144b17daa31cdc296a97204da9cd4f5d91859"} Mar 13 15:06:16 crc kubenswrapper[4786]: 
I0313 15:06:16.915146 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.936230 4786 generic.go:334] "Generic (PLEG): container finished" podID="baa2ebab-1c69-40c2-8668-986951876ab3" containerID="ab4365d871dc8ccd850d3ea4ae6529ab43d477e67e31d078badfe1c04ca23fb9" exitCode=0 Mar 13 15:06:16 crc kubenswrapper[4786]: I0313 15:06:16.936913 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"baa2ebab-1c69-40c2-8668-986951876ab3","Type":"ContainerDied","Data":"ab4365d871dc8ccd850d3ea4ae6529ab43d477e67e31d078badfe1c04ca23fb9"} Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.223131 4786 patch_prober.go:28] interesting pod/router-default-5444994796-z752l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 15:06:17 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Mar 13 15:06:17 crc kubenswrapper[4786]: [+]process-running ok Mar 13 15:06:17 crc kubenswrapper[4786]: healthz check failed Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.223394 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z752l" podUID="8cf76d74-f1bd-446f-90fe-2006ac188804" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.266489 4786 ???:1] "http: TLS handshake error from 192.168.126.11:58968: no serving certificate available for the kubelet" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.361995 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.493575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7179ad3-2e69-4391-9fd7-642af219e9e4-kube-api-access\") pod \"f7179ad3-2e69-4391-9fd7-642af219e9e4\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.493626 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7179ad3-2e69-4391-9fd7-642af219e9e4-kubelet-dir\") pod \"f7179ad3-2e69-4391-9fd7-642af219e9e4\" (UID: \"f7179ad3-2e69-4391-9fd7-642af219e9e4\") " Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.493910 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7179ad3-2e69-4391-9fd7-642af219e9e4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f7179ad3-2e69-4391-9fd7-642af219e9e4" (UID: "f7179ad3-2e69-4391-9fd7-642af219e9e4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.510520 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7179ad3-2e69-4391-9fd7-642af219e9e4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f7179ad3-2e69-4391-9fd7-642af219e9e4" (UID: "f7179ad3-2e69-4391-9fd7-642af219e9e4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.595676 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7179ad3-2e69-4391-9fd7-642af219e9e4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.595717 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7179ad3-2e69-4391-9fd7-642af219e9e4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.955815 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.956052 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f7179ad3-2e69-4391-9fd7-642af219e9e4","Type":"ContainerDied","Data":"00057843a6014b08d3332803484ce67e2b1669f45c9f096b07f400f6b26d4364"} Mar 13 15:06:17 crc kubenswrapper[4786]: I0313 15:06:17.956279 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00057843a6014b08d3332803484ce67e2b1669f45c9f096b07f400f6b26d4364" Mar 13 15:06:18 crc kubenswrapper[4786]: I0313 15:06:18.229604 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:18 crc kubenswrapper[4786]: I0313 15:06:18.234523 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-z752l" Mar 13 15:06:18 crc kubenswrapper[4786]: I0313 15:06:18.510249 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod 
\"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:06:18 crc kubenswrapper[4786]: I0313 15:06:18.523119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ded4bfa-6d71-4c0f-982d-aee3c61c5612-metrics-certs\") pod \"network-metrics-daemon-2v688\" (UID: \"2ded4bfa-6d71-4c0f-982d-aee3c61c5612\") " pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:06:18 crc kubenswrapper[4786]: I0313 15:06:18.706183 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v688" Mar 13 15:06:20 crc kubenswrapper[4786]: I0313 15:06:20.033717 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-996mk" Mar 13 15:06:20 crc kubenswrapper[4786]: I0313 15:06:20.078258 4786 ???:1] "http: TLS handshake error from 192.168.126.11:53636: no serving certificate available for the kubelet" Mar 13 15:06:22 crc kubenswrapper[4786]: I0313 15:06:22.403578 4786 ???:1] "http: TLS handshake error from 192.168.126.11:53640: no serving certificate available for the kubelet" Mar 13 15:06:25 crc kubenswrapper[4786]: I0313 15:06:25.007478 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:25 crc kubenswrapper[4786]: I0313 15:06:25.011462 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2tqr8" Mar 13 15:06:25 crc kubenswrapper[4786]: I0313 15:06:25.015052 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 13 15:06:25 crc kubenswrapper[4786]: I0313 
15:06:25.015088 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 13 15:06:25 crc kubenswrapper[4786]: I0313 15:06:25.015297 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-2bj4z container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 13 15:06:25 crc kubenswrapper[4786]: I0313 15:06:25.015366 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2bj4z" podUID="7babe28d-a681-4e19-ba38-150d554380a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.256745 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.375804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa2ebab-1c69-40c2-8668-986951876ab3-kubelet-dir\") pod \"baa2ebab-1c69-40c2-8668-986951876ab3\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.375977 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa2ebab-1c69-40c2-8668-986951876ab3-kube-api-access\") pod \"baa2ebab-1c69-40c2-8668-986951876ab3\" (UID: \"baa2ebab-1c69-40c2-8668-986951876ab3\") " Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.376175 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa2ebab-1c69-40c2-8668-986951876ab3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "baa2ebab-1c69-40c2-8668-986951876ab3" (UID: "baa2ebab-1c69-40c2-8668-986951876ab3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.376341 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa2ebab-1c69-40c2-8668-986951876ab3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.382693 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa2ebab-1c69-40c2-8668-986951876ab3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "baa2ebab-1c69-40c2-8668-986951876ab3" (UID: "baa2ebab-1c69-40c2-8668-986951876ab3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:27 crc kubenswrapper[4786]: I0313 15:06:27.477950 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baa2ebab-1c69-40c2-8668-986951876ab3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:28 crc kubenswrapper[4786]: I0313 15:06:28.080124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"baa2ebab-1c69-40c2-8668-986951876ab3","Type":"ContainerDied","Data":"e0fba8d490d9ac866ab11b1029c620100f8467703b3dac0625652bc1bee0de49"} Mar 13 15:06:28 crc kubenswrapper[4786]: I0313 15:06:28.080171 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0fba8d490d9ac866ab11b1029c620100f8467703b3dac0625652bc1bee0de49" Mar 13 15:06:28 crc kubenswrapper[4786]: I0313 15:06:28.080233 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 15:06:30 crc kubenswrapper[4786]: I0313 15:06:30.428593 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b95c49568-qf5zm"] Mar 13 15:06:30 crc kubenswrapper[4786]: I0313 15:06:30.430775 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" podUID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" containerName="controller-manager" containerID="cri-o://d4c1f3a56b65c1e0f67131a60f570295e596acf415b11ea2a6caafa8145f673e" gracePeriod=30 Mar 13 15:06:30 crc kubenswrapper[4786]: I0313 15:06:30.443429 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn"] Mar 13 15:06:30 crc kubenswrapper[4786]: I0313 15:06:30.443686 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" podUID="b09caa91-70b5-4a6a-998b-7a30e489e356" containerName="route-controller-manager" containerID="cri-o://17f6c19866872e50c2e9a899bc0fb1abd0eb4cbd4a95a72f7f74c961a9932d9e" gracePeriod=30 Mar 13 15:06:32 crc kubenswrapper[4786]: I0313 15:06:32.106264 4786 generic.go:334] "Generic (PLEG): container finished" podID="b09caa91-70b5-4a6a-998b-7a30e489e356" containerID="17f6c19866872e50c2e9a899bc0fb1abd0eb4cbd4a95a72f7f74c961a9932d9e" exitCode=0 Mar 13 15:06:32 crc kubenswrapper[4786]: I0313 15:06:32.106341 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" event={"ID":"b09caa91-70b5-4a6a-998b-7a30e489e356","Type":"ContainerDied","Data":"17f6c19866872e50c2e9a899bc0fb1abd0eb4cbd4a95a72f7f74c961a9932d9e"} Mar 13 15:06:32 crc kubenswrapper[4786]: I0313 15:06:32.699508 4786 patch_prober.go:28] interesting pod/controller-manager-b95c49568-qf5zm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 13 15:06:32 crc kubenswrapper[4786]: I0313 15:06:32.699569 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" podUID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 13 15:06:33 crc kubenswrapper[4786]: I0313 15:06:33.115968 4786 generic.go:334] "Generic (PLEG): container finished" podID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" containerID="d4c1f3a56b65c1e0f67131a60f570295e596acf415b11ea2a6caafa8145f673e" exitCode=0 Mar 13 15:06:33 crc kubenswrapper[4786]: I0313 15:06:33.116034 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" event={"ID":"404f5a5d-c2e6-4a9e-898e-b87c1fb67005","Type":"ContainerDied","Data":"d4c1f3a56b65c1e0f67131a60f570295e596acf415b11ea2a6caafa8145f673e"} Mar 13 15:06:34 crc kubenswrapper[4786]: I0313 15:06:34.156031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:06:34 crc kubenswrapper[4786]: I0313 15:06:34.705276 4786 patch_prober.go:28] interesting pod/route-controller-manager-79dbc6cbcd-sz7nn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 13 15:06:34 crc kubenswrapper[4786]: I0313 15:06:34.705360 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" podUID="b09caa91-70b5-4a6a-998b-7a30e489e356" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 13 15:06:35 crc kubenswrapper[4786]: I0313 15:06:35.031418 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2bj4z" Mar 13 15:06:37 crc kubenswrapper[4786]: I0313 15:06:37.869005 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:06:37 crc kubenswrapper[4786]: I0313 15:06:37.869094 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:06:42 crc kubenswrapper[4786]: E0313 15:06:42.541816 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 15:06:42 crc kubenswrapper[4786]: E0313 15:06:42.543141 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:06:42 crc kubenswrapper[4786]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 15:06:42 crc kubenswrapper[4786]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nvkds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556906-bz7fd_openshift-infra(1eda6a73-a8cf-406d-ab33-394ec1982f4a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 15:06:42 crc kubenswrapper[4786]: > logger="UnhandledError" Mar 13 15:06:42 crc kubenswrapper[4786]: 
E0313 15:06:42.545020 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" podUID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.555205 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.570641 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.597975 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5"] Mar 13 15:06:42 crc kubenswrapper[4786]: E0313 15:06:42.598250 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7179ad3-2e69-4391-9fd7-642af219e9e4" containerName="pruner" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598262 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7179ad3-2e69-4391-9fd7-642af219e9e4" containerName="pruner" Mar 13 15:06:42 crc kubenswrapper[4786]: E0313 15:06:42.598275 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09caa91-70b5-4a6a-998b-7a30e489e356" containerName="route-controller-manager" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598283 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09caa91-70b5-4a6a-998b-7a30e489e356" containerName="route-controller-manager" Mar 13 15:06:42 crc kubenswrapper[4786]: E0313 15:06:42.598292 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa2ebab-1c69-40c2-8668-986951876ab3" containerName="pruner" Mar 13 
15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598298 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa2ebab-1c69-40c2-8668-986951876ab3" containerName="pruner" Mar 13 15:06:42 crc kubenswrapper[4786]: E0313 15:06:42.598308 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" containerName="controller-manager" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598314 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" containerName="controller-manager" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598400 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa2ebab-1c69-40c2-8668-986951876ab3" containerName="pruner" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598413 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" containerName="controller-manager" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598422 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09caa91-70b5-4a6a-998b-7a30e489e356" containerName="route-controller-manager" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598434 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7179ad3-2e69-4391-9fd7-642af219e9e4" containerName="pruner" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.598794 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603398 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mpnr\" (UniqueName: \"kubernetes.io/projected/b09caa91-70b5-4a6a-998b-7a30e489e356-kube-api-access-9mpnr\") pod \"b09caa91-70b5-4a6a-998b-7a30e489e356\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603462 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-client-ca\") pod \"b09caa91-70b5-4a6a-998b-7a30e489e356\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-proxy-ca-bundles\") pod \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6ng\" (UniqueName: \"kubernetes.io/projected/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-kube-api-access-tv6ng\") pod \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-serving-cert\") pod \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603649 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-config\") pod \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603666 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-config\") pod \"b09caa91-70b5-4a6a-998b-7a30e489e356\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603686 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09caa91-70b5-4a6a-998b-7a30e489e356-serving-cert\") pod \"b09caa91-70b5-4a6a-998b-7a30e489e356\" (UID: \"b09caa91-70b5-4a6a-998b-7a30e489e356\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.603705 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-client-ca\") pod \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\" (UID: \"404f5a5d-c2e6-4a9e-898e-b87c1fb67005\") " Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.604676 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-client-ca" (OuterVolumeSpecName: "client-ca") pod "404f5a5d-c2e6-4a9e-898e-b87c1fb67005" (UID: "404f5a5d-c2e6-4a9e-898e-b87c1fb67005"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.604742 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-config" (OuterVolumeSpecName: "config") pod "404f5a5d-c2e6-4a9e-898e-b87c1fb67005" (UID: "404f5a5d-c2e6-4a9e-898e-b87c1fb67005"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.605325 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-config" (OuterVolumeSpecName: "config") pod "b09caa91-70b5-4a6a-998b-7a30e489e356" (UID: "b09caa91-70b5-4a6a-998b-7a30e489e356"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.605319 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "404f5a5d-c2e6-4a9e-898e-b87c1fb67005" (UID: "404f5a5d-c2e6-4a9e-898e-b87c1fb67005"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.606396 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-client-ca" (OuterVolumeSpecName: "client-ca") pod "b09caa91-70b5-4a6a-998b-7a30e489e356" (UID: "b09caa91-70b5-4a6a-998b-7a30e489e356"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.625478 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "404f5a5d-c2e6-4a9e-898e-b87c1fb67005" (UID: "404f5a5d-c2e6-4a9e-898e-b87c1fb67005"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.627112 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-kube-api-access-tv6ng" (OuterVolumeSpecName: "kube-api-access-tv6ng") pod "404f5a5d-c2e6-4a9e-898e-b87c1fb67005" (UID: "404f5a5d-c2e6-4a9e-898e-b87c1fb67005"). InnerVolumeSpecName "kube-api-access-tv6ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.629057 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09caa91-70b5-4a6a-998b-7a30e489e356-kube-api-access-9mpnr" (OuterVolumeSpecName: "kube-api-access-9mpnr") pod "b09caa91-70b5-4a6a-998b-7a30e489e356" (UID: "b09caa91-70b5-4a6a-998b-7a30e489e356"). InnerVolumeSpecName "kube-api-access-9mpnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.631587 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09caa91-70b5-4a6a-998b-7a30e489e356-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b09caa91-70b5-4a6a-998b-7a30e489e356" (UID: "b09caa91-70b5-4a6a-998b-7a30e489e356"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.640901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5"] Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704595 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-serving-cert\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bmkw\" (UniqueName: \"kubernetes.io/projected/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-kube-api-access-4bmkw\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-config\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704699 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-client-ca\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " 
pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704734 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704743 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704754 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b09caa91-70b5-4a6a-998b-7a30e489e356-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704764 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704772 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mpnr\" (UniqueName: \"kubernetes.io/projected/b09caa91-70b5-4a6a-998b-7a30e489e356-kube-api-access-9mpnr\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704780 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b09caa91-70b5-4a6a-998b-7a30e489e356-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704789 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704797 4786 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6ng\" (UniqueName: \"kubernetes.io/projected/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-kube-api-access-tv6ng\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.704806 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/404f5a5d-c2e6-4a9e-898e-b87c1fb67005-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.805852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bmkw\" (UniqueName: \"kubernetes.io/projected/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-kube-api-access-4bmkw\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.805952 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-config\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.806001 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-client-ca\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.806064 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-serving-cert\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.807271 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-client-ca\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.807341 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-config\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.809193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-serving-cert\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.824281 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bmkw\" (UniqueName: \"kubernetes.io/projected/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-kube-api-access-4bmkw\") pod \"route-controller-manager-6568677dd7-wg9j5\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:42 crc 
kubenswrapper[4786]: I0313 15:06:42.905506 4786 ???:1] "http: TLS handshake error from 192.168.126.11:39098: no serving certificate available for the kubelet" Mar 13 15:06:42 crc kubenswrapper[4786]: I0313 15:06:42.956121 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.177830 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.177895 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn" event={"ID":"b09caa91-70b5-4a6a-998b-7a30e489e356","Type":"ContainerDied","Data":"3d8a9c108548c548c84b5c9511fd2c707f07ef12760475e24a58a255b545f3c2"} Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.177994 4786 scope.go:117] "RemoveContainer" containerID="17f6c19866872e50c2e9a899bc0fb1abd0eb4cbd4a95a72f7f74c961a9932d9e" Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.180051 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.180171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b95c49568-qf5zm" event={"ID":"404f5a5d-c2e6-4a9e-898e-b87c1fb67005","Type":"ContainerDied","Data":"70b7c1bad42595fac718399ba951c883a73332f5c6c3c9361125647bc45b5df2"} Mar 13 15:06:43 crc kubenswrapper[4786]: E0313 15:06:43.180785 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" podUID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.223144 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn"] Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.232164 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dbc6cbcd-sz7nn"] Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.237407 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b95c49568-qf5zm"] Mar 13 15:06:43 crc kubenswrapper[4786]: I0313 15:06:43.240657 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b95c49568-qf5zm"] Mar 13 15:06:44 crc kubenswrapper[4786]: E0313 15:06:44.069201 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 15:06:44 crc kubenswrapper[4786]: E0313 15:06:44.069352 4786 kuberuntime_manager.go:1274] "Unhandled Error" 
err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jgt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5lnt2_openshift-marketplace(38981c4c-48a9-497e-86b7-c3852574cae2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:44 crc kubenswrapper[4786]: E0313 15:06:44.070534 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5lnt2" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" Mar 13 15:06:44 crc kubenswrapper[4786]: I0313 15:06:44.564488 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="404f5a5d-c2e6-4a9e-898e-b87c1fb67005" path="/var/lib/kubelet/pods/404f5a5d-c2e6-4a9e-898e-b87c1fb67005/volumes" Mar 13 15:06:44 crc kubenswrapper[4786]: I0313 15:06:44.565009 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b09caa91-70b5-4a6a-998b-7a30e489e356" path="/var/lib/kubelet/pods/b09caa91-70b5-4a6a-998b-7a30e489e356/volumes" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.216896 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5kfqf" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.254711 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks"] Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.255626 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.260518 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.260553 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.261086 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.261480 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.261725 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.261956 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.265371 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks"] Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.267041 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.337062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-proxy-ca-bundles\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " 
pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.337134 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-client-ca\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.337173 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-serving-cert\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.337231 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4c4f\" (UniqueName: \"kubernetes.io/projected/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-kube-api-access-g4c4f\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.337298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-config\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.439226 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-config\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.439363 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-proxy-ca-bundles\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.440764 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-client-ca\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.440918 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-config\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.439394 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-client-ca\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.441139 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-serving-cert\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.441173 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-proxy-ca-bundles\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.441194 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4c4f\" (UniqueName: \"kubernetes.io/projected/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-kube-api-access-g4c4f\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.451144 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-serving-cert\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.459601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4c4f\" (UniqueName: \"kubernetes.io/projected/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-kube-api-access-g4c4f\") pod \"controller-manager-6b8c7fd94d-m4gks\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " 
pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: I0313 15:06:45.574974 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:45 crc kubenswrapper[4786]: E0313 15:06:45.761199 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5lnt2" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" Mar 13 15:06:45 crc kubenswrapper[4786]: E0313 15:06:45.852356 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 15:06:45 crc kubenswrapper[4786]: E0313 15:06:45.852543 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4pmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-92qtc_openshift-marketplace(0115a774-8b25-4e1d-9d6f-c4202035efa9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:45 crc kubenswrapper[4786]: E0313 15:06:45.854130 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-92qtc" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" Mar 13 15:06:47 crc 
kubenswrapper[4786]: E0313 15:06:47.052406 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-92qtc" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" Mar 13 15:06:47 crc kubenswrapper[4786]: E0313 15:06:47.126994 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 15:06:47 crc kubenswrapper[4786]: E0313 15:06:47.127191 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlvdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-k626p_openshift-marketplace(4672e3ad-80ad-4d20-89b6-b6d11c9eb508): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:47 crc kubenswrapper[4786]: E0313 15:06:47.128381 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-k626p" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" Mar 13 15:06:48 crc 
kubenswrapper[4786]: I0313 15:06:48.410361 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.411610 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.413566 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.413926 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.418676 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.485621 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/332d0c36-3655-49cf-811d-d925a22ccfc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.485685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/332d0c36-3655-49cf-811d-d925a22ccfc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.587303 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/332d0c36-3655-49cf-811d-d925a22ccfc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"332d0c36-3655-49cf-811d-d925a22ccfc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.587635 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/332d0c36-3655-49cf-811d-d925a22ccfc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.587742 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/332d0c36-3655-49cf-811d-d925a22ccfc7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.616846 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/332d0c36-3655-49cf-811d-d925a22ccfc7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:48 crc kubenswrapper[4786]: I0313 15:06:48.732475 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:50 crc kubenswrapper[4786]: I0313 15:06:50.379611 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks"] Mar 13 15:06:50 crc kubenswrapper[4786]: I0313 15:06:50.479837 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5"] Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.216565 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-k626p" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" Mar 13 15:06:51 crc kubenswrapper[4786]: I0313 15:06:51.407106 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2v688"] Mar 13 15:06:51 crc kubenswrapper[4786]: I0313 15:06:51.481220 4786 scope.go:117] "RemoveContainer" containerID="d4c1f3a56b65c1e0f67131a60f570295e596acf415b11ea2a6caafa8145f673e" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.648575 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.649043 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbmwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-q84vs_openshift-marketplace(73d30051-975a-4329-8d7b-32d297b35218): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.650487 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q84vs" podUID="73d30051-975a-4329-8d7b-32d297b35218" Mar 13 15:06:51 crc 
kubenswrapper[4786]: E0313 15:06:51.745880 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.746031 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-np77p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-kbj4r_openshift-marketplace(4bcce315-5828-4a7c-870f-6dd6518af3dd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.748879 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kbj4r" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" Mar 13 15:06:51 crc kubenswrapper[4786]: I0313 15:06:51.766753 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.841753 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.841999 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdlwr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5v9g8_openshift-marketplace(8e627f8c-63e5-4b85-9fff-0c205c96d0a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.843168 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5v9g8" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" Mar 13 15:06:51 crc 
kubenswrapper[4786]: I0313 15:06:51.892249 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5"] Mar 13 15:06:51 crc kubenswrapper[4786]: I0313 15:06:51.895102 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks"] Mar 13 15:06:51 crc kubenswrapper[4786]: W0313 15:06:51.900525 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd5b5b1_7572_4d4e_a412_41159bb7dc30.slice/crio-9403c2823212f91eaf2461111fcd3a0cc72c79216865a6e712df5ae19328e119 WatchSource:0}: Error finding container 9403c2823212f91eaf2461111fcd3a0cc72c79216865a6e712df5ae19328e119: Status 404 returned error can't find the container with id 9403c2823212f91eaf2461111fcd3a0cc72c79216865a6e712df5ae19328e119 Mar 13 15:06:51 crc kubenswrapper[4786]: W0313 15:06:51.901378 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a0edbe_ba3b_42e4_a73f_3fc35c0d3b87.slice/crio-09345e8bba71267b53f1e646417e989fdb3e726dab69dba7689181a48524a557 WatchSource:0}: Error finding container 09345e8bba71267b53f1e646417e989fdb3e726dab69dba7689181a48524a557: Status 404 returned error can't find the container with id 09345e8bba71267b53f1e646417e989fdb3e726dab69dba7689181a48524a557 Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.976534 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.976947 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ptld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ggggh_openshift-marketplace(b5282ab3-536a-405e-93af-c9c16130ec87): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 15:06:51 crc kubenswrapper[4786]: E0313 15:06:51.978089 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ggggh" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.238329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" event={"ID":"4bd5b5b1-7572-4d4e-a412-41159bb7dc30","Type":"ContainerStarted","Data":"1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.238369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" event={"ID":"4bd5b5b1-7572-4d4e-a412-41159bb7dc30","Type":"ContainerStarted","Data":"9403c2823212f91eaf2461111fcd3a0cc72c79216865a6e712df5ae19328e119"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.238466 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" podUID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" containerName="route-controller-manager" containerID="cri-o://1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6" gracePeriod=30 Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.239167 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.241338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" event={"ID":"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87","Type":"ContainerStarted","Data":"256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.241363 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" event={"ID":"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87","Type":"ContainerStarted","Data":"09345e8bba71267b53f1e646417e989fdb3e726dab69dba7689181a48524a557"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.241434 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" podUID="48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" containerName="controller-manager" containerID="cri-o://256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0" gracePeriod=30 Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.241627 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.246113 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.246461 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"332d0c36-3655-49cf-811d-d925a22ccfc7","Type":"ContainerStarted","Data":"1894e4573d447fd70396f8dcdaf73a118d60046dfdc140ddba17aec411ff00b2"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.246484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"332d0c36-3655-49cf-811d-d925a22ccfc7","Type":"ContainerStarted","Data":"88cc78426239755be714032332ae9dad95728664b659aab878a725b003a72056"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.250491 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v688" event={"ID":"2ded4bfa-6d71-4c0f-982d-aee3c61c5612","Type":"ContainerStarted","Data":"980aa27c4ccc963525cd7b842b3abc66cac71117de6461b5b1e82b36f17443c7"} Mar 
13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.250516 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v688" event={"ID":"2ded4bfa-6d71-4c0f-982d-aee3c61c5612","Type":"ContainerStarted","Data":"eebad8c4d53025b176afc06ae40413f68ccaeb9bde29d094488aadb0851581cc"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.250527 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v688" event={"ID":"2ded4bfa-6d71-4c0f-982d-aee3c61c5612","Type":"ContainerStarted","Data":"5952fae0f3c92008bf740ca895803fd2838e026dbfcc2108f907fecbaec210fe"} Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.253020 4786 generic.go:334] "Generic (PLEG): container finished" podID="44d91c62-557f-40c8-a725-33ff965bee1b" containerID="645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0" exitCode=0 Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.253121 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dkvg" event={"ID":"44d91c62-557f-40c8-a725-33ff965bee1b","Type":"ContainerDied","Data":"645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0"} Mar 13 15:06:52 crc kubenswrapper[4786]: E0313 15:06:52.255041 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5v9g8" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" Mar 13 15:06:52 crc kubenswrapper[4786]: E0313 15:06:52.255502 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kbj4r" 
podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" Mar 13 15:06:52 crc kubenswrapper[4786]: E0313 15:06:52.255922 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q84vs" podUID="73d30051-975a-4329-8d7b-32d297b35218" Mar 13 15:06:52 crc kubenswrapper[4786]: E0313 15:06:52.256089 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ggggh" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.269068 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" podStartSLOduration=22.269031402 podStartE2EDuration="22.269031402s" podCreationTimestamp="2026-03-13 15:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:52.261287461 +0000 UTC m=+242.424499272" watchObservedRunningTime="2026-03-13 15:06:52.269031402 +0000 UTC m=+242.432243213" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.294112 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2v688" podStartSLOduration=176.294094631 podStartE2EDuration="2m56.294094631s" podCreationTimestamp="2026-03-13 15:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:52.293379402 +0000 UTC m=+242.456591213" watchObservedRunningTime="2026-03-13 15:06:52.294094631 +0000 UTC 
m=+242.457306452" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.340880 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.340846829 podStartE2EDuration="4.340846829s" podCreationTimestamp="2026-03-13 15:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:52.321632218 +0000 UTC m=+242.484844029" watchObservedRunningTime="2026-03-13 15:06:52.340846829 +0000 UTC m=+242.504058780" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.397487 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" podStartSLOduration=22.397467855 podStartE2EDuration="22.397467855s" podCreationTimestamp="2026-03-13 15:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:52.397466005 +0000 UTC m=+242.560677826" watchObservedRunningTime="2026-03-13 15:06:52.397467855 +0000 UTC m=+242.560679666" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.446302 4786 patch_prober.go:28] interesting pod/route-controller-manager-6568677dd7-wg9j5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:59746->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.446350 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" podUID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:59746->10.217.0.57:8443: 
read: connection reset by peer" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.622473 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.648376 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"] Mar 13 15:06:52 crc kubenswrapper[4786]: E0313 15:06:52.648593 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" containerName="controller-manager" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.648605 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" containerName="controller-manager" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.648705 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" containerName="controller-manager" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.649041 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.655921 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"] Mar 13 15:06:52 crc kubenswrapper[4786]: E0313 15:06:52.673244 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod332d0c36_3655_49cf_811d_d925a22ccfc7.slice/crio-1894e4573d447fd70396f8dcdaf73a118d60046dfdc140ddba17aec411ff00b2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod332d0c36_3655_49cf_811d_d925a22ccfc7.slice/crio-conmon-1894e4573d447fd70396f8dcdaf73a118d60046dfdc140ddba17aec411ff00b2.scope\": RecentStats: unable to find data in memory cache]" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.692415 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6568677dd7-wg9j5_4bd5b5b1-7572-4d4e-a412-41159bb7dc30/route-controller-manager/0.log" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.692473 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741213 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-config\") pod \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741289 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4c4f\" (UniqueName: \"kubernetes.io/projected/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-kube-api-access-g4c4f\") pod \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741316 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bmkw\" (UniqueName: \"kubernetes.io/projected/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-kube-api-access-4bmkw\") pod \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741361 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-proxy-ca-bundles\") pod \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741386 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-serving-cert\") pod \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741432 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-config\") pod \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741472 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-client-ca\") pod \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\" (UID: \"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-client-ca\") pod \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741542 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-serving-cert\") pod \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\" (UID: \"4bd5b5b1-7572-4d4e-a412-41159bb7dc30\") " Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741758 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqks\" (UniqueName: \"kubernetes.io/projected/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-kube-api-access-bzqks\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741787 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-proxy-ca-bundles\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-client-ca\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741896 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-serving-cert\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741916 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-config\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.741990 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-client-ca" (OuterVolumeSpecName: "client-ca") pod "4bd5b5b1-7572-4d4e-a412-41159bb7dc30" (UID: "4bd5b5b1-7572-4d4e-a412-41159bb7dc30"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.742153 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-config" (OuterVolumeSpecName: "config") pod "4bd5b5b1-7572-4d4e-a412-41159bb7dc30" (UID: "4bd5b5b1-7572-4d4e-a412-41159bb7dc30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.742323 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-client-ca" (OuterVolumeSpecName: "client-ca") pod "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" (UID: "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.742432 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" (UID: "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.742510 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-config" (OuterVolumeSpecName: "config") pod "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" (UID: "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.746730 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-kube-api-access-g4c4f" (OuterVolumeSpecName: "kube-api-access-g4c4f") pod "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" (UID: "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87"). InnerVolumeSpecName "kube-api-access-g4c4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.746944 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-kube-api-access-4bmkw" (OuterVolumeSpecName: "kube-api-access-4bmkw") pod "4bd5b5b1-7572-4d4e-a412-41159bb7dc30" (UID: "4bd5b5b1-7572-4d4e-a412-41159bb7dc30"). InnerVolumeSpecName "kube-api-access-4bmkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.746825 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" (UID: "48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.746826 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4bd5b5b1-7572-4d4e-a412-41159bb7dc30" (UID: "4bd5b5b1-7572-4d4e-a412-41159bb7dc30"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842553 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-client-ca\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-serving-cert\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-config\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-proxy-ca-bundles\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqks\" (UniqueName: \"kubernetes.io/projected/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-kube-api-access-bzqks\") pod 
\"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842744 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842770 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842781 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842790 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842799 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4c4f\" (UniqueName: \"kubernetes.io/projected/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-kube-api-access-g4c4f\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842808 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bmkw\" (UniqueName: \"kubernetes.io/projected/4bd5b5b1-7572-4d4e-a412-41159bb7dc30-kube-api-access-4bmkw\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842818 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842826 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.842834 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.843647 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-client-ca\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.844020 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-config\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.844114 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-proxy-ca-bundles\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.847314 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-serving-cert\") pod 
\"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.858415 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqks\" (UniqueName: \"kubernetes.io/projected/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-kube-api-access-bzqks\") pod \"controller-manager-f955cc5d9-tb5ls\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:52 crc kubenswrapper[4786]: I0313 15:06:52.961701 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.261583 4786 generic.go:334] "Generic (PLEG): container finished" podID="48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" containerID="256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0" exitCode=0 Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.262065 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.262071 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" event={"ID":"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87","Type":"ContainerDied","Data":"256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0"} Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.262354 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks" event={"ID":"48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87","Type":"ContainerDied","Data":"09345e8bba71267b53f1e646417e989fdb3e726dab69dba7689181a48524a557"} Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.262378 4786 scope.go:117] "RemoveContainer" containerID="256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.267427 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6568677dd7-wg9j5_4bd5b5b1-7572-4d4e-a412-41159bb7dc30/route-controller-manager/0.log" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.267479 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" containerID="1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6" exitCode=255 Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.267529 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" event={"ID":"4bd5b5b1-7572-4d4e-a412-41159bb7dc30","Type":"ContainerDied","Data":"1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6"} Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.267555 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" event={"ID":"4bd5b5b1-7572-4d4e-a412-41159bb7dc30","Type":"ContainerDied","Data":"9403c2823212f91eaf2461111fcd3a0cc72c79216865a6e712df5ae19328e119"} Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.267600 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.274656 4786 generic.go:334] "Generic (PLEG): container finished" podID="332d0c36-3655-49cf-811d-d925a22ccfc7" containerID="1894e4573d447fd70396f8dcdaf73a118d60046dfdc140ddba17aec411ff00b2" exitCode=0 Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.274715 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"332d0c36-3655-49cf-811d-d925a22ccfc7","Type":"ContainerDied","Data":"1894e4573d447fd70396f8dcdaf73a118d60046dfdc140ddba17aec411ff00b2"} Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.278593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dkvg" event={"ID":"44d91c62-557f-40c8-a725-33ff965bee1b","Type":"ContainerStarted","Data":"98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2"} Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.291213 4786 scope.go:117] "RemoveContainer" containerID="256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0" Mar 13 15:06:53 crc kubenswrapper[4786]: E0313 15:06:53.291805 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0\": container with ID starting with 256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0 not found: ID does not exist" 
containerID="256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.291892 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0"} err="failed to get container status \"256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0\": rpc error: code = NotFound desc = could not find container \"256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0\": container with ID starting with 256bf82ebe4c68fbd311fb0bebda52771ad459656aa7935e0e1f2a8b0da76ed0 not found: ID does not exist" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.291943 4786 scope.go:117] "RemoveContainer" containerID="1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.308386 4786 scope.go:117] "RemoveContainer" containerID="1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.308545 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks"] Mar 13 15:06:53 crc kubenswrapper[4786]: E0313 15:06:53.308932 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6\": container with ID starting with 1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6 not found: ID does not exist" containerID="1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.308965 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6"} err="failed to get container status 
\"1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6\": rpc error: code = NotFound desc = could not find container \"1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6\": container with ID starting with 1f65c7953cbdeba438ff65de723055b72f8bbf820b6d632bb2bc1d275b6d17e6 not found: ID does not exist" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.321254 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b8c7fd94d-m4gks"] Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.324226 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5"] Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.327571 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6568677dd7-wg9j5"] Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.337592 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2dkvg" podStartSLOduration=3.150088387 podStartE2EDuration="43.337575393s" podCreationTimestamp="2026-03-13 15:06:10 +0000 UTC" firstStartedPulling="2026-03-13 15:06:12.486298823 +0000 UTC m=+202.649510624" lastFinishedPulling="2026-03-13 15:06:52.673785809 +0000 UTC m=+242.836997630" observedRunningTime="2026-03-13 15:06:53.335586089 +0000 UTC m=+243.498797920" watchObservedRunningTime="2026-03-13 15:06:53.337575393 +0000 UTC m=+243.500787194" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.362477 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"] Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.393151 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 15:06:53 crc kubenswrapper[4786]: E0313 15:06:53.393372 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" containerName="route-controller-manager" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.393385 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" containerName="route-controller-manager" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.393484 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" containerName="route-controller-manager" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.393834 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.400149 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.451423 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.451491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b158d4-083e-4d6e-9237-561014e45538-kube-api-access\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.451537 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-var-lock\") pod \"installer-9-crc\" (UID: 
\"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.552361 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b158d4-083e-4d6e-9237-561014e45538-kube-api-access\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.552469 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-var-lock\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.552556 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.552611 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-var-lock\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.552654 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.572644 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b158d4-083e-4d6e-9237-561014e45538-kube-api-access\") pod \"installer-9-crc\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.749692 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:06:53 crc kubenswrapper[4786]: I0313 15:06:53.963400 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 15:06:53 crc kubenswrapper[4786]: W0313 15:06:53.969805 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd9b158d4_083e_4d6e_9237_561014e45538.slice/crio-29dfe67a251df6ec533714d17e46cfdb13258d0fefc0c898114344ecb26efdb8 WatchSource:0}: Error finding container 29dfe67a251df6ec533714d17e46cfdb13258d0fefc0c898114344ecb26efdb8: Status 404 returned error can't find the container with id 29dfe67a251df6ec533714d17e46cfdb13258d0fefc0c898114344ecb26efdb8 Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.286496 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d9b158d4-083e-4d6e-9237-561014e45538","Type":"ContainerStarted","Data":"29dfe67a251df6ec533714d17e46cfdb13258d0fefc0c898114344ecb26efdb8"} Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.289593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" event={"ID":"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5","Type":"ContainerStarted","Data":"d06ff3c6a6c74d7597673a8dfb362b412cd0772664a67c8cf822d73e1d88cbad"} Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.290112 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.290160 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" event={"ID":"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5","Type":"ContainerStarted","Data":"d8fee653e903b2f3d0864bfbde0247e4d751e1562b6e0f582beed70a4c6842f4"} Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.295659 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.305355 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" podStartSLOduration=4.305340251 podStartE2EDuration="4.305340251s" podCreationTimestamp="2026-03-13 15:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:54.3041995 +0000 UTC m=+244.467411311" watchObservedRunningTime="2026-03-13 15:06:54.305340251 +0000 UTC m=+244.468552062" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.543658 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.559822 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87" path="/var/lib/kubelet/pods/48a0edbe-ba3b-42e4-a73f-3fc35c0d3b87/volumes" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.560698 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd5b5b1-7572-4d4e-a412-41159bb7dc30" path="/var/lib/kubelet/pods/4bd5b5b1-7572-4d4e-a412-41159bb7dc30/volumes" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.665803 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/332d0c36-3655-49cf-811d-d925a22ccfc7-kubelet-dir\") pod \"332d0c36-3655-49cf-811d-d925a22ccfc7\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.665947 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/332d0c36-3655-49cf-811d-d925a22ccfc7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "332d0c36-3655-49cf-811d-d925a22ccfc7" (UID: "332d0c36-3655-49cf-811d-d925a22ccfc7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.665986 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/332d0c36-3655-49cf-811d-d925a22ccfc7-kube-api-access\") pod \"332d0c36-3655-49cf-811d-d925a22ccfc7\" (UID: \"332d0c36-3655-49cf-811d-d925a22ccfc7\") " Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.666278 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/332d0c36-3655-49cf-811d-d925a22ccfc7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.670943 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332d0c36-3655-49cf-811d-d925a22ccfc7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "332d0c36-3655-49cf-811d-d925a22ccfc7" (UID: "332d0c36-3655-49cf-811d-d925a22ccfc7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:54 crc kubenswrapper[4786]: I0313 15:06:54.766721 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/332d0c36-3655-49cf-811d-d925a22ccfc7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.260727 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"] Mar 13 15:06:55 crc kubenswrapper[4786]: E0313 15:06:55.261053 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332d0c36-3655-49cf-811d-d925a22ccfc7" containerName="pruner" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.261077 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="332d0c36-3655-49cf-811d-d925a22ccfc7" containerName="pruner" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.261278 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="332d0c36-3655-49cf-811d-d925a22ccfc7" containerName="pruner" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.261824 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.264043 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.264444 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.265940 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.266634 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.267389 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.267921 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.271179 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"] Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.297569 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"332d0c36-3655-49cf-811d-d925a22ccfc7","Type":"ContainerDied","Data":"88cc78426239755be714032332ae9dad95728664b659aab878a725b003a72056"} Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.297632 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88cc78426239755be714032332ae9dad95728664b659aab878a725b003a72056" Mar 13 
15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.297583 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.298748 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d9b158d4-083e-4d6e-9237-561014e45538","Type":"ContainerStarted","Data":"d5eb24a29a14ffe8f94b59ec2687a623871207889c5d2f0ab66ab4a8b0c62c00"} Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.319991 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.31997413 podStartE2EDuration="2.31997413s" podCreationTimestamp="2026-03-13 15:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:55.317628116 +0000 UTC m=+245.480839927" watchObservedRunningTime="2026-03-13 15:06:55.31997413 +0000 UTC m=+245.483185941" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.376986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-config\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.377102 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-client-ca\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc 
kubenswrapper[4786]: I0313 15:06:55.377127 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxggx\" (UniqueName: \"kubernetes.io/projected/33a66bd1-e72a-4b76-9873-04a2be83f276-kube-api-access-hxggx\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.377426 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a66bd1-e72a-4b76-9873-04a2be83f276-serving-cert\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.478903 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-client-ca\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.478946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxggx\" (UniqueName: \"kubernetes.io/projected/33a66bd1-e72a-4b76-9873-04a2be83f276-kube-api-access-hxggx\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.478979 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/33a66bd1-e72a-4b76-9873-04a2be83f276-serving-cert\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.479018 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-config\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.479833 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-client-ca\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.480068 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-config\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.489328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a66bd1-e72a-4b76-9873-04a2be83f276-serving-cert\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 
15:06:55.502657 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxggx\" (UniqueName: \"kubernetes.io/projected/33a66bd1-e72a-4b76-9873-04a2be83f276-kube-api-access-hxggx\") pod \"route-controller-manager-548f59b895-zhkbc\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.588503 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.702203 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 15:06:55 crc kubenswrapper[4786]: I0313 15:06:55.988505 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"] Mar 13 15:06:56 crc kubenswrapper[4786]: I0313 15:06:56.306259 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" event={"ID":"1eda6a73-a8cf-406d-ab33-394ec1982f4a","Type":"ContainerStarted","Data":"f0706fa676c676803b709d131c215b589e47c181d768675fc2d4eff10a128f08"} Mar 13 15:06:56 crc kubenswrapper[4786]: I0313 15:06:56.307362 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" event={"ID":"33a66bd1-e72a-4b76-9873-04a2be83f276","Type":"ContainerStarted","Data":"b06be669316cda8c21035dcaf3005d554fbbb9f14007edba604041d451d02edc"} Mar 13 15:06:56 crc kubenswrapper[4786]: I0313 15:06:56.323490 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" podStartSLOduration=8.456872607 podStartE2EDuration="56.323473247s" podCreationTimestamp="2026-03-13 
15:06:00 +0000 UTC" firstStartedPulling="2026-03-13 15:06:07.594518828 +0000 UTC m=+197.757730639" lastFinishedPulling="2026-03-13 15:06:55.461119468 +0000 UTC m=+245.624331279" observedRunningTime="2026-03-13 15:06:56.322436889 +0000 UTC m=+246.485648710" watchObservedRunningTime="2026-03-13 15:06:56.323473247 +0000 UTC m=+246.486685058" Mar 13 15:06:56 crc kubenswrapper[4786]: I0313 15:06:56.495383 4786 csr.go:261] certificate signing request csr-v7qj2 is approved, waiting to be issued Mar 13 15:06:56 crc kubenswrapper[4786]: I0313 15:06:56.505633 4786 csr.go:257] certificate signing request csr-v7qj2 is issued Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.314001 4786 generic.go:334] "Generic (PLEG): container finished" podID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" containerID="f0706fa676c676803b709d131c215b589e47c181d768675fc2d4eff10a128f08" exitCode=0 Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.314076 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" event={"ID":"1eda6a73-a8cf-406d-ab33-394ec1982f4a","Type":"ContainerDied","Data":"f0706fa676c676803b709d131c215b589e47c181d768675fc2d4eff10a128f08"} Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.317135 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" event={"ID":"33a66bd1-e72a-4b76-9873-04a2be83f276","Type":"ContainerStarted","Data":"2099fda3d7c9919dc621e50ef177cd19e9ae45608c2d6703386465ad7493aed2"} Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.317532 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.323511 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 
15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.354040 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" podStartSLOduration=7.354020748 podStartE2EDuration="7.354020748s" podCreationTimestamp="2026-03-13 15:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:06:57.351805268 +0000 UTC m=+247.515017079" watchObservedRunningTime="2026-03-13 15:06:57.354020748 +0000 UTC m=+247.517232559" Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.506497 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-22 01:00:28.545940825 +0000 UTC Mar 13 15:06:57 crc kubenswrapper[4786]: I0313 15:06:57.506530 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6801h53m31.03941332s for next certificate rotation Mar 13 15:06:58 crc kubenswrapper[4786]: I0313 15:06:58.507092 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-12 16:07:52.057338399 +0000 UTC Mar 13 15:06:58 crc kubenswrapper[4786]: I0313 15:06:58.507132 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7321h0m53.550209306s for next certificate rotation Mar 13 15:06:58 crc kubenswrapper[4786]: I0313 15:06:58.590792 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:06:58 crc kubenswrapper[4786]: I0313 15:06:58.621132 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvkds\" (UniqueName: \"kubernetes.io/projected/1eda6a73-a8cf-406d-ab33-394ec1982f4a-kube-api-access-nvkds\") pod \"1eda6a73-a8cf-406d-ab33-394ec1982f4a\" (UID: \"1eda6a73-a8cf-406d-ab33-394ec1982f4a\") " Mar 13 15:06:58 crc kubenswrapper[4786]: I0313 15:06:58.634250 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eda6a73-a8cf-406d-ab33-394ec1982f4a-kube-api-access-nvkds" (OuterVolumeSpecName: "kube-api-access-nvkds") pod "1eda6a73-a8cf-406d-ab33-394ec1982f4a" (UID: "1eda6a73-a8cf-406d-ab33-394ec1982f4a"). InnerVolumeSpecName "kube-api-access-nvkds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:58 crc kubenswrapper[4786]: I0313 15:06:58.722889 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvkds\" (UniqueName: \"kubernetes.io/projected/1eda6a73-a8cf-406d-ab33-394ec1982f4a-kube-api-access-nvkds\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:59 crc kubenswrapper[4786]: I0313 15:06:59.328667 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" event={"ID":"1eda6a73-a8cf-406d-ab33-394ec1982f4a","Type":"ContainerDied","Data":"3dcb19c3a1321ecb5f357d620e857b85ba43d8b1489fb0a88ed50ba265e8284f"} Mar 13 15:06:59 crc kubenswrapper[4786]: I0313 15:06:59.328723 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dcb19c3a1321ecb5f357d620e857b85ba43d8b1489fb0a88ed50ba265e8284f" Mar 13 15:06:59 crc kubenswrapper[4786]: I0313 15:06:59.328766 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-bz7fd" Mar 13 15:07:01 crc kubenswrapper[4786]: I0313 15:07:01.060908 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:07:01 crc kubenswrapper[4786]: I0313 15:07:01.060982 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:07:01 crc kubenswrapper[4786]: I0313 15:07:01.610198 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:07:01 crc kubenswrapper[4786]: I0313 15:07:01.670243 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:07:02 crc kubenswrapper[4786]: I0313 15:07:02.345954 4786 generic.go:334] "Generic (PLEG): container finished" podID="38981c4c-48a9-497e-86b7-c3852574cae2" containerID="7b25b81b9a19c0fe81bf6b227aec3dee81a0b07ee827117252170eecabd2031b" exitCode=0 Mar 13 15:07:02 crc kubenswrapper[4786]: I0313 15:07:02.346025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5lnt2" event={"ID":"38981c4c-48a9-497e-86b7-c3852574cae2","Type":"ContainerDied","Data":"7b25b81b9a19c0fe81bf6b227aec3dee81a0b07ee827117252170eecabd2031b"} Mar 13 15:07:03 crc kubenswrapper[4786]: I0313 15:07:03.823551 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-b7krw"] Mar 13 15:07:06 crc kubenswrapper[4786]: I0313 15:07:06.368469 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5lnt2" event={"ID":"38981c4c-48a9-497e-86b7-c3852574cae2","Type":"ContainerStarted","Data":"779a35354582dfac2b21a29f7ee4fdaee1a05b2e618c58879f5d06437789de31"} Mar 13 15:07:06 crc kubenswrapper[4786]: I0313 15:07:06.390101 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5lnt2" podStartSLOduration=2.609116472 podStartE2EDuration="53.390084076s" podCreationTimestamp="2026-03-13 15:06:13 +0000 UTC" firstStartedPulling="2026-03-13 15:06:14.784178608 +0000 UTC m=+204.947390419" lastFinishedPulling="2026-03-13 15:07:05.565146212 +0000 UTC m=+255.728358023" observedRunningTime="2026-03-13 15:07:06.387373793 +0000 UTC m=+256.550585614" watchObservedRunningTime="2026-03-13 15:07:06.390084076 +0000 UTC m=+256.553295887" Mar 13 15:07:07 crc kubenswrapper[4786]: I0313 15:07:07.377380 4786 generic.go:334] "Generic (PLEG): container finished" podID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerID="d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb" exitCode=0 Mar 13 15:07:07 crc kubenswrapper[4786]: I0313 15:07:07.377851 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92qtc" event={"ID":"0115a774-8b25-4e1d-9d6f-c4202035efa9","Type":"ContainerDied","Data":"d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb"} Mar 13 15:07:07 crc kubenswrapper[4786]: I0313 15:07:07.868754 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:07:07 crc kubenswrapper[4786]: I0313 15:07:07.868841 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:07:09 crc kubenswrapper[4786]: I0313 15:07:09.391382 4786 generic.go:334] "Generic (PLEG): container 
finished" podID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerID="ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d" exitCode=0 Mar 13 15:07:09 crc kubenswrapper[4786]: I0313 15:07:09.391472 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbj4r" event={"ID":"4bcce315-5828-4a7c-870f-6dd6518af3dd","Type":"ContainerDied","Data":"ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d"} Mar 13 15:07:09 crc kubenswrapper[4786]: I0313 15:07:09.393900 4786 generic.go:334] "Generic (PLEG): container finished" podID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerID="79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4" exitCode=0 Mar 13 15:07:09 crc kubenswrapper[4786]: I0313 15:07:09.393968 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k626p" event={"ID":"4672e3ad-80ad-4d20-89b6-b6d11c9eb508","Type":"ContainerDied","Data":"79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4"} Mar 13 15:07:09 crc kubenswrapper[4786]: I0313 15:07:09.396218 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5282ab3-536a-405e-93af-c9c16130ec87" containerID="aebadc150a1750ddbb6f1719b8d3bd55e1abc71c0d4b08b7e82129aa424d53f9" exitCode=0 Mar 13 15:07:09 crc kubenswrapper[4786]: I0313 15:07:09.396258 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggggh" event={"ID":"b5282ab3-536a-405e-93af-c9c16130ec87","Type":"ContainerDied","Data":"aebadc150a1750ddbb6f1719b8d3bd55e1abc71c0d4b08b7e82129aa424d53f9"} Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 15:07:10.405275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerStarted","Data":"143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583"} Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 
15:07:10.408476 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92qtc" event={"ID":"0115a774-8b25-4e1d-9d6f-c4202035efa9","Type":"ContainerStarted","Data":"ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c"} Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 15:07:10.413452 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"] Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 15:07:10.413635 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" podUID="ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" containerName="controller-manager" containerID="cri-o://d06ff3c6a6c74d7597673a8dfb362b412cd0772664a67c8cf822d73e1d88cbad" gracePeriod=30 Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 15:07:10.434672 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"] Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 15:07:10.434930 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" podUID="33a66bd1-e72a-4b76-9873-04a2be83f276" containerName="route-controller-manager" containerID="cri-o://2099fda3d7c9919dc621e50ef177cd19e9ae45608c2d6703386465ad7493aed2" gracePeriod=30 Mar 13 15:07:10 crc kubenswrapper[4786]: I0313 15:07:10.482560 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-92qtc" podStartSLOduration=3.233375038 podStartE2EDuration="59.482538913s" podCreationTimestamp="2026-03-13 15:06:11 +0000 UTC" firstStartedPulling="2026-03-13 15:06:13.67190546 +0000 UTC m=+203.835117271" lastFinishedPulling="2026-03-13 15:07:09.921069335 +0000 UTC m=+260.084281146" observedRunningTime="2026-03-13 15:07:10.478531745 +0000 UTC 
m=+260.641743566" watchObservedRunningTime="2026-03-13 15:07:10.482538913 +0000 UTC m=+260.645750724" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.415305 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" containerID="d06ff3c6a6c74d7597673a8dfb362b412cd0772664a67c8cf822d73e1d88cbad" exitCode=0 Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.415632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" event={"ID":"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5","Type":"ContainerDied","Data":"d06ff3c6a6c74d7597673a8dfb362b412cd0772664a67c8cf822d73e1d88cbad"} Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.418547 4786 generic.go:334] "Generic (PLEG): container finished" podID="33a66bd1-e72a-4b76-9873-04a2be83f276" containerID="2099fda3d7c9919dc621e50ef177cd19e9ae45608c2d6703386465ad7493aed2" exitCode=0 Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.419025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" event={"ID":"33a66bd1-e72a-4b76-9873-04a2be83f276","Type":"ContainerDied","Data":"2099fda3d7c9919dc621e50ef177cd19e9ae45608c2d6703386465ad7493aed2"} Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.422635 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k626p" event={"ID":"4672e3ad-80ad-4d20-89b6-b6d11c9eb508","Type":"ContainerStarted","Data":"4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa"} Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.425093 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.425257 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-92qtc" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.433074 4786 generic.go:334] "Generic (PLEG): container finished" podID="73d30051-975a-4329-8d7b-32d297b35218" containerID="143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583" exitCode=0 Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.433781 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerDied","Data":"143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583"} Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.453075 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k626p" podStartSLOduration=3.241649742 podStartE2EDuration="1m0.453034073s" podCreationTimestamp="2026-03-13 15:06:11 +0000 UTC" firstStartedPulling="2026-03-13 15:06:13.682904458 +0000 UTC m=+203.846116269" lastFinishedPulling="2026-03-13 15:07:10.894288789 +0000 UTC m=+261.057500600" observedRunningTime="2026-03-13 15:07:11.449429516 +0000 UTC m=+261.612641327" watchObservedRunningTime="2026-03-13 15:07:11.453034073 +0000 UTC m=+261.616245904" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.636154 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k626p" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.636203 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k626p" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.918014 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.922990 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.958392 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"] Mar 13 15:07:11 crc kubenswrapper[4786]: E0313 15:07:11.958697 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" containerName="oc" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.958718 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" containerName="oc" Mar 13 15:07:11 crc kubenswrapper[4786]: E0313 15:07:11.958743 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" containerName="controller-manager" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.958754 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" containerName="controller-manager" Mar 13 15:07:11 crc kubenswrapper[4786]: E0313 15:07:11.958780 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a66bd1-e72a-4b76-9873-04a2be83f276" containerName="route-controller-manager" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.958796 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a66bd1-e72a-4b76-9873-04a2be83f276" containerName="route-controller-manager" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.958995 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" containerName="oc" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.959014 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" containerName="controller-manager" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.959035 4786 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="33a66bd1-e72a-4b76-9873-04a2be83f276" containerName="route-controller-manager" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.959573 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:11 crc kubenswrapper[4786]: I0313 15:07:11.982623 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"] Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002214 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzqks\" (UniqueName: \"kubernetes.io/projected/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-kube-api-access-bzqks\") pod \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002282 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-serving-cert\") pod \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-config\") pod \"33a66bd1-e72a-4b76-9873-04a2be83f276\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-client-ca\") pod \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002386 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxggx\" (UniqueName: \"kubernetes.io/projected/33a66bd1-e72a-4b76-9873-04a2be83f276-kube-api-access-hxggx\") pod \"33a66bd1-e72a-4b76-9873-04a2be83f276\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-client-ca\") pod \"33a66bd1-e72a-4b76-9873-04a2be83f276\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002445 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-proxy-ca-bundles\") pod \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002462 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-config\") pod \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\" (UID: \"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002489 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a66bd1-e72a-4b76-9873-04a2be83f276-serving-cert\") pod \"33a66bd1-e72a-4b76-9873-04a2be83f276\" (UID: \"33a66bd1-e72a-4b76-9873-04a2be83f276\") " Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002632 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-config\") pod \"route-controller-manager-9989746bc-q797k\" 
(UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.002693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-serving-cert\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-client-ca\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003360 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmvg5\" (UniqueName: \"kubernetes.io/projected/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-kube-api-access-xmvg5\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003355 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" (UID: "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003355 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-client-ca" (OuterVolumeSpecName: "client-ca") pod "33a66bd1-e72a-4b76-9873-04a2be83f276" (UID: "33a66bd1-e72a-4b76-9873-04a2be83f276"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003391 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" (UID: "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003705 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-config" (OuterVolumeSpecName: "config") pod "33a66bd1-e72a-4b76-9873-04a2be83f276" (UID: "33a66bd1-e72a-4b76-9873-04a2be83f276"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.003721 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-config" (OuterVolumeSpecName: "config") pod "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" (UID: "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.011327 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" (UID: "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.011375 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-kube-api-access-bzqks" (OuterVolumeSpecName: "kube-api-access-bzqks") pod "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" (UID: "ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5"). InnerVolumeSpecName "kube-api-access-bzqks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.011468 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a66bd1-e72a-4b76-9873-04a2be83f276-kube-api-access-hxggx" (OuterVolumeSpecName: "kube-api-access-hxggx") pod "33a66bd1-e72a-4b76-9873-04a2be83f276" (UID: "33a66bd1-e72a-4b76-9873-04a2be83f276"). InnerVolumeSpecName "kube-api-access-hxggx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.025888 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33a66bd1-e72a-4b76-9873-04a2be83f276-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33a66bd1-e72a-4b76-9873-04a2be83f276" (UID: "33a66bd1-e72a-4b76-9873-04a2be83f276"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104687 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmvg5\" (UniqueName: \"kubernetes.io/projected/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-kube-api-access-xmvg5\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104738 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-config\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-serving-cert\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104820 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-client-ca\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104873 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxggx\" (UniqueName: 
\"kubernetes.io/projected/33a66bd1-e72a-4b76-9873-04a2be83f276-kube-api-access-hxggx\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104885 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104894 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104902 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104911 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a66bd1-e72a-4b76-9873-04a2be83f276-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104919 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzqks\" (UniqueName: \"kubernetes.io/projected/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-kube-api-access-bzqks\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104927 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.104934 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a66bd1-e72a-4b76-9873-04a2be83f276-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:12 crc kubenswrapper[4786]: 
I0313 15:07:12.104941 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.105736 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-client-ca\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.105911 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-config\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.110900 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-serving-cert\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.137830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmvg5\" (UniqueName: \"kubernetes.io/projected/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-kube-api-access-xmvg5\") pod \"route-controller-manager-9989746bc-q797k\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.275708 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.441792 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc" event={"ID":"33a66bd1-e72a-4b76-9873-04a2be83f276","Type":"ContainerDied","Data":"b06be669316cda8c21035dcaf3005d554fbbb9f14007edba604041d451d02edc"}
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.441877 4786 scope.go:117] "RemoveContainer" containerID="2099fda3d7c9919dc621e50ef177cd19e9ae45608c2d6703386465ad7493aed2"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.441932 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.444523 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.445056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f955cc5d9-tb5ls" event={"ID":"ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5","Type":"ContainerDied","Data":"d8fee653e903b2f3d0864bfbde0247e4d751e1562b6e0f582beed70a4c6842f4"}
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.470547 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-92qtc" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:07:12 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:07:12 crc kubenswrapper[4786]: >
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.474686 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"]
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.479089 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f955cc5d9-tb5ls"]
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.484796 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"]
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.487400 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548f59b895-zhkbc"]
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.491539 4786 scope.go:117] "RemoveContainer" containerID="d06ff3c6a6c74d7597673a8dfb362b412cd0772664a67c8cf822d73e1d88cbad"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.558815 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a66bd1-e72a-4b76-9873-04a2be83f276" path="/var/lib/kubelet/pods/33a66bd1-e72a-4b76-9873-04a2be83f276/volumes"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.559368 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5" path="/var/lib/kubelet/pods/ce92a6df-b2c4-4328-8e9e-a62b7af8d2f5/volumes"
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.692061 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-k626p" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:07:12 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:07:12 crc kubenswrapper[4786]: >
Mar 13 15:07:12 crc kubenswrapper[4786]: W0313 15:07:12.731774 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058572a7_4afb_4f0f_ac7f_2e3979fe31a9.slice/crio-ab318944345bb6d30468389c724baeb40df637cce1f9ec0fb491cb858e3d101b WatchSource:0}: Error finding container ab318944345bb6d30468389c724baeb40df637cce1f9ec0fb491cb858e3d101b: Status 404 returned error can't find the container with id ab318944345bb6d30468389c724baeb40df637cce1f9ec0fb491cb858e3d101b
Mar 13 15:07:12 crc kubenswrapper[4786]: I0313 15:07:12.752093 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"]
Mar 13 15:07:13 crc kubenswrapper[4786]: I0313 15:07:13.457361 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" event={"ID":"058572a7-4afb-4f0f-ac7f-2e3979fe31a9","Type":"ContainerStarted","Data":"ab318944345bb6d30468389c724baeb40df637cce1f9ec0fb491cb858e3d101b"}
Mar 13 15:07:13 crc kubenswrapper[4786]: I0313 15:07:13.461130 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5lnt2"
Mar 13 15:07:13 crc kubenswrapper[4786]: I0313 15:07:13.461336 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5lnt2"
Mar 13 15:07:13 crc kubenswrapper[4786]: I0313 15:07:13.506924 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5lnt2"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.270554 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"]
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.271728 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.274770 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.275010 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.279108 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.279117 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.279195 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.279280 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.283058 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.287901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"]
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.333527 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-serving-cert\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.333607 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-client-ca\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.333662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-config\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.333695 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-proxy-ca-bundles\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.333751 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s2hm\" (UniqueName: \"kubernetes.io/projected/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-kube-api-access-4s2hm\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.435340 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s2hm\" (UniqueName: \"kubernetes.io/projected/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-kube-api-access-4s2hm\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.435411 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-proxy-ca-bundles\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.435514 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-serving-cert\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.435573 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-client-ca\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.435642 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-config\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.436569 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-client-ca\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.437102 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-proxy-ca-bundles\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.438088 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-config\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.453499 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-serving-cert\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.458928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s2hm\" (UniqueName: \"kubernetes.io/projected/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-kube-api-access-4s2hm\") pod \"controller-manager-779cb7cbc8-4q8bc\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.510743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5lnt2"
Mar 13 15:07:14 crc kubenswrapper[4786]: I0313 15:07:14.630601 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:15 crc kubenswrapper[4786]: I0313 15:07:15.030443 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"]
Mar 13 15:07:15 crc kubenswrapper[4786]: W0313 15:07:15.042356 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f03a0c_d5c7_45ba_80a6_9d7c0ec4266f.slice/crio-977a1ecad2cc44cedfb47cd2f8807c716fdcb1a3cd1a1c9c6087316274b861d3 WatchSource:0}: Error finding container 977a1ecad2cc44cedfb47cd2f8807c716fdcb1a3cd1a1c9c6087316274b861d3: Status 404 returned error can't find the container with id 977a1ecad2cc44cedfb47cd2f8807c716fdcb1a3cd1a1c9c6087316274b861d3
Mar 13 15:07:15 crc kubenswrapper[4786]: I0313 15:07:15.476603 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbj4r" event={"ID":"4bcce315-5828-4a7c-870f-6dd6518af3dd","Type":"ContainerStarted","Data":"17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4"}
Mar 13 15:07:15 crc kubenswrapper[4786]: I0313 15:07:15.478991 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" event={"ID":"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f","Type":"ContainerStarted","Data":"977a1ecad2cc44cedfb47cd2f8807c716fdcb1a3cd1a1c9c6087316274b861d3"}
Mar 13 15:07:16 crc kubenswrapper[4786]: I0313 15:07:16.485744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" event={"ID":"058572a7-4afb-4f0f-ac7f-2e3979fe31a9","Type":"ContainerStarted","Data":"4e7efe0e3c0b8d034e6a3c2c7249f5d3fdc289b6c227e2406f3a4011f254dc70"}
Mar 13 15:07:17 crc kubenswrapper[4786]: I0313 15:07:17.246130 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5lnt2"]
Mar 13 15:07:17 crc kubenswrapper[4786]: I0313 15:07:17.493607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" event={"ID":"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f","Type":"ContainerStarted","Data":"0856f33841fdcbf8a81c9e9ad54607fc44e158a95d14747ea77c2aa847035ca7"}
Mar 13 15:07:17 crc kubenswrapper[4786]: I0313 15:07:17.493789 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5lnt2" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="registry-server" containerID="cri-o://779a35354582dfac2b21a29f7ee4fdaee1a05b2e618c58879f5d06437789de31" gracePeriod=2
Mar 13 15:07:18 crc kubenswrapper[4786]: I0313 15:07:18.499140 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:18 crc kubenswrapper[4786]: I0313 15:07:18.505020 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:18 crc kubenswrapper[4786]: I0313 15:07:18.541504 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" podStartSLOduration=8.541484119 podStartE2EDuration="8.541484119s" podCreationTimestamp="2026-03-13 15:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:07:18.539867922 +0000 UTC m=+268.703079733" watchObservedRunningTime="2026-03-13 15:07:18.541484119 +0000 UTC m=+268.704695940"
Mar 13 15:07:18 crc kubenswrapper[4786]: I0313 15:07:18.563335 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kbj4r" podStartSLOduration=9.723016296 podStartE2EDuration="1m8.563318127s" podCreationTimestamp="2026-03-13 15:06:10 +0000 UTC" firstStartedPulling="2026-03-13 15:06:13.651518017 +0000 UTC m=+203.814729828" lastFinishedPulling="2026-03-13 15:07:12.491819838 +0000 UTC m=+262.655031659" observedRunningTime="2026-03-13 15:07:18.559834865 +0000 UTC m=+268.723046676" watchObservedRunningTime="2026-03-13 15:07:18.563318127 +0000 UTC m=+268.726529948"
Mar 13 15:07:19 crc kubenswrapper[4786]: I0313 15:07:19.506753 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:19 crc kubenswrapper[4786]: I0313 15:07:19.514049 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:19 crc kubenswrapper[4786]: I0313 15:07:19.530576 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" podStartSLOduration=9.530557845 podStartE2EDuration="9.530557845s" podCreationTimestamp="2026-03-13 15:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:07:19.529795128 +0000 UTC m=+269.693006999" watchObservedRunningTime="2026-03-13 15:07:19.530557845 +0000 UTC m=+269.693769666"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.246994 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kbj4r"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.248207 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kbj4r"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.314538 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kbj4r"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.486344 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-92qtc"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.525112 4786 generic.go:334] "Generic (PLEG): container finished" podID="38981c4c-48a9-497e-86b7-c3852574cae2" containerID="779a35354582dfac2b21a29f7ee4fdaee1a05b2e618c58879f5d06437789de31" exitCode=0
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.525364 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5lnt2" event={"ID":"38981c4c-48a9-497e-86b7-c3852574cae2","Type":"ContainerDied","Data":"779a35354582dfac2b21a29f7ee4fdaee1a05b2e618c58879f5d06437789de31"}
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.533334 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-92qtc"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.582619 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kbj4r"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.674787 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k626p"
Mar 13 15:07:21 crc kubenswrapper[4786]: I0313 15:07:21.719637 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k626p"
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.656398 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5lnt2"
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.736144 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-utilities\") pod \"38981c4c-48a9-497e-86b7-c3852574cae2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") "
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.736245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jgt7\" (UniqueName: \"kubernetes.io/projected/38981c4c-48a9-497e-86b7-c3852574cae2-kube-api-access-8jgt7\") pod \"38981c4c-48a9-497e-86b7-c3852574cae2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") "
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.736268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-catalog-content\") pod \"38981c4c-48a9-497e-86b7-c3852574cae2\" (UID: \"38981c4c-48a9-497e-86b7-c3852574cae2\") "
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.737683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-utilities" (OuterVolumeSpecName: "utilities") pod "38981c4c-48a9-497e-86b7-c3852574cae2" (UID: "38981c4c-48a9-497e-86b7-c3852574cae2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.741230 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38981c4c-48a9-497e-86b7-c3852574cae2-kube-api-access-8jgt7" (OuterVolumeSpecName: "kube-api-access-8jgt7") pod "38981c4c-48a9-497e-86b7-c3852574cae2" (UID: "38981c4c-48a9-497e-86b7-c3852574cae2"). InnerVolumeSpecName "kube-api-access-8jgt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.763273 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38981c4c-48a9-497e-86b7-c3852574cae2" (UID: "38981c4c-48a9-497e-86b7-c3852574cae2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.838096 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.838130 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jgt7\" (UniqueName: \"kubernetes.io/projected/38981c4c-48a9-497e-86b7-c3852574cae2-kube-api-access-8jgt7\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.838175 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38981c4c-48a9-497e-86b7-c3852574cae2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.851475 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-92qtc"]
Mar 13 15:07:22 crc kubenswrapper[4786]: I0313 15:07:22.851839 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-92qtc" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="registry-server" containerID="cri-o://ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c" gracePeriod=2
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.548224 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5lnt2"
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.548381 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5lnt2" event={"ID":"38981c4c-48a9-497e-86b7-c3852574cae2","Type":"ContainerDied","Data":"fcd107a86c9ef1b8d3575aaff317a54888c86f593afa803c1f38d337f50c723c"}
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.548905 4786 scope.go:117] "RemoveContainer" containerID="779a35354582dfac2b21a29f7ee4fdaee1a05b2e618c58879f5d06437789de31"
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.597001 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5lnt2"]
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.603844 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5lnt2"]
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.850906 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k626p"]
Mar 13 15:07:23 crc kubenswrapper[4786]: I0313 15:07:23.851168 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k626p" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="registry-server" containerID="cri-o://4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa" gracePeriod=2
Mar 13 15:07:24 crc kubenswrapper[4786]: I0313 15:07:24.560081 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" path="/var/lib/kubelet/pods/38981c4c-48a9-497e-86b7-c3852574cae2/volumes"
Mar 13 15:07:24 crc kubenswrapper[4786]: I0313 15:07:24.936220 4786 scope.go:117] "RemoveContainer" containerID="7b25b81b9a19c0fe81bf6b227aec3dee81a0b07ee827117252170eecabd2031b"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.002751 4786 scope.go:117] "RemoveContainer" containerID="e6f949a05f768c96e5fe100ef65d0a13680aede8f46ec13f3bfd1f1336954776"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.345418 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k626p"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.380930 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-catalog-content\") pod \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") "
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.380986 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-utilities\") pod \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") "
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.381055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlvdz\" (UniqueName: \"kubernetes.io/projected/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-kube-api-access-mlvdz\") pod \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\" (UID: \"4672e3ad-80ad-4d20-89b6-b6d11c9eb508\") "
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.382804 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-utilities" (OuterVolumeSpecName: "utilities") pod "4672e3ad-80ad-4d20-89b6-b6d11c9eb508" (UID: "4672e3ad-80ad-4d20-89b6-b6d11c9eb508"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.386079 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-kube-api-access-mlvdz" (OuterVolumeSpecName: "kube-api-access-mlvdz") pod "4672e3ad-80ad-4d20-89b6-b6d11c9eb508" (UID: "4672e3ad-80ad-4d20-89b6-b6d11c9eb508"). InnerVolumeSpecName "kube-api-access-mlvdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.388158 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-92qtc"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.443215 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4672e3ad-80ad-4d20-89b6-b6d11c9eb508" (UID: "4672e3ad-80ad-4d20-89b6-b6d11c9eb508"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.481701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-catalog-content\") pod \"0115a774-8b25-4e1d-9d6f-c4202035efa9\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") "
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.481750 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4pmd\" (UniqueName: \"kubernetes.io/projected/0115a774-8b25-4e1d-9d6f-c4202035efa9-kube-api-access-w4pmd\") pod \"0115a774-8b25-4e1d-9d6f-c4202035efa9\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") "
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.481825 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-utilities\") pod \"0115a774-8b25-4e1d-9d6f-c4202035efa9\" (UID: \"0115a774-8b25-4e1d-9d6f-c4202035efa9\") "
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.482057 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlvdz\" (UniqueName: \"kubernetes.io/projected/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-kube-api-access-mlvdz\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.482074 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.482083 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4672e3ad-80ad-4d20-89b6-b6d11c9eb508-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.482651 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-utilities" (OuterVolumeSpecName: "utilities") pod "0115a774-8b25-4e1d-9d6f-c4202035efa9" (UID: "0115a774-8b25-4e1d-9d6f-c4202035efa9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.486444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0115a774-8b25-4e1d-9d6f-c4202035efa9-kube-api-access-w4pmd" (OuterVolumeSpecName: "kube-api-access-w4pmd") pod "0115a774-8b25-4e1d-9d6f-c4202035efa9" (UID: "0115a774-8b25-4e1d-9d6f-c4202035efa9"). InnerVolumeSpecName "kube-api-access-w4pmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.541708 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0115a774-8b25-4e1d-9d6f-c4202035efa9" (UID: "0115a774-8b25-4e1d-9d6f-c4202035efa9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.559289 4786 generic.go:334] "Generic (PLEG): container finished" podID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerID="ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c" exitCode=0
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.559339 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-92qtc"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.559369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92qtc" event={"ID":"0115a774-8b25-4e1d-9d6f-c4202035efa9","Type":"ContainerDied","Data":"ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c"}
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.559403 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-92qtc" event={"ID":"0115a774-8b25-4e1d-9d6f-c4202035efa9","Type":"ContainerDied","Data":"90b78259b57dbe97ef65ec7e9ad8da36b01873f67aaad642637353d3bd4b4c7e"}
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.559420 4786 scope.go:117] "RemoveContainer" containerID="ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.561233 4786 generic.go:334] "Generic (PLEG): container finished" podID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerID="4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa" exitCode=0
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.561268 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k626p"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.561278 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k626p" event={"ID":"4672e3ad-80ad-4d20-89b6-b6d11c9eb508","Type":"ContainerDied","Data":"4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa"}
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.561331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k626p" event={"ID":"4672e3ad-80ad-4d20-89b6-b6d11c9eb508","Type":"ContainerDied","Data":"dcd5af43f7bfbadd48be6ef5bdbb7204cd196dfdb4feda95bf67c2f08cc26199"}
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.569035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerStarted","Data":"2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b"}
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.574477 4786 generic.go:334] "Generic (PLEG): container finished" podID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerID="6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c" exitCode=0
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.574698 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v9g8" event={"ID":"8e627f8c-63e5-4b85-9fff-0c205c96d0a4","Type":"ContainerDied","Data":"6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c"}
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.575526 4786 scope.go:117] "RemoveContainer" containerID="d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb"
Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.581461 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggggh"
event={"ID":"b5282ab3-536a-405e-93af-c9c16130ec87","Type":"ContainerStarted","Data":"d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8"} Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.582659 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.582683 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4pmd\" (UniqueName: \"kubernetes.io/projected/0115a774-8b25-4e1d-9d6f-c4202035efa9-kube-api-access-w4pmd\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.582716 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0115a774-8b25-4e1d-9d6f-c4202035efa9-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.590001 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q84vs" podStartSLOduration=3.378028497 podStartE2EDuration="1m12.58998254s" podCreationTimestamp="2026-03-13 15:06:13 +0000 UTC" firstStartedPulling="2026-03-13 15:06:15.792995728 +0000 UTC m=+205.956207539" lastFinishedPulling="2026-03-13 15:07:25.004949761 +0000 UTC m=+275.168161582" observedRunningTime="2026-03-13 15:07:25.588438586 +0000 UTC m=+275.751650397" watchObservedRunningTime="2026-03-13 15:07:25.58998254 +0000 UTC m=+275.753194351" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.603746 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ggggh" podStartSLOduration=2.397450487 podStartE2EDuration="1m11.603727884s" podCreationTimestamp="2026-03-13 15:06:14 +0000 UTC" firstStartedPulling="2026-03-13 15:06:15.798602531 +0000 UTC m=+205.961814342" 
lastFinishedPulling="2026-03-13 15:07:25.004879918 +0000 UTC m=+275.168091739" observedRunningTime="2026-03-13 15:07:25.602535362 +0000 UTC m=+275.765747173" watchObservedRunningTime="2026-03-13 15:07:25.603727884 +0000 UTC m=+275.766939695" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.610421 4786 scope.go:117] "RemoveContainer" containerID="1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.616415 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-92qtc"] Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.627317 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-92qtc"] Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.631133 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k626p"] Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.634099 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k626p"] Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.635678 4786 scope.go:117] "RemoveContainer" containerID="ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c" Mar 13 15:07:25 crc kubenswrapper[4786]: E0313 15:07:25.636118 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c\": container with ID starting with ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c not found: ID does not exist" containerID="ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.636145 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c"} err="failed 
to get container status \"ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c\": rpc error: code = NotFound desc = could not find container \"ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c\": container with ID starting with ead7738dee6ad981a2fe046dfe0c119a0f53ab65eb6e204c76d5eae400db674c not found: ID does not exist" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.636165 4786 scope.go:117] "RemoveContainer" containerID="d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb" Mar 13 15:07:25 crc kubenswrapper[4786]: E0313 15:07:25.636829 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb\": container with ID starting with d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb not found: ID does not exist" containerID="d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.636946 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb"} err="failed to get container status \"d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb\": rpc error: code = NotFound desc = could not find container \"d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb\": container with ID starting with d7790125cf65494389ec4c7a51cdf31804d9d4c6d721248408d52a216b08dcfb not found: ID does not exist" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.636981 4786 scope.go:117] "RemoveContainer" containerID="1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453" Mar 13 15:07:25 crc kubenswrapper[4786]: E0313 15:07:25.637543 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453\": container with ID starting with 1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453 not found: ID does not exist" containerID="1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.637565 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453"} err="failed to get container status \"1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453\": rpc error: code = NotFound desc = could not find container \"1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453\": container with ID starting with 1bbc7d88e79b9835f858439b960281246154a84a60d1545414a6def6d6ee1453 not found: ID does not exist" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.637578 4786 scope.go:117] "RemoveContainer" containerID="4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.651226 4786 scope.go:117] "RemoveContainer" containerID="79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.663481 4786 scope.go:117] "RemoveContainer" containerID="d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.676826 4786 scope.go:117] "RemoveContainer" containerID="4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa" Mar 13 15:07:25 crc kubenswrapper[4786]: E0313 15:07:25.677314 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa\": container with ID starting with 4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa not found: ID does not exist" 
containerID="4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.677358 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa"} err="failed to get container status \"4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa\": rpc error: code = NotFound desc = could not find container \"4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa\": container with ID starting with 4d68223691538002dbc03c7fe3804eb70accd448005ab52d063d31c2a5a793aa not found: ID does not exist" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.677391 4786 scope.go:117] "RemoveContainer" containerID="79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4" Mar 13 15:07:25 crc kubenswrapper[4786]: E0313 15:07:25.677707 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4\": container with ID starting with 79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4 not found: ID does not exist" containerID="79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.677740 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4"} err="failed to get container status \"79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4\": rpc error: code = NotFound desc = could not find container \"79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4\": container with ID starting with 79b4bd05a1089d31ac894ba09bfe075cfb63c51d057d5e9918ad431d2b428ae4 not found: ID does not exist" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.677765 4786 scope.go:117] 
"RemoveContainer" containerID="d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116" Mar 13 15:07:25 crc kubenswrapper[4786]: E0313 15:07:25.678177 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116\": container with ID starting with d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116 not found: ID does not exist" containerID="d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116" Mar 13 15:07:25 crc kubenswrapper[4786]: I0313 15:07:25.678207 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116"} err="failed to get container status \"d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116\": rpc error: code = NotFound desc = could not find container \"d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116\": container with ID starting with d10b939b11bfe957fb4fcd070c4216c9e8f87fb8a0f6162d44e686e78afe3116 not found: ID does not exist" Mar 13 15:07:26 crc kubenswrapper[4786]: I0313 15:07:26.562605 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" path="/var/lib/kubelet/pods/0115a774-8b25-4e1d-9d6f-c4202035efa9/volumes" Mar 13 15:07:26 crc kubenswrapper[4786]: I0313 15:07:26.563453 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" path="/var/lib/kubelet/pods/4672e3ad-80ad-4d20-89b6-b6d11c9eb508/volumes" Mar 13 15:07:26 crc kubenswrapper[4786]: I0313 15:07:26.594012 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v9g8" event={"ID":"8e627f8c-63e5-4b85-9fff-0c205c96d0a4","Type":"ContainerStarted","Data":"0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6"} Mar 13 15:07:26 crc 
kubenswrapper[4786]: I0313 15:07:26.624425 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5v9g8" podStartSLOduration=3.404374764 podStartE2EDuration="1m14.624396242s" podCreationTimestamp="2026-03-13 15:06:12 +0000 UTC" firstStartedPulling="2026-03-13 15:06:14.759490748 +0000 UTC m=+204.922702559" lastFinishedPulling="2026-03-13 15:07:25.979512206 +0000 UTC m=+276.142724037" observedRunningTime="2026-03-13 15:07:26.62403784 +0000 UTC m=+276.787249661" watchObservedRunningTime="2026-03-13 15:07:26.624396242 +0000 UTC m=+276.787608093" Mar 13 15:07:28 crc kubenswrapper[4786]: I0313 15:07:28.851341 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" podUID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" containerName="oauth-openshift" containerID="cri-o://ea936a8c2bcfc05af8d5e7564240645979c9f5072dfd95e5596acbc73a5d44c5" gracePeriod=15 Mar 13 15:07:30 crc kubenswrapper[4786]: I0313 15:07:30.443262 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"] Mar 13 15:07:30 crc kubenswrapper[4786]: I0313 15:07:30.443583 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" containerName="controller-manager" containerID="cri-o://0856f33841fdcbf8a81c9e9ad54607fc44e158a95d14747ea77c2aa847035ca7" gracePeriod=30 Mar 13 15:07:30 crc kubenswrapper[4786]: I0313 15:07:30.541138 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"] Mar 13 15:07:30 crc kubenswrapper[4786]: I0313 15:07:30.541753 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" 
podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" containerName="route-controller-manager" containerID="cri-o://4e7efe0e3c0b8d034e6a3c2c7249f5d3fdc289b6c227e2406f3a4011f254dc70" gracePeriod=30 Mar 13 15:07:30 crc kubenswrapper[4786]: I0313 15:07:30.614492 4786 generic.go:334] "Generic (PLEG): container finished" podID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" containerID="ea936a8c2bcfc05af8d5e7564240645979c9f5072dfd95e5596acbc73a5d44c5" exitCode=0 Mar 13 15:07:30 crc kubenswrapper[4786]: I0313 15:07:30.614539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" event={"ID":"3303beb2-619f-4973-b3c9-1f75a6e4e88c","Type":"ContainerDied","Data":"ea936a8c2bcfc05af8d5e7564240645979c9f5072dfd95e5596acbc73a5d44c5"} Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.130937 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164161 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-ocp-branding-template\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164237 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-trusted-ca-bundle\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164286 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-policies\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164323 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-idp-0-file-data\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164357 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-provider-selection\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164393 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-login\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-session\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164488 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-error\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.164509 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-dir\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165052 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-serving-cert\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrjdx\" (UniqueName: \"kubernetes.io/projected/3303beb2-619f-4973-b3c9-1f75a6e4e88c-kube-api-access-lrjdx\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165290 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165720 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-cliconfig\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165768 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-router-certs\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165821 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-service-ca\") pod \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\" (UID: \"3303beb2-619f-4973-b3c9-1f75a6e4e88c\") " Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.165756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.166107 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.166019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.166167 4786 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.166340 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.172048 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.173486 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.184439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.184569 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.184749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.187816 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.187826 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.187895 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3303beb2-619f-4973-b3c9-1f75a6e4e88c-kube-api-access-lrjdx" (OuterVolumeSpecName: "kube-api-access-lrjdx") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "kube-api-access-lrjdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.194444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3303beb2-619f-4973-b3c9-1f75a6e4e88c" (UID: "3303beb2-619f-4973-b3c9-1f75a6e4e88c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267684 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267725 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267740 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267753 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267769 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267783 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267795 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267807 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267819 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267831 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrjdx\" (UniqueName: \"kubernetes.io/projected/3303beb2-619f-4973-b3c9-1f75a6e4e88c-kube-api-access-lrjdx\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267843 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267870 4786 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.267882 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3303beb2-619f-4973-b3c9-1f75a6e4e88c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283060 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"] Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283240 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="extract-utilities" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283250 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="extract-utilities" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283263 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="extract-utilities" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283271 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="extract-utilities" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283283 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283289 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283296 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="extract-content" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283302 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="extract-content" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283311 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="extract-content" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283316 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="extract-content" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283328 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="extract-utilities" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283334 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="extract-utilities" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283342 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="extract-content" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283348 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="extract-content" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283356 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283361 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283370 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" containerName="oauth-openshift" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283375 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" containerName="oauth-openshift" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.283383 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283388 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283468 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0115a774-8b25-4e1d-9d6f-c4202035efa9" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283477 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" containerName="oauth-openshift" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283486 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4672e3ad-80ad-4d20-89b6-b6d11c9eb508" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283495 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="38981c4c-48a9-497e-86b7-c3852574cae2" containerName="registry-server" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.283812 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.295510 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"] Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369398 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369503 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-error\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369597 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-login\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369624 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-router-certs\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369752 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-service-ca\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-session\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: 
\"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369909 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zhg\" (UniqueName: \"kubernetes.io/projected/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-kube-api-access-94zhg\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369936 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.369999 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: 
I0313 15:07:31.370034 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-audit-dir\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.370070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-audit-policies\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471305 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471395 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471457 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-error\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471494 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471523 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-login\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-router-certs\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471590 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-service-ca\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " 
pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-session\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471652 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471693 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zhg\" (UniqueName: \"kubernetes.io/projected/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-kube-api-access-94zhg\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471780 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471823 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-audit-dir\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.471901 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-audit-policies\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.472238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-audit-dir\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.472763 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-service-ca\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 
15:07:31.472929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-audit-policies\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.473294 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.473476 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.476211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-error\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.476227 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-router-certs\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: 
\"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.476715 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.476848 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-login\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.477203 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.478336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.478685 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.480200 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-v4-0-config-system-session\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.498434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zhg\" (UniqueName: \"kubernetes.io/projected/8199eb78-7bf5-40c5-8f5b-65c1f49a610e-kube-api-access-94zhg\") pod \"oauth-openshift-55896b6b9d-lsgxh\" (UID: \"8199eb78-7bf5-40c5-8f5b-65c1f49a610e\") " pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.622736 4786 generic.go:334] "Generic (PLEG): container finished" podID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" containerID="4e7efe0e3c0b8d034e6a3c2c7249f5d3fdc289b6c227e2406f3a4011f254dc70" exitCode=0 Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.622916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" event={"ID":"058572a7-4afb-4f0f-ac7f-2e3979fe31a9","Type":"ContainerDied","Data":"4e7efe0e3c0b8d034e6a3c2c7249f5d3fdc289b6c227e2406f3a4011f254dc70"} Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.625045 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" 
event={"ID":"3303beb2-619f-4973-b3c9-1f75a6e4e88c","Type":"ContainerDied","Data":"45e9c57e28a98790760a0850f017d8374e2fcb5bf2bce35f53f0537ba307abab"} Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.625086 4786 scope.go:117] "RemoveContainer" containerID="ea936a8c2bcfc05af8d5e7564240645979c9f5072dfd95e5596acbc73a5d44c5" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.625195 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-b7krw" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.629697 4786 generic.go:334] "Generic (PLEG): container finished" podID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" containerID="0856f33841fdcbf8a81c9e9ad54607fc44e158a95d14747ea77c2aa847035ca7" exitCode=0 Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.629751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" event={"ID":"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f","Type":"ContainerDied","Data":"0856f33841fdcbf8a81c9e9ad54607fc44e158a95d14747ea77c2aa847035ca7"} Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.641403 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.704086 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-b7krw"] Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.711306 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-b7krw"] Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.947051 4786 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954337 4786 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954382 4786 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954616 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954635 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954644 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954651 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954664 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 15:07:31 crc 
kubenswrapper[4786]: I0313 15:07:31.954672 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954684 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954690 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954697 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954704 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954717 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954723 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954722 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.954738 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.954978 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.955016 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955026 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955179 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368" gracePeriod=15 Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955565 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17" gracePeriod=15 Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955701 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955716 4786 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955726 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955741 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955750 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955761 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955774 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955868 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5" gracePeriod=15 Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955896 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68" gracePeriod=15 Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.955840 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f" gracePeriod=15 Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.956470 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.956484 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: E0313 15:07:31.956500 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.956506 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.957569 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.967148 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 13 15:07:31 crc kubenswrapper[4786]: I0313 15:07:31.969337 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088144 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088395 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088430 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088476 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 
15:07:32.088522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088630 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.088690 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189330 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189388 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189433 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189483 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189532 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189501 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189586 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189625 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189657 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.189675 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.277001 4786 patch_prober.go:28] interesting pod/route-controller-manager-9989746bc-q797k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.277108 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 13 15:07:32 crc kubenswrapper[4786]: E0313 15:07:32.277935 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event=< Mar 13 15:07:32 crc kubenswrapper[4786]: &Event{ObjectMeta:{route-controller-manager-9989746bc-q797k.189c6f0c63849feb openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-9989746bc-q797k,UID:058572a7-4afb-4f0f-ac7f-2e3979fe31a9,APIVersion:v1,ResourceVersion:29657,FieldPath:spec.containers{route-controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.64:8443/healthz": dial tcp 10.217.0.64:8443: connect: connection refused Mar 13 15:07:32 crc kubenswrapper[4786]: body: Mar 13 15:07:32 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:07:32.277059563 +0000 UTC m=+282.440271394,LastTimestamp:2026-03-13 15:07:32.277059563 +0000 UTC m=+282.440271394,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:07:32 crc kubenswrapper[4786]: > Mar 13 15:07:32 crc kubenswrapper[4786]: E0313 15:07:32.394891 4786 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 15:07:32 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d): error adding pod 
openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d" Netns:"/var/run/netns/131f6785-23cc-470c-8983-bf090a0b2d3f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:32 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:32 crc kubenswrapper[4786]: > Mar 13 15:07:32 crc kubenswrapper[4786]: E0313 15:07:32.394972 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 15:07:32 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create 
pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d" Netns:"/var/run/netns/131f6785-23cc-470c-8983-bf090a0b2d3f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:32 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:32 crc kubenswrapper[4786]: > 
pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:32 crc kubenswrapper[4786]: E0313 15:07:32.394994 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 15:07:32 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d" Netns:"/var/run/netns/131f6785-23cc-470c-8983-bf090a0b2d3f" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:32 crc kubenswrapper[4786]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:32 crc kubenswrapper[4786]: > pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:32 crc kubenswrapper[4786]: E0313 15:07:32.395078 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d\\\" Netns:\\\"/var/run/netns/131f6785-23cc-470c-8983-bf090a0b2d3f\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=edb1b40223c27b16f5846bdb3655cad11ac231ecef6918bbf3ceaef2706a008d;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: 
[openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s\\\": dial tcp 38.102.83.12:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.559439 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3303beb2-619f-4973-b3c9-1f75a6e4e88c" path="/var/lib/kubelet/pods/3303beb2-619f-4973-b3c9-1f75a6e4e88c/volumes" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.637770 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.639259 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.639805 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5" exitCode=2 Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.639895 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:32 crc kubenswrapper[4786]: I0313 15:07:32.640351 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.055366 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5v9g8" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.055679 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5v9g8" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.119564 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5v9g8" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.120602 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.411162 4786 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 15:07:33 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025): error adding pod 
openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025" Netns:"/var/run/netns/8219ce86-d931-4e15-b82b-d1e66fbbb69e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:33 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:33 crc kubenswrapper[4786]: > Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.411225 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 15:07:33 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create 
pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025" Netns:"/var/run/netns/8219ce86-d931-4e15-b82b-d1e66fbbb69e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:33 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:33 crc kubenswrapper[4786]: > 
pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.411249 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 15:07:33 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025" Netns:"/var/run/netns/8219ce86-d931-4e15-b82b-d1e66fbbb69e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:33 crc kubenswrapper[4786]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:33 crc kubenswrapper[4786]: > pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.411302 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025\\\" Netns:\\\"/var/run/netns/8219ce86-d931-4e15-b82b-d1e66fbbb69e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=a31959443359eb5141a6de6adf1887b3655d23916ffe95e29000475f0fd72025;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: 
[openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s\\\": dial tcp 38.102.83.12:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.716055 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5v9g8" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.716852 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.763510 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.764696 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.765294 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.765639 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.765983 4786 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:33 crc kubenswrapper[4786]: I0313 15:07:33.766022 4786 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.766231 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="200ms" Mar 13 15:07:33 crc kubenswrapper[4786]: E0313 15:07:33.967968 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="400ms" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.211142 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.211545 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.336129 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.336842 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.337569 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.369915 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="800ms" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.664254 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.666553 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.667482 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17" exitCode=0 Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.667537 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68" exitCode=0 Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.667577 4786 scope.go:117] "RemoveContainer" containerID="26f592d0e61ba4767d9fc731764c1233fd83409d8e0a330a2a706db81f74af29" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.732688 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.732747 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.753036 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.753866 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection 
refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.754406 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.760667 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:34Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:34Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:34Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:34Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154e
dc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:78e7aaa968de1d93075fecf5ed08fed9f420858eba24a71e7b9e30879a155d48\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:de3bed9f6322bf8cc18821e9445419627e66ef6dd1e718f581649e6fd79a637f\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221770962},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea17
7225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\"
:504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.762130 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.762706 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.763275 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.763820 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: E0313 15:07:34.764053 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.782326 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.783198 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.783618 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.784197 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.799325 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ggggh" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.800218 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.800563 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.800963 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.801421 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 
crc kubenswrapper[4786]: I0313 15:07:34.862853 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.863715 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.864282 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.865087 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.865474 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.865905 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" 
pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.929421 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmvg5\" (UniqueName: \"kubernetes.io/projected/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-kube-api-access-xmvg5\") pod \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.929937 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-client-ca\") pod \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930022 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-serving-cert\") pod \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") " Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930089 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-config\") pod \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\" (UID: \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") " Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930119 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-serving-cert\") pod \"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\" (UID: 
\"058572a7-4afb-4f0f-ac7f-2e3979fe31a9\") "
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930207 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-proxy-ca-bundles\") pod \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") "
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930277 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s2hm\" (UniqueName: \"kubernetes.io/projected/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-kube-api-access-4s2hm\") pod \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") "
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930805 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" (UID: "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.931006 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.930804 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "058572a7-4afb-4f0f-ac7f-2e3979fe31a9" (UID: "058572a7-4afb-4f0f-ac7f-2e3979fe31a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.931338 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-config" (OuterVolumeSpecName: "config") pod "058572a7-4afb-4f0f-ac7f-2e3979fe31a9" (UID: "058572a7-4afb-4f0f-ac7f-2e3979fe31a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.949757 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" (UID: "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.962312 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "058572a7-4afb-4f0f-ac7f-2e3979fe31a9" (UID: "058572a7-4afb-4f0f-ac7f-2e3979fe31a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.962451 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-kube-api-access-xmvg5" (OuterVolumeSpecName: "kube-api-access-xmvg5") pod "058572a7-4afb-4f0f-ac7f-2e3979fe31a9" (UID: "058572a7-4afb-4f0f-ac7f-2e3979fe31a9"). InnerVolumeSpecName "kube-api-access-xmvg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:07:34 crc kubenswrapper[4786]: I0313 15:07:34.966615 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-kube-api-access-4s2hm" (OuterVolumeSpecName: "kube-api-access-4s2hm") pod "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" (UID: "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f"). InnerVolumeSpecName "kube-api-access-4s2hm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.031973 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-config\") pod \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") "
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032103 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-client-ca\") pod \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\" (UID: \"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f\") "
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032536 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s2hm\" (UniqueName: \"kubernetes.io/projected/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-kube-api-access-4s2hm\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032570 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmvg5\" (UniqueName: \"kubernetes.io/projected/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-kube-api-access-xmvg5\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032593 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032615 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032636 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032660 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/058572a7-4afb-4f0f-ac7f-2e3979fe31a9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032713 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-client-ca" (OuterVolumeSpecName: "client-ca") pod "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" (UID: "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.032755 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-config" (OuterVolumeSpecName: "config") pod "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" (UID: "97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.133694 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.133722 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:35 crc kubenswrapper[4786]: E0313 15:07:35.170346 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="1.6s"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.631767 4786 patch_prober.go:28] interesting pod/controller-manager-779cb7cbc8-4q8bc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: i/o timeout" start-of-body=
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.631822 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: i/o timeout"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.677735 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.678503 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f" exitCode=0
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.680415 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" event={"ID":"058572a7-4afb-4f0f-ac7f-2e3979fe31a9","Type":"ContainerDied","Data":"ab318944345bb6d30468389c724baeb40df637cce1f9ec0fb491cb858e3d101b"}
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.680518 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.682551 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.683053 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.684472 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.684918 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.685264 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" event={"ID":"97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f","Type":"ContainerDied","Data":"977a1ecad2cc44cedfb47cd2f8807c716fdcb1a3cd1a1c9c6087316274b861d3"}
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.686110 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.686588 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.687145 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.687542 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.688019 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.688368 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.688742 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.700896 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.701389 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.702058 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.702412 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.703093 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.703459 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.703928 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.704231 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.704499 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.704833 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.734814 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ggggh"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.736385 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.736600 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.736794 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.737130 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:35 crc kubenswrapper[4786]: I0313 15:07:35.737483 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.697393 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.698293 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368" exitCode=0
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.700380 4786 generic.go:334] "Generic (PLEG): container finished" podID="d9b158d4-083e-4d6e-9237-561014e45538" containerID="d5eb24a29a14ffe8f94b59ec2687a623871207889c5d2f0ab66ab4a8b0c62c00" exitCode=0
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.700499 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d9b158d4-083e-4d6e-9237-561014e45538","Type":"ContainerDied","Data":"d5eb24a29a14ffe8f94b59ec2687a623871207889c5d2f0ab66ab4a8b0c62c00"}
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.701258 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.701716 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.702171 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.702576 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.702919 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.703171 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:36 crc kubenswrapper[4786]: E0313 15:07:36.771579 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="3.2s"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.952847 4786 scope.go:117] "RemoveContainer" containerID="4e7efe0e3c0b8d034e6a3c2c7249f5d3fdc289b6c227e2406f3a4011f254dc70"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.987843 4786 scope.go:117] "RemoveContainer" containerID="0856f33841fdcbf8a81c9e9ad54607fc44e158a95d14747ea77c2aa847035ca7"
Mar 13 15:07:36 crc kubenswrapper[4786]: E0313 15:07:36.996398 4786 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 15:07:36 crc kubenswrapper[4786]: I0313 15:07:36.996902 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 15:07:37 crc kubenswrapper[4786]: W0313 15:07:37.039930 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b2f230278c6429d2eaca61d2d4d6297401ef46e120f641c725548671133eb33c WatchSource:0}: Error finding container b2f230278c6429d2eaca61d2d4d6297401ef46e120f641c725548671133eb33c: Status 404 returned error can't find the container with id b2f230278c6429d2eaca61d2d4d6297401ef46e120f641c725548671133eb33c
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.332553 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.334379 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.335515 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.336133 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.336588 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.337036 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.337556 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.338093 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.338439 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460288 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460371 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460414 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460575 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460645 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460631 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.460981 4786 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.461016 4786 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.461035 4786 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.712748 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.713800 4786 scope.go:117] "RemoveContainer" containerID="9f3430a685aad384bde7b94e96c39d58653818c088487320440e871cd2c6aa17"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.714043 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.717242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b2f230278c6429d2eaca61d2d4d6297401ef46e120f641c725548671133eb33c"}
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.735475 4786 scope.go:117] "RemoveContainer" containerID="53d36a5ca25e183fcb6db07dbf350f6ecbff2466c9c138ca491dab23ce9cec68"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.746375 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.747188 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.747612 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.748276 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.749054 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.749404 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.749708 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.765079 4786 scope.go:117] "RemoveContainer" containerID="1a28948a8bd38b4b1105a04a79e549677a87e9a2737be015f0379c2b8c520e6f"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.801626 4786 scope.go:117] "RemoveContainer" containerID="c2555e5e1fd6836961fd2e83d62ad6bbc9559b5e0294ebd6f56f5577024951f5"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.819824 4786 scope.go:117] "RemoveContainer" containerID="b028d3f7ca30676713833bdfe49004024901b92d171389947b3fe6ecafd56368"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.840903 4786 scope.go:117] "RemoveContainer" containerID="392c215d57bbc3ea8ed84f1f10d8640f40a40aafb0b10b6d7e4b9f0db8795aa8"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.868488 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.868888 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.868962 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.869964 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:07:37 crc kubenswrapper[4786]: I0313 15:07:37.870050 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513" gracePeriod=600
Mar 13 15:07:38 crc kubenswrapper[4786]: I0313
15:07:38.013108 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.014052 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.014410 4786 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.014686 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.015002 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.015413 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.015834 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.016076 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.087868 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b158d4-083e-4d6e-9237-561014e45538-kube-api-access\") pod \"d9b158d4-083e-4d6e-9237-561014e45538\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.087926 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-var-lock\") pod \"d9b158d4-083e-4d6e-9237-561014e45538\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.087966 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-kubelet-dir\") pod 
\"d9b158d4-083e-4d6e-9237-561014e45538\" (UID: \"d9b158d4-083e-4d6e-9237-561014e45538\") " Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.088020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-var-lock" (OuterVolumeSpecName: "var-lock") pod "d9b158d4-083e-4d6e-9237-561014e45538" (UID: "d9b158d4-083e-4d6e-9237-561014e45538"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.088144 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d9b158d4-083e-4d6e-9237-561014e45538" (UID: "d9b158d4-083e-4d6e-9237-561014e45538"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.088239 4786 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.088251 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9b158d4-083e-4d6e-9237-561014e45538-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.093072 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b158d4-083e-4d6e-9237-561014e45538-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d9b158d4-083e-4d6e-9237-561014e45538" (UID: "d9b158d4-083e-4d6e-9237-561014e45538"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.189529 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b158d4-083e-4d6e-9237-561014e45538-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.559917 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.731399 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af"} Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.732227 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: E0313 15:07:38.732587 4786 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.732784 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: 
connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.733353 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d9b158d4-083e-4d6e-9237-561014e45538","Type":"ContainerDied","Data":"29dfe67a251df6ec533714d17e46cfdb13258d0fefc0c898114344ecb26efdb8"} Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.733393 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29dfe67a251df6ec533714d17e46cfdb13258d0fefc0c898114344ecb26efdb8" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.733370 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.733455 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.734344 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.734821 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: 
connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.735336 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.737762 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.738292 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513" exitCode=0 Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.738297 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.738326 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513"} Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.738506 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"b8f95de79bd567add7c54012ca9cf8f384df817fb4c5a400948b699024ddf22f"} Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.739080 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.739556 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.740018 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.740490 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.740995 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" 
pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.741376 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.741778 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.742107 4786 status_manager.go:851] "Failed to get status for pod" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-zqb49\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.742432 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.742841 4786 status_manager.go:851] "Failed to get status for pod" 
podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:38 crc kubenswrapper[4786]: I0313 15:07:38.743179 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:39 crc kubenswrapper[4786]: E0313 15:07:39.747314 4786 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:07:39 crc kubenswrapper[4786]: E0313 15:07:39.973328 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="6.4s" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.555505 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.556351 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.557099 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.558150 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.558657 4786 status_manager.go:851] "Failed to get status for pod" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-zqb49\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.559083 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:40 crc kubenswrapper[4786]: I0313 15:07:40.560242 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" 
pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:41 crc kubenswrapper[4786]: E0313 15:07:41.570261 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event=< Mar 13 15:07:41 crc kubenswrapper[4786]: &Event{ObjectMeta:{route-controller-manager-9989746bc-q797k.189c6f0c63849feb openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-9989746bc-q797k,UID:058572a7-4afb-4f0f-ac7f-2e3979fe31a9,APIVersion:v1,ResourceVersion:29657,FieldPath:spec.containers{route-controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.64:8443/healthz": dial tcp 10.217.0.64:8443: connect: connection refused Mar 13 15:07:41 crc kubenswrapper[4786]: body: Mar 13 15:07:41 crc kubenswrapper[4786]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 15:07:32.277059563 +0000 UTC m=+282.440271394,LastTimestamp:2026-03-13 15:07:32.277059563 +0000 UTC m=+282.440271394,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 15:07:41 crc kubenswrapper[4786]: > Mar 13 15:07:44 crc kubenswrapper[4786]: E0313 15:07:44.827099 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:44Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:44Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:44Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:07:44Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:78e7aaa968de1d93075fecf5ed08fed9f420858eba24a71e7b9e30879a155d48\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:de3bed9f6322bf8cc18821e9445419627e66ef6dd1e718f581649e6fd79a637f\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221770962},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:44 crc kubenswrapper[4786]: E0313 15:07:44.828124 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:44 crc kubenswrapper[4786]: E0313 15:07:44.828729 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:44 crc kubenswrapper[4786]: E0313 15:07:44.829279 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:44 crc kubenswrapper[4786]: E0313 15:07:44.829674 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:44 crc kubenswrapper[4786]: E0313 15:07:44.829712 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 15:07:45 
crc kubenswrapper[4786]: E0313 15:07:45.570145 4786 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" volumeName="registry-storage" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.798585 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.801181 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.801265 4786 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4308da01317dc8c0bb829355fb3a6b66e96afffa93ac3f63f531c1405b22fa98" exitCode=1 Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.801327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4308da01317dc8c0bb829355fb3a6b66e96afffa93ac3f63f531c1405b22fa98"} Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.802229 4786 scope.go:117] "RemoveContainer" containerID="4308da01317dc8c0bb829355fb3a6b66e96afffa93ac3f63f531c1405b22fa98" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.802579 4786 status_manager.go:851] "Failed to get status for pod" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-zqb49\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.803407 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.804101 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.804672 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.805164 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.805797 4786 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.806330 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.807272 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:45 crc kubenswrapper[4786]: I0313 15:07:45.992968 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:07:46 crc kubenswrapper[4786]: E0313 15:07:46.375193 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="7s" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.551138 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.552003 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.552588 4786 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.553086 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.553540 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.554037 4786 status_manager.go:851] "Failed to get status for pod" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-zqb49\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.554623 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.555212 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.555644 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.573572 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.573600 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246" Mar 13 15:07:46 crc kubenswrapper[4786]: E0313 15:07:46.574044 4786 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.574756 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:46 crc kubenswrapper[4786]: W0313 15:07:46.600134 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-507d0f51a626f66457a04ccc8ed7634b0838218ebbb1f483f5253532057fb2e4 WatchSource:0}: Error finding container 507d0f51a626f66457a04ccc8ed7634b0838218ebbb1f483f5253532057fb2e4: Status 404 returned error can't find the container with id 507d0f51a626f66457a04ccc8ed7634b0838218ebbb1f483f5253532057fb2e4 Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.809191 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"507d0f51a626f66457a04ccc8ed7634b0838218ebbb1f483f5253532057fb2e4"} Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.811319 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.812495 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.812565 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4cb03de4ea328587953ecbf2a0551537ea8dd1191e2fbdbead3fa63df5e92585"} Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.813519 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.814000 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.814259 4786 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.814622 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.815010 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.815334 4786 status_manager.go:851] "Failed to get status for pod" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-zqb49\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.815654 4786 status_manager.go:851] "Failed to get status for pod" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:46 crc kubenswrapper[4786]: I0313 15:07:46.815980 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.551093 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.552196 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.823192 4786 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f7db9fbede9ba09b798b18324730bef7c97965844fa6320391b293c35cdd0afa" exitCode=0 Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.823588 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.823618 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.823357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f7db9fbede9ba09b798b18324730bef7c97965844fa6320391b293c35cdd0afa"} Mar 13 15:07:47 crc kubenswrapper[4786]: E0313 15:07:47.824335 4786 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.824397 4786 status_manager.go:851] "Failed to get status for pod" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-zqb49\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.824617 4786 status_manager.go:851] "Failed to get status for pod" 
podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" pod="openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-779cb7cbc8-4q8bc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.825058 4786 status_manager.go:851] "Failed to get status for pod" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" pod="openshift-route-controller-manager/route-controller-manager-9989746bc-q797k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9989746bc-q797k\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.825475 4786 status_manager.go:851] "Failed to get status for pod" podUID="d9b158d4-083e-4d6e-9237-561014e45538" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.825893 4786 status_manager.go:851] "Failed to get status for pod" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" pod="openshift-marketplace/redhat-marketplace-5v9g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5v9g8\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.826355 4786 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc 
kubenswrapper[4786]: I0313 15:07:47.826682 4786 status_manager.go:851] "Failed to get status for pod" podUID="73d30051-975a-4329-8d7b-32d297b35218" pod="openshift-marketplace/redhat-operators-q84vs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-q84vs\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:47 crc kubenswrapper[4786]: I0313 15:07:47.827245 4786 status_manager.go:851] "Failed to get status for pod" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" pod="openshift-marketplace/redhat-operators-ggggh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ggggh\": dial tcp 38.102.83.12:6443: connect: connection refused" Mar 13 15:07:48 crc kubenswrapper[4786]: E0313 15:07:48.014277 4786 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 15:07:48 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea" Netns:"/var/run/netns/1323c4f4-6bfd-4e5a-94b9-02daad081131" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: 
[openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:48 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:48 crc kubenswrapper[4786]: > Mar 13 15:07:48 crc kubenswrapper[4786]: E0313 15:07:48.014351 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 15:07:48 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea" Netns:"/var/run/netns/1323c4f4-6bfd-4e5a-94b9-02daad081131" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:48 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:48 crc kubenswrapper[4786]: > pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:48 crc kubenswrapper[4786]: E0313 15:07:48.014376 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 15:07:48 crc kubenswrapper[4786]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network "multus-cni-network": 
plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea" Netns:"/var/run/netns/1323c4f4-6bfd-4e5a-94b9-02daad081131" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s": dial tcp 38.102.83.12:6443: connect: connection refused Mar 13 15:07:48 crc kubenswrapper[4786]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 15:07:48 crc kubenswrapper[4786]: > pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:07:48 crc kubenswrapper[4786]: E0313 15:07:48.014446 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\" 
with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55896b6b9d-lsgxh_openshift-authentication_8199eb78-7bf5-40c5-8f5b-65c1f49a610e_0(c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea): error adding pod openshift-authentication_oauth-openshift-55896b6b9d-lsgxh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea\\\" Netns:\\\"/var/run/netns/1323c4f4-6bfd-4e5a-94b9-02daad081131\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55896b6b9d-lsgxh;K8S_POD_INFRA_CONTAINER_ID=c68fe6da1252c08c3708a6195a7e88ac53de2e28c44f5c5234e4b0a5e2a254ea;K8S_POD_UID=8199eb78-7bf5-40c5-8f5b-65c1f49a610e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh] networking: Multus: [openshift-authentication/oauth-openshift-55896b6b9d-lsgxh/8199eb78-7bf5-40c5-8f5b-65c1f49a610e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-55896b6b9d-lsgxh in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-55896b6b9d-lsgxh?timeout=1m0s\\\": dial tcp 38.102.83.12:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e"
Mar 13 15:07:48 crc kubenswrapper[4786]: I0313 15:07:48.830806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e9f0872b3943b85e144a2258d81eb0b5d196ee3ca0a0bf0e706a6f3e673aedde"}
Mar 13 15:07:49 crc kubenswrapper[4786]: I0313 15:07:49.840843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77b7a453820426944963f637c6d862da7b18fc777b6c63399a1e8b653c12ad08"}
Mar 13 15:07:49 crc kubenswrapper[4786]: I0313 15:07:49.840901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"06a324b21c4820e4a1bd11ae4d1c6b52ab3728cfc9c34b60eb11ee95c1902a1c"}
Mar 13 15:07:50 crc kubenswrapper[4786]: I0313 15:07:50.848707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8646bdeda69457d7b2619901699cb8521a22b462024ba4f828b2e57fa2b61ef8"}
Mar 13 15:07:50 crc kubenswrapper[4786]: I0313 15:07:50.849015 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246"
Mar 13 15:07:50 crc kubenswrapper[4786]: I0313 15:07:50.849055 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246"
Mar 13 15:07:50 crc kubenswrapper[4786]: I0313 15:07:50.849057 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:07:50 crc kubenswrapper[4786]: I0313 15:07:50.849069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"610ce93bfc7038cd1d25d6ab977f9060c881663753039ebe5ef7267ebbfd249d"}
Mar 13 15:07:51 crc kubenswrapper[4786]: I0313 15:07:51.575303 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:07:51 crc kubenswrapper[4786]: I0313 15:07:51.575462 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:07:51 crc kubenswrapper[4786]: I0313 15:07:51.580505 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]log ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]etcd ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/priority-and-fairness-filter ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-apiextensions-informers ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-apiextensions-controllers ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/crd-informer-synced ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-system-namespaces-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 13 15:07:51 crc kubenswrapper[4786]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/bootstrap-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-registration-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-discovery-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]autoregister-completion ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-openapi-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 13 15:07:51 crc kubenswrapper[4786]: livez check failed
Mar 13 15:07:51 crc kubenswrapper[4786]: I0313 15:07:51.580545 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 15:07:52 crc kubenswrapper[4786]: I0313 15:07:52.503332 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:07:55 crc kubenswrapper[4786]: I0313 15:07:55.857096 4786 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:07:55 crc kubenswrapper[4786]: I0313 15:07:55.888590 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246"
Mar 13 15:07:55 crc kubenswrapper[4786]: I0313 15:07:55.888614 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246"
Mar 13 15:07:55 crc kubenswrapper[4786]: I0313 15:07:55.905749 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="be8532e7-f257-4d0d-8767-49cec67037bd"
Mar 13 15:07:55 crc kubenswrapper[4786]: I0313 15:07:55.993487 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:07:55 crc kubenswrapper[4786]: I0313 15:07:55.997098 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:07:56 crc kubenswrapper[4786]: I0313 15:07:56.901788 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 15:08:02 crc kubenswrapper[4786]: I0313 15:08:02.555005 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"
Mar 13 15:08:02 crc kubenswrapper[4786]: I0313 15:08:02.556140 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"
Mar 13 15:08:02 crc kubenswrapper[4786]: W0313 15:08:02.837736 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8199eb78_7bf5_40c5_8f5b_65c1f49a610e.slice/crio-c4c7708f0ff0078449992725c66117b2d1dc57a67a6c208ee12dc15169b03754 WatchSource:0}: Error finding container c4c7708f0ff0078449992725c66117b2d1dc57a67a6c208ee12dc15169b03754: Status 404 returned error can't find the container with id c4c7708f0ff0078449992725c66117b2d1dc57a67a6c208ee12dc15169b03754
Mar 13 15:08:02 crc kubenswrapper[4786]: I0313 15:08:02.932839 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" event={"ID":"8199eb78-7bf5-40c5-8f5b-65c1f49a610e","Type":"ContainerStarted","Data":"c4c7708f0ff0078449992725c66117b2d1dc57a67a6c208ee12dc15169b03754"}
Mar 13 15:08:03 crc kubenswrapper[4786]: E0313 15:08:03.618166 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8199eb78_7bf5_40c5_8f5b_65c1f49a610e.slice/crio-conmon-a1b780cd1afdeab1f29dd9f4601b4f0160a9a853d0267473d69961e946011d45.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 15:08:03 crc kubenswrapper[4786]: I0313 15:08:03.948934 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55896b6b9d-lsgxh_8199eb78-7bf5-40c5-8f5b-65c1f49a610e/oauth-openshift/0.log"
Mar 13 15:08:03 crc kubenswrapper[4786]: I0313 15:08:03.949127 4786 generic.go:334] "Generic (PLEG): container finished" podID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e" containerID="a1b780cd1afdeab1f29dd9f4601b4f0160a9a853d0267473d69961e946011d45" exitCode=255
Mar 13 15:08:03 crc kubenswrapper[4786]: I0313 15:08:03.949228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" event={"ID":"8199eb78-7bf5-40c5-8f5b-65c1f49a610e","Type":"ContainerDied","Data":"a1b780cd1afdeab1f29dd9f4601b4f0160a9a853d0267473d69961e946011d45"}
Mar 13 15:08:03 crc kubenswrapper[4786]: I0313 15:08:03.950032 4786 scope.go:117] "RemoveContainer" containerID="a1b780cd1afdeab1f29dd9f4601b4f0160a9a853d0267473d69961e946011d45"
Mar 13 15:08:04 crc kubenswrapper[4786]: I0313 15:08:04.957150 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55896b6b9d-lsgxh_8199eb78-7bf5-40c5-8f5b-65c1f49a610e/oauth-openshift/0.log"
Mar 13 15:08:04 crc kubenswrapper[4786]: I0313 15:08:04.957587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" event={"ID":"8199eb78-7bf5-40c5-8f5b-65c1f49a610e","Type":"ContainerStarted","Data":"d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b"}
Mar 13 15:08:04 crc kubenswrapper[4786]: I0313 15:08:04.958112 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.021331 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.265278 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.269283 4786 patch_prober.go:28] interesting pod/oauth-openshift-55896b6b9d-lsgxh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": read tcp 10.217.0.2:41684->10.217.0.66:6443: read: connection reset by peer" start-of-body=
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.269370 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": read tcp 10.217.0.2:41684->10.217.0.66:6443: read: connection reset by peer"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.794145 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.919650 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.965851 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55896b6b9d-lsgxh_8199eb78-7bf5-40c5-8f5b-65c1f49a610e/oauth-openshift/1.log"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.966391 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55896b6b9d-lsgxh_8199eb78-7bf5-40c5-8f5b-65c1f49a610e/oauth-openshift/0.log"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.966436 4786 generic.go:334] "Generic (PLEG): container finished" podID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e" containerID="d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b" exitCode=255
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.966471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" event={"ID":"8199eb78-7bf5-40c5-8f5b-65c1f49a610e","Type":"ContainerDied","Data":"d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b"}
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.966514 4786 scope.go:117] "RemoveContainer" containerID="a1b780cd1afdeab1f29dd9f4601b4f0160a9a853d0267473d69961e946011d45"
Mar 13 15:08:05 crc kubenswrapper[4786]: I0313 15:08:05.967157 4786 scope.go:117] "RemoveContainer" containerID="d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b"
Mar 13 15:08:05 crc kubenswrapper[4786]: E0313 15:08:05.967527 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e"
Mar 13 15:08:06 crc kubenswrapper[4786]: I0313 15:08:06.050375 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 15:08:06 crc kubenswrapper[4786]: I0313 15:08:06.579463 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 13 15:08:06 crc kubenswrapper[4786]: I0313 15:08:06.614422 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 13 15:08:06 crc kubenswrapper[4786]: I0313 15:08:06.973472 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55896b6b9d-lsgxh_8199eb78-7bf5-40c5-8f5b-65c1f49a610e/oauth-openshift/1.log"
Mar 13 15:08:06 crc kubenswrapper[4786]: I0313 15:08:06.974109 4786 scope.go:117] "RemoveContainer" containerID="d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b"
Mar 13 15:08:06 crc kubenswrapper[4786]: E0313 15:08:06.974414 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e"
Mar 13 15:08:06 crc kubenswrapper[4786]: I0313 15:08:06.994108 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.302154 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.370620 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.776069 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.841575 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.897221 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.910517 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 15:08:07 crc kubenswrapper[4786]: I0313 15:08:07.993128 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.130363 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.169797 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.391977 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.767186 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.898393 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.941734 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.969830 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.974370 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 15:08:08 crc kubenswrapper[4786]: I0313 15:08:08.994481 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.021701 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.239687 4786 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.355302 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.397648 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.403216 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.456032 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.459773 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.599923 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.651425 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.738924 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.763182 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.827089 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.828366 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.858199 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.862619 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.868676 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.875323 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.882220 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.903107 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 15:08:09 crc kubenswrapper[4786]: I0313 15:08:09.961979 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.079474 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.145413 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.268737 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.313441 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.399289 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.697789 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.711015 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.788731 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.808755 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.841646 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.842158 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 13 15:08:10 crc kubenswrapper[4786]: I0313 15:08:10.907933 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.036270 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.139172 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.173392 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.198092 4786 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.233024 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.259548 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.266155 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.280621 4786 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.321877 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.339326 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.349166 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.427569 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.488543 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.506635 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.572121 4786 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.602706 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.607971 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.624206 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.642118 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.642927 4786 scope.go:117] "RemoveContainer" containerID="d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b"
Mar 13 15:08:11 crc kubenswrapper[4786]: E0313 15:08:11.643198 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.655306 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.659701 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.782096 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.861935 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 13 15:08:11 crc kubenswrapper[4786]: I0313 15:08:11.998807 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.058404 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.081354 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.182531 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.222739 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.275187 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.318361 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.372839 4786 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.383119 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-779cb7cbc8-4q8bc","openshift-route-controller-manager/route-controller-manager-9989746bc-q797k"]
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.386747 4786 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.386932 4786 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9f2e4a0c-c8c1-449c-baed-b06c9c647246"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.383235 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.391135 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55896b6b9d-lsgxh"]
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.392175 4786 scope.go:117] "RemoveContainer" containerID="d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b"
Mar 13 15:08:12 crc kubenswrapper[4786]: E0313 15:08:12.392625 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-55896b6b9d-lsgxh_openshift-authentication(8199eb78-7bf5-40c5-8f5b-65c1f49a610e)\"" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podUID="8199eb78-7bf5-40c5-8f5b-65c1f49a610e"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.394999 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.426962 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.426932608 podStartE2EDuration="17.426932608s" podCreationTimestamp="2026-03-13 15:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:08:12.421538686 +0000 UTC m=+322.584750527" watchObservedRunningTime="2026-03-13 15:08:12.426932608 +0000 UTC m=+322.590144429"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.464327 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.510767 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.564903 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" path="/var/lib/kubelet/pods/058572a7-4afb-4f0f-ac7f-2e3979fe31a9/volumes"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.566597 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" path="/var/lib/kubelet/pods/97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f/volumes"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.594686 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.619238 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.630285 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.749060 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.887190 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 15:08:12 crc kubenswrapper[4786]: I0313 15:08:12.992303 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.014560 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.130454 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.182004 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.236354 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.309021 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.371308 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.435644 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.474586 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13
15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.555219 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.600378 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.663482 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.723166 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.734605 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.758893 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.793312 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.807046 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.812886 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 15:08:13 crc kubenswrapper[4786]: I0313 15:08:13.860923 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 15:08:13 crc kubenswrapper[4786]: 
I0313 15:08:13.979939 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.061622 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.090203 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.112581 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.186969 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.204249 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.215249 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.252740 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.302325 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.302753 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.353396 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 15:08:14 crc 
kubenswrapper[4786]: I0313 15:08:14.575388 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.577175 4786 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.674178 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.682944 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.718641 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.748413 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.768264 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.784587 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.884497 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.923571 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.937799 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.945974 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 15:08:14 crc kubenswrapper[4786]: I0313 15:08:14.948023 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.080445 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.103258 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.155882 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.192023 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.458826 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.574738 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.583920 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.618058 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.644894 4786 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.673752 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.752631 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.780492 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.790294 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.854419 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.882172 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 15:08:15 crc kubenswrapper[4786]: I0313 15:08:15.887802 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.080498 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.088772 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.143551 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 15:08:16 crc kubenswrapper[4786]: 
I0313 15:08:16.329831 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.388737 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.397164 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.571623 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.579483 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.584275 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.657918 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.660962 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.732020 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.758407 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.860444 4786 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.880625 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.906217 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 15:08:16 crc kubenswrapper[4786]: I0313 15:08:16.963981 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.033511 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.072418 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.115648 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.162759 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.199618 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.318989 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.418141 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.477427 4786 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.572110 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.625047 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.687358 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.700043 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.880287 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 15:08:17 crc kubenswrapper[4786]: I0313 15:08:17.901632 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.026681 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.071443 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.135469 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.275655 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 
15:08:18.285116 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.341783 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.472000 4786 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.472352 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af" gracePeriod=5 Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.508581 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.615394 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.658631 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.682450 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.732166 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.795557 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.796696 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.817578 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 15:08:18 crc kubenswrapper[4786]: I0313 15:08:18.833168 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.042773 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.074345 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.095130 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.108773 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.128138 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.194485 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.214061 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.271099 4786 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.379039 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.470744 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.514048 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.520416 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.535536 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.550670 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.706754 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.911360 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 15:08:19 crc kubenswrapper[4786]: I0313 15:08:19.926373 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.037978 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 15:08:20 crc 
kubenswrapper[4786]: I0313 15:08:20.205573 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.282216 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.409193 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.450587 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.542959 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.591602 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.610968 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.649220 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.655765 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 15:08:20 crc kubenswrapper[4786]: I0313 15:08:20.726356 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 15:08:21 crc kubenswrapper[4786]: I0313 15:08:21.132669 4786 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 15:08:21 crc kubenswrapper[4786]: I0313 15:08:21.172684 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 15:08:21 crc kubenswrapper[4786]: I0313 15:08:21.248742 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 15:08:21 crc kubenswrapper[4786]: I0313 15:08:21.727746 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.055174 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.055333 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.084026 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.084097 4786 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af" exitCode=137 Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.084156 4786 scope.go:117] "RemoveContainer" containerID="281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.084209 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.107632 4786 scope.go:117] "RemoveContainer" containerID="281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af" Mar 13 15:08:24 crc kubenswrapper[4786]: E0313 15:08:24.108151 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af\": container with ID starting with 281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af not found: ID does not exist" containerID="281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.108208 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af"} err="failed to get container status \"281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af\": rpc error: code = NotFound desc = could not find container \"281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af\": container with ID starting with 281eebd2977ce36b57acf899e4ae13bb1d989160c33e1d1ca3140e4c6c70c8af not found: ID does not exist" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209120 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 
15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209213 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209255 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209263 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209350 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209375 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209466 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209592 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209729 4786 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209740 4786 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209748 4786 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.209756 4786 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.224758 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod 
"f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.310528 4786 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.554011 4786 scope.go:117] "RemoveContainer" containerID="d4f62e3295f614ca111fb37677bac2a111cdb9fd56a973ac780395e5ec211f7b" Mar 13 15:08:24 crc kubenswrapper[4786]: I0313 15:08:24.561810 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 15:08:25 crc kubenswrapper[4786]: I0313 15:08:25.095236 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-55896b6b9d-lsgxh_8199eb78-7bf5-40c5-8f5b-65c1f49a610e/oauth-openshift/1.log" Mar 13 15:08:25 crc kubenswrapper[4786]: I0313 15:08:25.095696 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" event={"ID":"8199eb78-7bf5-40c5-8f5b-65c1f49a610e","Type":"ContainerStarted","Data":"6ef957c5ff3173c190b45a0de1a625d1e31b289d682fce851d77ff41e18f2c6d"} Mar 13 15:08:25 crc kubenswrapper[4786]: I0313 15:08:25.096280 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:08:25 crc kubenswrapper[4786]: I0313 15:08:25.129648 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" podStartSLOduration=82.129625735 podStartE2EDuration="1m22.129625735s" podCreationTimestamp="2026-03-13 15:07:03 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:08:04.984720698 +0000 UTC m=+315.147932509" watchObservedRunningTime="2026-03-13 15:08:25.129625735 +0000 UTC m=+335.292837576" Mar 13 15:08:25 crc kubenswrapper[4786]: I0313 15:08:25.211247 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55896b6b9d-lsgxh" Mar 13 15:08:32 crc kubenswrapper[4786]: I0313 15:08:32.467912 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 15:08:32 crc kubenswrapper[4786]: I0313 15:08:32.579718 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 15:08:35 crc kubenswrapper[4786]: I0313 15:08:35.280143 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 15:08:35 crc kubenswrapper[4786]: I0313 15:08:35.415832 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 15:08:36 crc kubenswrapper[4786]: I0313 15:08:36.577923 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 15:08:42 crc kubenswrapper[4786]: I0313 15:08:42.197498 4786 generic.go:334] "Generic (PLEG): container finished" podID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerID="13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7" exitCode=0 Mar 13 15:08:42 crc kubenswrapper[4786]: I0313 15:08:42.197571 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" event={"ID":"cc2927b9-2e8a-4a34-90e4-932c1f6115c3","Type":"ContainerDied","Data":"13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7"} Mar 13 
15:08:42 crc kubenswrapper[4786]: I0313 15:08:42.198682 4786 scope.go:117] "RemoveContainer" containerID="13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7" Mar 13 15:08:42 crc kubenswrapper[4786]: I0313 15:08:42.606722 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 15:08:43 crc kubenswrapper[4786]: I0313 15:08:43.205349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" event={"ID":"cc2927b9-2e8a-4a34-90e4-932c1f6115c3","Type":"ContainerStarted","Data":"a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f"} Mar 13 15:08:43 crc kubenswrapper[4786]: I0313 15:08:43.206683 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:08:43 crc kubenswrapper[4786]: I0313 15:08:43.211794 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:08:43 crc kubenswrapper[4786]: I0313 15:08:43.326965 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.377455 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556908-ckrmj"] Mar 13 15:08:51 crc kubenswrapper[4786]: E0313 15:08:51.378205 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" containerName="controller-manager" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378220 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" containerName="controller-manager" Mar 13 15:08:51 crc kubenswrapper[4786]: E0313 15:08:51.378237 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9b158d4-083e-4d6e-9237-561014e45538" containerName="installer" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378243 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b158d4-083e-4d6e-9237-561014e45538" containerName="installer" Mar 13 15:08:51 crc kubenswrapper[4786]: E0313 15:08:51.378251 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" containerName="route-controller-manager" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378258 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" containerName="route-controller-manager" Mar 13 15:08:51 crc kubenswrapper[4786]: E0313 15:08:51.378271 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378277 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378371 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f03a0c-d5c7-45ba-80a6-9d7c0ec4266f" containerName="controller-manager" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378380 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="058572a7-4afb-4f0f-ac7f-2e3979fe31a9" containerName="route-controller-manager" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378388 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b158d4-083e-4d6e-9237-561014e45538" containerName="installer" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378395 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.378718 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.381368 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.381492 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.381916 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.388698 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt"] Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.393058 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.397132 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78c9bd6695-x22r7"] Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.397669 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.397902 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.397693 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.398350 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 
15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.398444 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.398561 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.398747 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.400376 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.402322 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.402345 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.410698 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-ckrmj"] Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.411063 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.411443 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.411469 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.413569 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.416550 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt"] Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.431196 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c9bd6695-x22r7"] Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.450298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mp9r\" (UniqueName: \"kubernetes.io/projected/73373245-c6fb-46a6-855e-321909bf5959-kube-api-access-9mp9r\") pod \"auto-csr-approver-29556908-ckrmj\" (UID: \"73373245-c6fb-46a6-855e-321909bf5959\") " pod="openshift-infra/auto-csr-approver-29556908-ckrmj" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.552186 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fa29b1d-7939-43c7-9239-7789d81673cd-client-ca\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.552421 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-client-ca\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.552650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mp9r\" (UniqueName: 
\"kubernetes.io/projected/73373245-c6fb-46a6-855e-321909bf5959-kube-api-access-9mp9r\") pod \"auto-csr-approver-29556908-ckrmj\" (UID: \"73373245-c6fb-46a6-855e-321909bf5959\") " pod="openshift-infra/auto-csr-approver-29556908-ckrmj" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.552747 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa29b1d-7939-43c7-9239-7789d81673cd-config\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.552923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa29b1d-7939-43c7-9239-7789d81673cd-serving-cert\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.553299 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnv8\" (UniqueName: \"kubernetes.io/projected/2fa29b1d-7939-43c7-9239-7789d81673cd-kube-api-access-vwnv8\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.553539 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-proxy-ca-bundles\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " 
pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.553656 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-config\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.553897 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjh47\" (UniqueName: \"kubernetes.io/projected/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-kube-api-access-cjh47\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.554015 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-serving-cert\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.583295 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mp9r\" (UniqueName: \"kubernetes.io/projected/73373245-c6fb-46a6-855e-321909bf5959-kube-api-access-9mp9r\") pod \"auto-csr-approver-29556908-ckrmj\" (UID: \"73373245-c6fb-46a6-855e-321909bf5959\") " pod="openshift-infra/auto-csr-approver-29556908-ckrmj" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.655078 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnv8\" (UniqueName: 
\"kubernetes.io/projected/2fa29b1d-7939-43c7-9239-7789d81673cd-kube-api-access-vwnv8\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.655139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-config\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.655162 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-proxy-ca-bundles\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.655197 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjh47\" (UniqueName: \"kubernetes.io/projected/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-kube-api-access-cjh47\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.656515 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-config\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.657293 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-proxy-ca-bundles\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.657377 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-serving-cert\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.657961 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fa29b1d-7939-43c7-9239-7789d81673cd-client-ca\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.658028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-client-ca\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.658146 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa29b1d-7939-43c7-9239-7789d81673cd-config\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " 
pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.658179 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa29b1d-7939-43c7-9239-7789d81673cd-serving-cert\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.660136 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa29b1d-7939-43c7-9239-7789d81673cd-config\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.660395 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fa29b1d-7939-43c7-9239-7789d81673cd-client-ca\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.661904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-serving-cert\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.664725 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2fa29b1d-7939-43c7-9239-7789d81673cd-serving-cert\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.667428 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-client-ca\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.675100 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjh47\" (UniqueName: \"kubernetes.io/projected/b9b3e005-c5de-4ec6-9597-3b2b38e513f4-kube-api-access-cjh47\") pod \"controller-manager-78c9bd6695-x22r7\" (UID: \"b9b3e005-c5de-4ec6-9597-3b2b38e513f4\") " pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.676938 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnv8\" (UniqueName: \"kubernetes.io/projected/2fa29b1d-7939-43c7-9239-7789d81673cd-kube-api-access-vwnv8\") pod \"route-controller-manager-59b9ccfd8d-ms8pt\" (UID: \"2fa29b1d-7939-43c7-9239-7789d81673cd\") " pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.709185 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.726272 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" Mar 13 15:08:51 crc kubenswrapper[4786]: I0313 15:08:51.737328 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" Mar 13 15:08:52 crc kubenswrapper[4786]: I0313 15:08:52.186300 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-ckrmj"] Mar 13 15:08:52 crc kubenswrapper[4786]: W0313 15:08:52.196349 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73373245_c6fb_46a6_855e_321909bf5959.slice/crio-f5d041ba8f220eac03f001ff054b06076a74edb95ab54a1deac06d85ac7da901 WatchSource:0}: Error finding container f5d041ba8f220eac03f001ff054b06076a74edb95ab54a1deac06d85ac7da901: Status 404 returned error can't find the container with id f5d041ba8f220eac03f001ff054b06076a74edb95ab54a1deac06d85ac7da901 Mar 13 15:08:52 crc kubenswrapper[4786]: I0313 15:08:52.258730 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt"] Mar 13 15:08:52 crc kubenswrapper[4786]: I0313 15:08:52.259368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" event={"ID":"73373245-c6fb-46a6-855e-321909bf5959","Type":"ContainerStarted","Data":"f5d041ba8f220eac03f001ff054b06076a74edb95ab54a1deac06d85ac7da901"} Mar 13 15:08:52 crc kubenswrapper[4786]: W0313 15:08:52.259929 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa29b1d_7939_43c7_9239_7789d81673cd.slice/crio-ed0114aff71a97746a475a45e77c9d8f82677832575c405a6b2b9a8ccf5f5cb1 WatchSource:0}: Error finding container ed0114aff71a97746a475a45e77c9d8f82677832575c405a6b2b9a8ccf5f5cb1: Status 404 returned 
error can't find the container with id ed0114aff71a97746a475a45e77c9d8f82677832575c405a6b2b9a8ccf5f5cb1
Mar 13 15:08:52 crc kubenswrapper[4786]: I0313 15:08:52.264828 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c9bd6695-x22r7"]
Mar 13 15:08:52 crc kubenswrapper[4786]: W0313 15:08:52.266995 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b3e005_c5de_4ec6_9597_3b2b38e513f4.slice/crio-fe1739c6f986f27e7b2baed122efebd26c887856e16855f62c942823417dcb2f WatchSource:0}: Error finding container fe1739c6f986f27e7b2baed122efebd26c887856e16855f62c942823417dcb2f: Status 404 returned error can't find the container with id fe1739c6f986f27e7b2baed122efebd26c887856e16855f62c942823417dcb2f
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.269828 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" event={"ID":"b9b3e005-c5de-4ec6-9597-3b2b38e513f4","Type":"ContainerStarted","Data":"79c5a39cd6ad1914831f2cb1cafa005b54934fe351ce610cce77ac35a843b4c7"}
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.270179 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" event={"ID":"b9b3e005-c5de-4ec6-9597-3b2b38e513f4","Type":"ContainerStarted","Data":"fe1739c6f986f27e7b2baed122efebd26c887856e16855f62c942823417dcb2f"}
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.271330 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7"
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.274103 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" event={"ID":"2fa29b1d-7939-43c7-9239-7789d81673cd","Type":"ContainerStarted","Data":"ee3adbb4c9e9f55747ae5b792a98cc56f3238140e229cec1a65c0c5ac0bf53d3"}
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.274186 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" event={"ID":"2fa29b1d-7939-43c7-9239-7789d81673cd","Type":"ContainerStarted","Data":"ed0114aff71a97746a475a45e77c9d8f82677832575c405a6b2b9a8ccf5f5cb1"}
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.274552 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt"
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.278005 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7"
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.282592 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt"
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.300446 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78c9bd6695-x22r7" podStartSLOduration=83.300417283 podStartE2EDuration="1m23.300417283s" podCreationTimestamp="2026-03-13 15:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:08:53.295403448 +0000 UTC m=+363.458615319" watchObservedRunningTime="2026-03-13 15:08:53.300417283 +0000 UTC m=+363.463629104"
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.335727 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59b9ccfd8d-ms8pt" podStartSLOduration=83.335707401 podStartE2EDuration="1m23.335707401s" podCreationTimestamp="2026-03-13 15:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:08:53.334290303 +0000 UTC m=+363.497502154" watchObservedRunningTime="2026-03-13 15:08:53.335707401 +0000 UTC m=+363.498919252"
Mar 13 15:08:53 crc kubenswrapper[4786]: I0313 15:08:53.543676 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 13 15:08:54 crc kubenswrapper[4786]: I0313 15:08:54.278683 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" event={"ID":"73373245-c6fb-46a6-855e-321909bf5959","Type":"ContainerStarted","Data":"b064e301025f633fca08b2c6c6d6715dd2fe77ad1d9b0ad6e22d7c4854ff4f2b"}
Mar 13 15:08:54 crc kubenswrapper[4786]: I0313 15:08:54.294722 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" podStartSLOduration=36.687383064 podStartE2EDuration="38.294704165s" podCreationTimestamp="2026-03-13 15:08:16 +0000 UTC" firstStartedPulling="2026-03-13 15:08:52.19888003 +0000 UTC m=+362.362091841" lastFinishedPulling="2026-03-13 15:08:53.806201091 +0000 UTC m=+363.969412942" observedRunningTime="2026-03-13 15:08:54.290653247 +0000 UTC m=+364.453865058" watchObservedRunningTime="2026-03-13 15:08:54.294704165 +0000 UTC m=+364.457915976"
Mar 13 15:08:54 crc kubenswrapper[4786]: I0313 15:08:54.985523 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 15:08:55 crc kubenswrapper[4786]: I0313 15:08:55.285525 4786 generic.go:334] "Generic (PLEG): container finished" podID="73373245-c6fb-46a6-855e-321909bf5959" containerID="b064e301025f633fca08b2c6c6d6715dd2fe77ad1d9b0ad6e22d7c4854ff4f2b" exitCode=0
Mar 13 15:08:55 crc kubenswrapper[4786]: I0313 15:08:55.285598 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" event={"ID":"73373245-c6fb-46a6-855e-321909bf5959","Type":"ContainerDied","Data":"b064e301025f633fca08b2c6c6d6715dd2fe77ad1d9b0ad6e22d7c4854ff4f2b"}
Mar 13 15:08:56 crc kubenswrapper[4786]: I0313 15:08:56.274592 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 15:08:56 crc kubenswrapper[4786]: I0313 15:08:56.596201 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-ckrmj"
Mar 13 15:08:56 crc kubenswrapper[4786]: I0313 15:08:56.735442 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mp9r\" (UniqueName: \"kubernetes.io/projected/73373245-c6fb-46a6-855e-321909bf5959-kube-api-access-9mp9r\") pod \"73373245-c6fb-46a6-855e-321909bf5959\" (UID: \"73373245-c6fb-46a6-855e-321909bf5959\") "
Mar 13 15:08:56 crc kubenswrapper[4786]: I0313 15:08:56.740141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73373245-c6fb-46a6-855e-321909bf5959-kube-api-access-9mp9r" (OuterVolumeSpecName: "kube-api-access-9mp9r") pod "73373245-c6fb-46a6-855e-321909bf5959" (UID: "73373245-c6fb-46a6-855e-321909bf5959"). InnerVolumeSpecName "kube-api-access-9mp9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:08:56 crc kubenswrapper[4786]: I0313 15:08:56.836795 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mp9r\" (UniqueName: \"kubernetes.io/projected/73373245-c6fb-46a6-855e-321909bf5959-kube-api-access-9mp9r\") on node \"crc\" DevicePath \"\""
Mar 13 15:08:57 crc kubenswrapper[4786]: I0313 15:08:57.300996 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-ckrmj" event={"ID":"73373245-c6fb-46a6-855e-321909bf5959","Type":"ContainerDied","Data":"f5d041ba8f220eac03f001ff054b06076a74edb95ab54a1deac06d85ac7da901"}
Mar 13 15:08:57 crc kubenswrapper[4786]: I0313 15:08:57.301065 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5d041ba8f220eac03f001ff054b06076a74edb95ab54a1deac06d85ac7da901"
Mar 13 15:08:57 crc kubenswrapper[4786]: I0313 15:08:57.301103 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-ckrmj"
Mar 13 15:09:01 crc kubenswrapper[4786]: I0313 15:09:01.511799 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.081818 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vbf74"]
Mar 13 15:09:35 crc kubenswrapper[4786]: E0313 15:09:35.082660 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73373245-c6fb-46a6-855e-321909bf5959" containerName="oc"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.082676 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="73373245-c6fb-46a6-855e-321909bf5959" containerName="oc"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.082809 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="73373245-c6fb-46a6-855e-321909bf5959" containerName="oc"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.083330 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.103257 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vbf74"]
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.246603 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.246762 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-bound-sa-token\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.246791 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6439de8-5841-4f33-bd33-ad7ebf393def-trusted-ca\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.246840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6439de8-5841-4f33-bd33-ad7ebf393def-registry-certificates\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.246939 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-registry-tls\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.246986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6439de8-5841-4f33-bd33-ad7ebf393def-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.247018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wczx\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-kube-api-access-8wczx\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.247068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6439de8-5841-4f33-bd33-ad7ebf393def-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.293411 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348215 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-registry-tls\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348256 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6439de8-5841-4f33-bd33-ad7ebf393def-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348280 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wczx\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-kube-api-access-8wczx\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348304 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6439de8-5841-4f33-bd33-ad7ebf393def-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348350 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-bound-sa-token\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348366 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6439de8-5841-4f33-bd33-ad7ebf393def-trusted-ca\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.348384 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6439de8-5841-4f33-bd33-ad7ebf393def-registry-certificates\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.349344 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6439de8-5841-4f33-bd33-ad7ebf393def-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.350047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6439de8-5841-4f33-bd33-ad7ebf393def-registry-certificates\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.351663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6439de8-5841-4f33-bd33-ad7ebf393def-trusted-ca\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.354371 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-registry-tls\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.363980 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6439de8-5841-4f33-bd33-ad7ebf393def-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.366073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-bound-sa-token\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.366475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wczx\" (UniqueName: \"kubernetes.io/projected/d6439de8-5841-4f33-bd33-ad7ebf393def-kube-api-access-8wczx\") pod \"image-registry-66df7c8f76-vbf74\" (UID: \"d6439de8-5841-4f33-bd33-ad7ebf393def\") " pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.405883 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:35 crc kubenswrapper[4786]: I0313 15:09:35.891493 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vbf74"]
Mar 13 15:09:36 crc kubenswrapper[4786]: I0313 15:09:36.610006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74" event={"ID":"d6439de8-5841-4f33-bd33-ad7ebf393def","Type":"ContainerStarted","Data":"098705de346aa6e205ce7c0b79596a7604b9368afe285f37312e557945dc922a"}
Mar 13 15:09:36 crc kubenswrapper[4786]: I0313 15:09:36.610614 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:36 crc kubenswrapper[4786]: I0313 15:09:36.610645 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74" event={"ID":"d6439de8-5841-4f33-bd33-ad7ebf393def","Type":"ContainerStarted","Data":"587a50729ac7f8f8c889b6f43a8e6d614807e20315c668c44b6119a2205b8635"}
Mar 13 15:09:36 crc kubenswrapper[4786]: I0313 15:09:36.630406 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74" podStartSLOduration=1.630384513 podStartE2EDuration="1.630384513s" podCreationTimestamp="2026-03-13 15:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:09:36.627582475 +0000 UTC m=+406.790794296" watchObservedRunningTime="2026-03-13 15:09:36.630384513 +0000 UTC m=+406.793596324"
Mar 13 15:09:44 crc kubenswrapper[4786]: I0313 15:09:44.440123 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggggh"]
Mar 13 15:09:44 crc kubenswrapper[4786]: I0313 15:09:44.441694 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ggggh" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="registry-server" containerID="cri-o://d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8" gracePeriod=2
Mar 13 15:09:44 crc kubenswrapper[4786]: I0313 15:09:44.657506 4786 generic.go:334] "Generic (PLEG): container finished" podID="b5282ab3-536a-405e-93af-c9c16130ec87" containerID="d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8" exitCode=0
Mar 13 15:09:44 crc kubenswrapper[4786]: I0313 15:09:44.657597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggggh" event={"ID":"b5282ab3-536a-405e-93af-c9c16130ec87","Type":"ContainerDied","Data":"d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8"}
Mar 13 15:09:44 crc kubenswrapper[4786]: E0313 15:09:44.733404 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8 is running failed: container process not found" containerID="d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8" cmd=["grpc_health_probe","-addr=:50051"]
Mar 13 15:09:44 crc kubenswrapper[4786]: E0313 15:09:44.734115 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8 is running failed: container process not found" containerID="d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8" cmd=["grpc_health_probe","-addr=:50051"]
Mar 13 15:09:44 crc kubenswrapper[4786]: E0313 15:09:44.734723 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8 is running failed: container process not found" containerID="d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8" cmd=["grpc_health_probe","-addr=:50051"]
Mar 13 15:09:44 crc kubenswrapper[4786]: E0313 15:09:44.734794 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-ggggh" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="registry-server"
Mar 13 15:09:44 crc kubenswrapper[4786]: I0313 15:09:44.941025 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggggh"
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.085099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-utilities\") pod \"b5282ab3-536a-405e-93af-c9c16130ec87\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") "
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.085428 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ptld\" (UniqueName: \"kubernetes.io/projected/b5282ab3-536a-405e-93af-c9c16130ec87-kube-api-access-6ptld\") pod \"b5282ab3-536a-405e-93af-c9c16130ec87\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") "
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.085579 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-catalog-content\") pod \"b5282ab3-536a-405e-93af-c9c16130ec87\" (UID: \"b5282ab3-536a-405e-93af-c9c16130ec87\") "
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.087216 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-utilities" (OuterVolumeSpecName: "utilities") pod "b5282ab3-536a-405e-93af-c9c16130ec87" (UID: "b5282ab3-536a-405e-93af-c9c16130ec87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.092201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5282ab3-536a-405e-93af-c9c16130ec87-kube-api-access-6ptld" (OuterVolumeSpecName: "kube-api-access-6ptld") pod "b5282ab3-536a-405e-93af-c9c16130ec87" (UID: "b5282ab3-536a-405e-93af-c9c16130ec87"). InnerVolumeSpecName "kube-api-access-6ptld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.187712 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.187762 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ptld\" (UniqueName: \"kubernetes.io/projected/b5282ab3-536a-405e-93af-c9c16130ec87-kube-api-access-6ptld\") on node \"crc\" DevicePath \"\""
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.245238 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5282ab3-536a-405e-93af-c9c16130ec87" (UID: "b5282ab3-536a-405e-93af-c9c16130ec87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.289273 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5282ab3-536a-405e-93af-c9c16130ec87-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.665662 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggggh" event={"ID":"b5282ab3-536a-405e-93af-c9c16130ec87","Type":"ContainerDied","Data":"3c9798dd8adcb6bf614170bb1a84f4cb4749a884055a5a23f8766bfbde8e18d2"}
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.665725 4786 scope.go:117] "RemoveContainer" containerID="d95f6a14e5c76402efe34342e585c173064313b403308837b5ede953749e88b8"
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.665761 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggggh"
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.688793 4786 scope.go:117] "RemoveContainer" containerID="aebadc150a1750ddbb6f1719b8d3bd55e1abc71c0d4b08b7e82129aa424d53f9"
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.697356 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggggh"]
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.705441 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ggggh"]
Mar 13 15:09:45 crc kubenswrapper[4786]: I0313 15:09:45.722429 4786 scope.go:117] "RemoveContainer" containerID="4cfddcb622682d8d31cd8af7407c391ba1c9f1c1ccd3df90a45ae38632f2cc1e"
Mar 13 15:09:46 crc kubenswrapper[4786]: I0313 15:09:46.559299 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" path="/var/lib/kubelet/pods/b5282ab3-536a-405e-93af-c9c16130ec87/volumes"
Mar 13 15:09:55 crc kubenswrapper[4786]: I0313 15:09:55.418695 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vbf74"
Mar 13 15:09:55 crc kubenswrapper[4786]: I0313 15:09:55.496645 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ffbml"]
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.941338 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dkvg"]
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.942286 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2dkvg" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="registry-server" containerID="cri-o://98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2" gracePeriod=30
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.948746 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbj4r"]
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.949083 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kbj4r" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="registry-server" containerID="cri-o://17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4" gracePeriod=30
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.954678 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wcljz"]
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.957676 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" containerID="cri-o://a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f" gracePeriod=30
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.973270 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v9g8"]
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.973593 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5v9g8" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="registry-server" containerID="cri-o://0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6" gracePeriod=30
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.978042 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8tbs9"]
Mar 13 15:09:58 crc kubenswrapper[4786]: E0313 15:09:58.978284 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="registry-server"
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.978301 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="registry-server"
Mar 13 15:09:58 crc kubenswrapper[4786]: E0313 15:09:58.978317 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="extract-utilities"
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.978329 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="extract-utilities"
Mar 13 15:09:58 crc kubenswrapper[4786]: E0313 15:09:58.978345 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="extract-content"
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.978353 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="extract-content"
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.978491 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5282ab3-536a-405e-93af-c9c16130ec87" containerName="registry-server"
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.978942 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.986400 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q84vs"]
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.986603 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q84vs" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="registry-server" containerID="cri-o://2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b" gracePeriod=30
Mar 13 15:09:58 crc kubenswrapper[4786]: I0313 15:09:58.991749 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8tbs9"]
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.117641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gvf\" (UniqueName: \"kubernetes.io/projected/37854105-dd2c-4a53-9a0e-45813f321114-kube-api-access-66gvf\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.117705 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37854105-dd2c-4a53-9a0e-45813f321114-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.117755 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/37854105-dd2c-4a53-9a0e-45813f321114-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.219464 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/37854105-dd2c-4a53-9a0e-45813f321114-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.220138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gvf\" (UniqueName: \"kubernetes.io/projected/37854105-dd2c-4a53-9a0e-45813f321114-kube-api-access-66gvf\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.220186 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37854105-dd2c-4a53-9a0e-45813f321114-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.221276 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37854105-dd2c-4a53-9a0e-45813f321114-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.227192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/37854105-dd2c-4a53-9a0e-45813f321114-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.242022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gvf\" (UniqueName: \"kubernetes.io/projected/37854105-dd2c-4a53-9a0e-45813f321114-kube-api-access-66gvf\") pod \"marketplace-operator-79b997595-8tbs9\" (UID: \"37854105-dd2c-4a53-9a0e-45813f321114\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.376494 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.386733 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbj4r"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.507055 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v9g8"
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.540344 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-utilities\") pod \"4bcce315-5828-4a7c-870f-6dd6518af3dd\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") "
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.540404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-catalog-content\") pod \"4bcce315-5828-4a7c-870f-6dd6518af3dd\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") "
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.540444 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np77p\" (UniqueName: \"kubernetes.io/projected/4bcce315-5828-4a7c-870f-6dd6518af3dd-kube-api-access-np77p\") pod \"4bcce315-5828-4a7c-870f-6dd6518af3dd\" (UID: \"4bcce315-5828-4a7c-870f-6dd6518af3dd\") "
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.540483 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-utilities\") pod \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") "
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.540506 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdlwr\" (UniqueName: \"kubernetes.io/projected/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-kube-api-access-xdlwr\") pod \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") "
Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.540557 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-catalog-content\") pod \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\" (UID: \"8e627f8c-63e5-4b85-9fff-0c205c96d0a4\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.543182 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-utilities" (OuterVolumeSpecName: "utilities") pod "8e627f8c-63e5-4b85-9fff-0c205c96d0a4" (UID: "8e627f8c-63e5-4b85-9fff-0c205c96d0a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.544591 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-utilities" (OuterVolumeSpecName: "utilities") pod "4bcce315-5828-4a7c-870f-6dd6518af3dd" (UID: "4bcce315-5828-4a7c-870f-6dd6518af3dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.549895 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-kube-api-access-xdlwr" (OuterVolumeSpecName: "kube-api-access-xdlwr") pod "8e627f8c-63e5-4b85-9fff-0c205c96d0a4" (UID: "8e627f8c-63e5-4b85-9fff-0c205c96d0a4"). InnerVolumeSpecName "kube-api-access-xdlwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.554040 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bcce315-5828-4a7c-870f-6dd6518af3dd-kube-api-access-np77p" (OuterVolumeSpecName: "kube-api-access-np77p") pod "4bcce315-5828-4a7c-870f-6dd6518af3dd" (UID: "4bcce315-5828-4a7c-870f-6dd6518af3dd"). InnerVolumeSpecName "kube-api-access-np77p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.566017 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.569577 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.572619 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.603170 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e627f8c-63e5-4b85-9fff-0c205c96d0a4" (UID: "8e627f8c-63e5-4b85-9fff-0c205c96d0a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.634611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bcce315-5828-4a7c-870f-6dd6518af3dd" (UID: "4bcce315-5828-4a7c-870f-6dd6518af3dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.641720 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.641812 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np77p\" (UniqueName: \"kubernetes.io/projected/4bcce315-5828-4a7c-870f-6dd6518af3dd-kube-api-access-np77p\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.641829 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.641845 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdlwr\" (UniqueName: \"kubernetes.io/projected/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-kube-api-access-xdlwr\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.641915 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e627f8c-63e5-4b85-9fff-0c205c96d0a4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.641997 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bcce315-5828-4a7c-870f-6dd6518af3dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743346 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-trusted-ca\") pod \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\" 
(UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fs9f\" (UniqueName: \"kubernetes.io/projected/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-kube-api-access-4fs9f\") pod \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743547 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-utilities\") pod \"44d91c62-557f-40c8-a725-33ff965bee1b\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743684 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-utilities\") pod \"73d30051-975a-4329-8d7b-32d297b35218\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-operator-metrics\") pod \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\" (UID: \"cc2927b9-2e8a-4a34-90e4-932c1f6115c3\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743812 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j679r\" (UniqueName: \"kubernetes.io/projected/44d91c62-557f-40c8-a725-33ff965bee1b-kube-api-access-j679r\") pod \"44d91c62-557f-40c8-a725-33ff965bee1b\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743887 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-catalog-content\") pod \"44d91c62-557f-40c8-a725-33ff965bee1b\" (UID: \"44d91c62-557f-40c8-a725-33ff965bee1b\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743935 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmwc\" (UniqueName: \"kubernetes.io/projected/73d30051-975a-4329-8d7b-32d297b35218-kube-api-access-wbmwc\") pod \"73d30051-975a-4329-8d7b-32d297b35218\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.743993 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-catalog-content\") pod \"73d30051-975a-4329-8d7b-32d297b35218\" (UID: \"73d30051-975a-4329-8d7b-32d297b35218\") " Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.744160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cc2927b9-2e8a-4a34-90e4-932c1f6115c3" (UID: "cc2927b9-2e8a-4a34-90e4-932c1f6115c3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.744384 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-utilities" (OuterVolumeSpecName: "utilities") pod "44d91c62-557f-40c8-a725-33ff965bee1b" (UID: "44d91c62-557f-40c8-a725-33ff965bee1b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.744484 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.744980 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-utilities" (OuterVolumeSpecName: "utilities") pod "73d30051-975a-4329-8d7b-32d297b35218" (UID: "73d30051-975a-4329-8d7b-32d297b35218"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.746769 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d91c62-557f-40c8-a725-33ff965bee1b-kube-api-access-j679r" (OuterVolumeSpecName: "kube-api-access-j679r") pod "44d91c62-557f-40c8-a725-33ff965bee1b" (UID: "44d91c62-557f-40c8-a725-33ff965bee1b"). InnerVolumeSpecName "kube-api-access-j679r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.747413 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-kube-api-access-4fs9f" (OuterVolumeSpecName: "kube-api-access-4fs9f") pod "cc2927b9-2e8a-4a34-90e4-932c1f6115c3" (UID: "cc2927b9-2e8a-4a34-90e4-932c1f6115c3"). InnerVolumeSpecName "kube-api-access-4fs9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.748349 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cc2927b9-2e8a-4a34-90e4-932c1f6115c3" (UID: "cc2927b9-2e8a-4a34-90e4-932c1f6115c3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.748520 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d30051-975a-4329-8d7b-32d297b35218-kube-api-access-wbmwc" (OuterVolumeSpecName: "kube-api-access-wbmwc") pod "73d30051-975a-4329-8d7b-32d297b35218" (UID: "73d30051-975a-4329-8d7b-32d297b35218"). InnerVolumeSpecName "kube-api-access-wbmwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.752134 4786 generic.go:334] "Generic (PLEG): container finished" podID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerID="0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6" exitCode=0 Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.752338 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v9g8" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.752761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v9g8" event={"ID":"8e627f8c-63e5-4b85-9fff-0c205c96d0a4","Type":"ContainerDied","Data":"0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.754646 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v9g8" event={"ID":"8e627f8c-63e5-4b85-9fff-0c205c96d0a4","Type":"ContainerDied","Data":"084e8b5790addce5afdc6d136f56b6ba57ce0935e95f206e1dfd7e89245d3ee0"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.754930 4786 scope.go:117] "RemoveContainer" containerID="0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.756748 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerID="17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4" exitCode=0 Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.756808 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbj4r" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.756821 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbj4r" event={"ID":"4bcce315-5828-4a7c-870f-6dd6518af3dd","Type":"ContainerDied","Data":"17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.757050 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbj4r" event={"ID":"4bcce315-5828-4a7c-870f-6dd6518af3dd","Type":"ContainerDied","Data":"ff0abccfe597a5524f1e8a31ec63c7e6f82d9fe08573d94cdee3cb4d42a99ac2"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.760130 4786 generic.go:334] "Generic (PLEG): container finished" podID="44d91c62-557f-40c8-a725-33ff965bee1b" containerID="98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2" exitCode=0 Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.760199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dkvg" event={"ID":"44d91c62-557f-40c8-a725-33ff965bee1b","Type":"ContainerDied","Data":"98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.760223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dkvg" event={"ID":"44d91c62-557f-40c8-a725-33ff965bee1b","Type":"ContainerDied","Data":"2bd7f7a5aad35099e9f2c1130f4e44be0dfb3310f20ab430c9447fec995687c5"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.760352 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dkvg" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.765274 4786 generic.go:334] "Generic (PLEG): container finished" podID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerID="a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f" exitCode=0 Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.765327 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.765642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" event={"ID":"cc2927b9-2e8a-4a34-90e4-932c1f6115c3","Type":"ContainerDied","Data":"a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.765679 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wcljz" event={"ID":"cc2927b9-2e8a-4a34-90e4-932c1f6115c3","Type":"ContainerDied","Data":"65d28073b6acd7b97848fb63e3b4f41f8c6f63c8591254bc37ecc5bf20e5f0a2"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.768752 4786 generic.go:334] "Generic (PLEG): container finished" podID="73d30051-975a-4329-8d7b-32d297b35218" containerID="2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b" exitCode=0 Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.768808 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q84vs" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.768811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerDied","Data":"2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.768838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q84vs" event={"ID":"73d30051-975a-4329-8d7b-32d297b35218","Type":"ContainerDied","Data":"8f53d36f53e486976ee0f94789c2aa607485d031ef1a18d023804d78eade2df4"} Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.782476 4786 scope.go:117] "RemoveContainer" containerID="6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.807521 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbj4r"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.824141 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kbj4r"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.826318 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v9g8"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.831665 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v9g8"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.839377 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wcljz"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.841823 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wcljz"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 
15:09:59.845565 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fs9f\" (UniqueName: \"kubernetes.io/projected/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-kube-api-access-4fs9f\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.845588 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.845598 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.845608 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc2927b9-2e8a-4a34-90e4-932c1f6115c3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.845618 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j679r\" (UniqueName: \"kubernetes.io/projected/44d91c62-557f-40c8-a725-33ff965bee1b-kube-api-access-j679r\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.845626 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbmwc\" (UniqueName: \"kubernetes.io/projected/73d30051-975a-4329-8d7b-32d297b35218-kube-api-access-wbmwc\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.849928 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44d91c62-557f-40c8-a725-33ff965bee1b" (UID: "44d91c62-557f-40c8-a725-33ff965bee1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.855059 4786 scope.go:117] "RemoveContainer" containerID="6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.861469 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8tbs9"] Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.869447 4786 scope.go:117] "RemoveContainer" containerID="0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.869754 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6\": container with ID starting with 0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6 not found: ID does not exist" containerID="0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.869797 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6"} err="failed to get container status \"0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6\": rpc error: code = NotFound desc = could not find container \"0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6\": container with ID starting with 0f5702d9b626f26bfbf995ca42c87faeb1188e800299c6fd877458fa849352c6 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.869829 4786 scope.go:117] "RemoveContainer" containerID="6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.870115 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c\": container with ID starting with 6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c not found: ID does not exist" containerID="6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.870149 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c"} err="failed to get container status \"6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c\": rpc error: code = NotFound desc = could not find container \"6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c\": container with ID starting with 6a593e66b289ae2da9a68399ef5415af6f35ad182a684ac5b4e8a02302c9f19c not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.870178 4786 scope.go:117] "RemoveContainer" containerID="6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.870384 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e\": container with ID starting with 6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e not found: ID does not exist" containerID="6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.870404 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e"} err="failed to get container status \"6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e\": rpc error: code = NotFound desc = could not find container 
\"6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e\": container with ID starting with 6c232eb160a5ba5df93c4cf15cc2a6a6a0ec8063eeeb3bc46f007af8ea92fe9e not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.870421 4786 scope.go:117] "RemoveContainer" containerID="17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.888839 4786 scope.go:117] "RemoveContainer" containerID="ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.906877 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73d30051-975a-4329-8d7b-32d297b35218" (UID: "73d30051-975a-4329-8d7b-32d297b35218"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.907607 4786 scope.go:117] "RemoveContainer" containerID="0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.922728 4786 scope.go:117] "RemoveContainer" containerID="17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.923188 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4\": container with ID starting with 17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4 not found: ID does not exist" containerID="17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.923215 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4"} err="failed to get container status \"17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4\": rpc error: code = NotFound desc = could not find container \"17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4\": container with ID starting with 17281e63bd0e06f6d1e307287b1d3f9c3567315853cb124ca632fea47f38b5b4 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.923238 4786 scope.go:117] "RemoveContainer" containerID="ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.923567 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d\": container with ID starting with ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d not found: ID does not exist" containerID="ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.923605 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d"} err="failed to get container status \"ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d\": rpc error: code = NotFound desc = could not find container \"ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d\": container with ID starting with ae957fcf909016b9302aeda637b8fb9659150bcab2608c5500e8adab926f777d not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.923634 4786 scope.go:117] "RemoveContainer" containerID="0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.923903 4786 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69\": container with ID starting with 0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69 not found: ID does not exist" containerID="0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.923933 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69"} err="failed to get container status \"0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69\": rpc error: code = NotFound desc = could not find container \"0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69\": container with ID starting with 0d8670466f77b363d27169bd6ebaf3976f6c53ddc6ac8586a99f1fad4db22a69 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.923951 4786 scope.go:117] "RemoveContainer" containerID="98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.941156 4786 scope.go:117] "RemoveContainer" containerID="645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.948053 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d91c62-557f-40c8-a725-33ff965bee1b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.948079 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d30051-975a-4329-8d7b-32d297b35218-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.955684 4786 scope.go:117] "RemoveContainer" 
containerID="d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.967369 4786 scope.go:117] "RemoveContainer" containerID="98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.967755 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2\": container with ID starting with 98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2 not found: ID does not exist" containerID="98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.967791 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2"} err="failed to get container status \"98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2\": rpc error: code = NotFound desc = could not find container \"98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2\": container with ID starting with 98f37b7acd5ca1f5e95d31728eecd9a39d1fd2eed1e5634546866d7c723a1cd2 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.967816 4786 scope.go:117] "RemoveContainer" containerID="645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.968388 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0\": container with ID starting with 645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0 not found: ID does not exist" containerID="645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0" Mar 13 15:09:59 crc 
kubenswrapper[4786]: I0313 15:09:59.968427 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0"} err="failed to get container status \"645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0\": rpc error: code = NotFound desc = could not find container \"645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0\": container with ID starting with 645dba85323acbe554e74595ac1fbfd32d7b8565e9be4e57d23315c2be8d8ef0 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.968455 4786 scope.go:117] "RemoveContainer" containerID="d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.968788 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19\": container with ID starting with d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19 not found: ID does not exist" containerID="d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.968895 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19"} err="failed to get container status \"d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19\": rpc error: code = NotFound desc = could not find container \"d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19\": container with ID starting with d64025b6ae0c170556d115e1a01e40770f18a356490e03b7cc633dbb80f22b19 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.968981 4786 scope.go:117] "RemoveContainer" containerID="a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f" Mar 13 
15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.980454 4786 scope.go:117] "RemoveContainer" containerID="13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.992309 4786 scope.go:117] "RemoveContainer" containerID="a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.992800 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f\": container with ID starting with a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f not found: ID does not exist" containerID="a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.992838 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f"} err="failed to get container status \"a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f\": rpc error: code = NotFound desc = could not find container \"a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f\": container with ID starting with a06cc1afa9bbe79d3b62e73f2e82ee444678e375c781b2a6ee8f8a3458eefe8f not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.992886 4786 scope.go:117] "RemoveContainer" containerID="13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7" Mar 13 15:09:59 crc kubenswrapper[4786]: E0313 15:09:59.993184 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7\": container with ID starting with 13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7 not found: ID does not exist" 
containerID="13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.993216 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7"} err="failed to get container status \"13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7\": rpc error: code = NotFound desc = could not find container \"13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7\": container with ID starting with 13ab1d186d6f509a00e0e32f5b0fce07520053facbfc1c8afa9a45a2dadccce7 not found: ID does not exist" Mar 13 15:09:59 crc kubenswrapper[4786]: I0313 15:09:59.993238 4786 scope.go:117] "RemoveContainer" containerID="2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.006957 4786 scope.go:117] "RemoveContainer" containerID="143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.022658 4786 scope.go:117] "RemoveContainer" containerID="2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.036579 4786 scope.go:117] "RemoveContainer" containerID="2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.037061 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b\": container with ID starting with 2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b not found: ID does not exist" containerID="2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.037230 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b"} err="failed to get container status \"2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b\": rpc error: code = NotFound desc = could not find container \"2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b\": container with ID starting with 2bbd6a71efadf1a1824c67880d529874b11be8a6e530ea8a65992d1df7fdf14b not found: ID does not exist" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.037255 4786 scope.go:117] "RemoveContainer" containerID="143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.037495 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583\": container with ID starting with 143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583 not found: ID does not exist" containerID="143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.037657 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583"} err="failed to get container status \"143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583\": rpc error: code = NotFound desc = could not find container \"143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583\": container with ID starting with 143f76d5143a53d25c5693bd693fdade069f1d9a9a963cdd464946b218f72583 not found: ID does not exist" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.037671 4786 scope.go:117] "RemoveContainer" containerID="2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.037995 4786 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3\": container with ID starting with 2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3 not found: ID does not exist" containerID="2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.038013 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3"} err="failed to get container status \"2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3\": rpc error: code = NotFound desc = could not find container \"2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3\": container with ID starting with 2e62fdff0ad89cb039bab9065fb104e688fc3dc0fbacc4f0659c726616cd4ad3 not found: ID does not exist" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.089846 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dkvg"] Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.095197 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2dkvg"] Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.100337 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q84vs"] Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.104223 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q84vs"] Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132234 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556910-6976f"] Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132468 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" 
containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132483 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132495 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132502 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132516 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132523 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132536 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132543 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132554 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132560 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132574 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" 
containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132581 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132589 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132615 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132624 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132631 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132640 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132647 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132660 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132666 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132675 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132683 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132695 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132702 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: E0313 15:10:00.132712 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132719 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132837 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132896 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132906 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132913 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d30051-975a-4329-8d7b-32d297b35218" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.132924 4786 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.133359 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.135343 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.137135 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.138299 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.140431 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-6976f"] Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.150963 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmgz\" (UniqueName: \"kubernetes.io/projected/fc69663e-8a65-4de4-972f-92b8abe2f715-kube-api-access-6jmgz\") pod \"auto-csr-approver-29556910-6976f\" (UID: \"fc69663e-8a65-4de4-972f-92b8abe2f715\") " pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.252394 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmgz\" (UniqueName: \"kubernetes.io/projected/fc69663e-8a65-4de4-972f-92b8abe2f715-kube-api-access-6jmgz\") pod \"auto-csr-approver-29556910-6976f\" (UID: \"fc69663e-8a65-4de4-972f-92b8abe2f715\") " pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.281736 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6jmgz\" (UniqueName: \"kubernetes.io/projected/fc69663e-8a65-4de4-972f-92b8abe2f715-kube-api-access-6jmgz\") pod \"auto-csr-approver-29556910-6976f\" (UID: \"fc69663e-8a65-4de4-972f-92b8abe2f715\") " pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.468404 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.588594 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d91c62-557f-40c8-a725-33ff965bee1b" path="/var/lib/kubelet/pods/44d91c62-557f-40c8-a725-33ff965bee1b/volumes" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.599893 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bcce315-5828-4a7c-870f-6dd6518af3dd" path="/var/lib/kubelet/pods/4bcce315-5828-4a7c-870f-6dd6518af3dd/volumes" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.601108 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d30051-975a-4329-8d7b-32d297b35218" path="/var/lib/kubelet/pods/73d30051-975a-4329-8d7b-32d297b35218/volumes" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.602345 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e627f8c-63e5-4b85-9fff-0c205c96d0a4" path="/var/lib/kubelet/pods/8e627f8c-63e5-4b85-9fff-0c205c96d0a4/volumes" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.604453 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" path="/var/lib/kubelet/pods/cc2927b9-2e8a-4a34-90e4-932c1f6115c3/volumes" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.781738 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9" 
event={"ID":"37854105-dd2c-4a53-9a0e-45813f321114","Type":"ContainerStarted","Data":"f3ef64a575f99df64a39f705ceb45703f65538d40c2d156c3948d31c73757c40"} Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.781782 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9" event={"ID":"37854105-dd2c-4a53-9a0e-45813f321114","Type":"ContainerStarted","Data":"26a4206c7e69d205c97b20c226bffaab91fe748ca0e66742ce521578a1edf406"} Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.782449 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.787807 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.797187 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8tbs9" podStartSLOduration=2.797171 podStartE2EDuration="2.797171s" podCreationTimestamp="2026-03-13 15:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:10:00.796017173 +0000 UTC m=+430.959229004" watchObservedRunningTime="2026-03-13 15:10:00.797171 +0000 UTC m=+430.960382811" Mar 13 15:10:00 crc kubenswrapper[4786]: I0313 15:10:00.939194 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-6976f"] Mar 13 15:10:00 crc kubenswrapper[4786]: W0313 15:10:00.947095 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc69663e_8a65_4de4_972f_92b8abe2f715.slice/crio-565b88666bd8c65bd760975b1e8c1c295a715fffa5c066c3187044bba53e1c9d WatchSource:0}: Error finding container 
565b88666bd8c65bd760975b1e8c1c295a715fffa5c066c3187044bba53e1c9d: Status 404 returned error can't find the container with id 565b88666bd8c65bd760975b1e8c1c295a715fffa5c066c3187044bba53e1c9d Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.166580 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vg8fj"] Mar 13 15:10:01 crc kubenswrapper[4786]: E0313 15:10:01.167238 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.167259 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.167428 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2927b9-2e8a-4a34-90e4-932c1f6115c3" containerName="marketplace-operator" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.168675 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.170737 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.175720 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vg8fj"] Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.314782 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-utilities\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.314894 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhkp\" (UniqueName: \"kubernetes.io/projected/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-kube-api-access-tkhkp\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.314928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-catalog-content\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.355777 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dstmd"] Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.358621 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.363806 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.368406 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dstmd"] Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.416269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhkp\" (UniqueName: \"kubernetes.io/projected/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-kube-api-access-tkhkp\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.416442 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-catalog-content\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.417211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-catalog-content\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.417557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-utilities\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " 
pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.418006 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-utilities\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.440159 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhkp\" (UniqueName: \"kubernetes.io/projected/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-kube-api-access-tkhkp\") pod \"certified-operators-vg8fj\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.503329 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.519066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c65dc28-4b5f-4159-af2e-83b4bffec120-catalog-content\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.519123 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q2d\" (UniqueName: \"kubernetes.io/projected/5c65dc28-4b5f-4159-af2e-83b4bffec120-kube-api-access-c5q2d\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.519160 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c65dc28-4b5f-4159-af2e-83b4bffec120-utilities\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.624082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c65dc28-4b5f-4159-af2e-83b4bffec120-catalog-content\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.624134 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q2d\" (UniqueName: \"kubernetes.io/projected/5c65dc28-4b5f-4159-af2e-83b4bffec120-kube-api-access-c5q2d\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.624164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c65dc28-4b5f-4159-af2e-83b4bffec120-utilities\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.624704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c65dc28-4b5f-4159-af2e-83b4bffec120-catalog-content\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.624824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c65dc28-4b5f-4159-af2e-83b4bffec120-utilities\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.648567 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5q2d\" (UniqueName: \"kubernetes.io/projected/5c65dc28-4b5f-4159-af2e-83b4bffec120-kube-api-access-c5q2d\") pod \"redhat-operators-dstmd\" (UID: \"5c65dc28-4b5f-4159-af2e-83b4bffec120\") " pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.677073 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.791916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-6976f" event={"ID":"fc69663e-8a65-4de4-972f-92b8abe2f715","Type":"ContainerStarted","Data":"565b88666bd8c65bd760975b1e8c1c295a715fffa5c066c3187044bba53e1c9d"} Mar 13 15:10:01 crc kubenswrapper[4786]: I0313 15:10:01.952187 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vg8fj"] Mar 13 15:10:01 crc kubenswrapper[4786]: W0313 15:10:01.960447 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65fe5c1c_407e_4a94_b6d0_a4ac3c22bc43.slice/crio-ca9538c0c2b1d376dda1e0441fd882e36aea882df868be102f125b0d1032d34b WatchSource:0}: Error finding container ca9538c0c2b1d376dda1e0441fd882e36aea882df868be102f125b0d1032d34b: Status 404 returned error can't find the container with id ca9538c0c2b1d376dda1e0441fd882e36aea882df868be102f125b0d1032d34b Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.117938 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-dstmd"] Mar 13 15:10:02 crc kubenswrapper[4786]: W0313 15:10:02.123733 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c65dc28_4b5f_4159_af2e_83b4bffec120.slice/crio-746deacdb45d742123d734b0a151c5ebd8f50d7f08d4eba5cdca8b114f0ca95f WatchSource:0}: Error finding container 746deacdb45d742123d734b0a151c5ebd8f50d7f08d4eba5cdca8b114f0ca95f: Status 404 returned error can't find the container with id 746deacdb45d742123d734b0a151c5ebd8f50d7f08d4eba5cdca8b114f0ca95f Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.799501 4786 generic.go:334] "Generic (PLEG): container finished" podID="5c65dc28-4b5f-4159-af2e-83b4bffec120" containerID="ad8e0d3747e6201c6806931a4fddd35d768a989fdeaf817f7e0e00e5244b9988" exitCode=0 Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.799628 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dstmd" event={"ID":"5c65dc28-4b5f-4159-af2e-83b4bffec120","Type":"ContainerDied","Data":"ad8e0d3747e6201c6806931a4fddd35d768a989fdeaf817f7e0e00e5244b9988"} Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.801212 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dstmd" event={"ID":"5c65dc28-4b5f-4159-af2e-83b4bffec120","Type":"ContainerStarted","Data":"746deacdb45d742123d734b0a151c5ebd8f50d7f08d4eba5cdca8b114f0ca95f"} Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.804553 4786 generic.go:334] "Generic (PLEG): container finished" podID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerID="34fdc541a7a79c90770cba93b9801dcb29ff8adfd13e3b299247e1a7d5649146" exitCode=0 Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.804683 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" 
event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerDied","Data":"34fdc541a7a79c90770cba93b9801dcb29ff8adfd13e3b299247e1a7d5649146"} Mar 13 15:10:02 crc kubenswrapper[4786]: I0313 15:10:02.805551 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerStarted","Data":"ca9538c0c2b1d376dda1e0441fd882e36aea882df868be102f125b0d1032d34b"} Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.550092 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b45zm"] Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.552165 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.556337 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.564032 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b45zm"] Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.662550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ktq4\" (UniqueName: \"kubernetes.io/projected/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-kube-api-access-6ktq4\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.662939 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-utilities\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " 
pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.663624 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-catalog-content\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.754605 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qwfx9"] Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.756206 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.765635 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktq4\" (UniqueName: \"kubernetes.io/projected/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-kube-api-access-6ktq4\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.766369 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-utilities\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.766536 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-catalog-content\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " 
pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.767369 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-utilities\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.767654 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-catalog-content\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.768340 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwfx9"] Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.772631 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.825322 4786 generic.go:334] "Generic (PLEG): container finished" podID="fc69663e-8a65-4de4-972f-92b8abe2f715" containerID="01af5470493edc97e626069e9859171986db6f4535e71cd638f9fbf5158bf999" exitCode=0 Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.825630 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-6976f" event={"ID":"fc69663e-8a65-4de4-972f-92b8abe2f715","Type":"ContainerDied","Data":"01af5470493edc97e626069e9859171986db6f4535e71cd638f9fbf5158bf999"} Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.831774 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ktq4\" (UniqueName: 
\"kubernetes.io/projected/c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6-kube-api-access-6ktq4\") pod \"community-operators-b45zm\" (UID: \"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6\") " pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.867132 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a298d0-6d11-47c4-9438-488c016e3d49-utilities\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.867185 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlk4c\" (UniqueName: \"kubernetes.io/projected/04a298d0-6d11-47c4-9438-488c016e3d49-kube-api-access-jlk4c\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.867217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a298d0-6d11-47c4-9438-488c016e3d49-catalog-content\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.876875 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.968307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a298d0-6d11-47c4-9438-488c016e3d49-utilities\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.968361 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlk4c\" (UniqueName: \"kubernetes.io/projected/04a298d0-6d11-47c4-9438-488c016e3d49-kube-api-access-jlk4c\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.968384 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a298d0-6d11-47c4-9438-488c016e3d49-catalog-content\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.968762 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a298d0-6d11-47c4-9438-488c016e3d49-catalog-content\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.968759 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a298d0-6d11-47c4-9438-488c016e3d49-utilities\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " 
pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:03 crc kubenswrapper[4786]: I0313 15:10:03.989823 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlk4c\" (UniqueName: \"kubernetes.io/projected/04a298d0-6d11-47c4-9438-488c016e3d49-kube-api-access-jlk4c\") pod \"redhat-marketplace-qwfx9\" (UID: \"04a298d0-6d11-47c4-9438-488c016e3d49\") " pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.072397 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.247617 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b45zm"] Mar 13 15:10:04 crc kubenswrapper[4786]: W0313 15:10:04.256803 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67bb9c5_b78b_4824_b2e3_c95e51c6c8c6.slice/crio-6200b5abe65705d679dfff13efb2d37dc8dcaaf2b8d47678589291310a71e97e WatchSource:0}: Error finding container 6200b5abe65705d679dfff13efb2d37dc8dcaaf2b8d47678589291310a71e97e: Status 404 returned error can't find the container with id 6200b5abe65705d679dfff13efb2d37dc8dcaaf2b8d47678589291310a71e97e Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.486260 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwfx9"] Mar 13 15:10:04 crc kubenswrapper[4786]: W0313 15:10:04.493070 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a298d0_6d11_47c4_9438_488c016e3d49.slice/crio-52a832716b52980bf7a5453fce4eed37f69cd45d3da3ea6c3f075cc7621db12a WatchSource:0}: Error finding container 52a832716b52980bf7a5453fce4eed37f69cd45d3da3ea6c3f075cc7621db12a: Status 404 returned error can't find the container with 
id 52a832716b52980bf7a5453fce4eed37f69cd45d3da3ea6c3f075cc7621db12a Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.833904 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerStarted","Data":"9c17993f35b58bab8b9b7c7389c2313629098c47d67dc5dc95d8703cd47e62ad"} Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.835815 4786 generic.go:334] "Generic (PLEG): container finished" podID="c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6" containerID="2471cefd494f482b6084bfc608d11db1e7319c48f90e156f21c321c111eee3b9" exitCode=0 Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.835887 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b45zm" event={"ID":"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6","Type":"ContainerDied","Data":"2471cefd494f482b6084bfc608d11db1e7319c48f90e156f21c321c111eee3b9"} Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.835912 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b45zm" event={"ID":"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6","Type":"ContainerStarted","Data":"6200b5abe65705d679dfff13efb2d37dc8dcaaf2b8d47678589291310a71e97e"} Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.841364 4786 generic.go:334] "Generic (PLEG): container finished" podID="04a298d0-6d11-47c4-9438-488c016e3d49" containerID="1a95ebbb147a8691658e78c7827fe45e046ea22eddd26161e566e6ec6fd1a0e0" exitCode=0 Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.841428 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwfx9" event={"ID":"04a298d0-6d11-47c4-9438-488c016e3d49","Type":"ContainerDied","Data":"1a95ebbb147a8691658e78c7827fe45e046ea22eddd26161e566e6ec6fd1a0e0"} Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.841454 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qwfx9" event={"ID":"04a298d0-6d11-47c4-9438-488c016e3d49","Type":"ContainerStarted","Data":"52a832716b52980bf7a5453fce4eed37f69cd45d3da3ea6c3f075cc7621db12a"} Mar 13 15:10:04 crc kubenswrapper[4786]: I0313 15:10:04.843658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dstmd" event={"ID":"5c65dc28-4b5f-4159-af2e-83b4bffec120","Type":"ContainerStarted","Data":"5384a9f773522522d82cd6e20cf88b7031d22b39197dc86f94f33db94072faf3"} Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.413951 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.443371 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jmgz\" (UniqueName: \"kubernetes.io/projected/fc69663e-8a65-4de4-972f-92b8abe2f715-kube-api-access-6jmgz\") pod \"fc69663e-8a65-4de4-972f-92b8abe2f715\" (UID: \"fc69663e-8a65-4de4-972f-92b8abe2f715\") " Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.458242 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc69663e-8a65-4de4-972f-92b8abe2f715-kube-api-access-6jmgz" (OuterVolumeSpecName: "kube-api-access-6jmgz") pod "fc69663e-8a65-4de4-972f-92b8abe2f715" (UID: "fc69663e-8a65-4de4-972f-92b8abe2f715"). InnerVolumeSpecName "kube-api-access-6jmgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.546461 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jmgz\" (UniqueName: \"kubernetes.io/projected/fc69663e-8a65-4de4-972f-92b8abe2f715-kube-api-access-6jmgz\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.853871 4786 generic.go:334] "Generic (PLEG): container finished" podID="5c65dc28-4b5f-4159-af2e-83b4bffec120" containerID="5384a9f773522522d82cd6e20cf88b7031d22b39197dc86f94f33db94072faf3" exitCode=0 Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.853935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dstmd" event={"ID":"5c65dc28-4b5f-4159-af2e-83b4bffec120","Type":"ContainerDied","Data":"5384a9f773522522d82cd6e20cf88b7031d22b39197dc86f94f33db94072faf3"} Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.860731 4786 generic.go:334] "Generic (PLEG): container finished" podID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerID="9c17993f35b58bab8b9b7c7389c2313629098c47d67dc5dc95d8703cd47e62ad" exitCode=0 Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.860869 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerDied","Data":"9c17993f35b58bab8b9b7c7389c2313629098c47d67dc5dc95d8703cd47e62ad"} Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.867320 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-6976f" event={"ID":"fc69663e-8a65-4de4-972f-92b8abe2f715","Type":"ContainerDied","Data":"565b88666bd8c65bd760975b1e8c1c295a715fffa5c066c3187044bba53e1c9d"} Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.867349 4786 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="565b88666bd8c65bd760975b1e8c1c295a715fffa5c066c3187044bba53e1c9d" Mar 13 15:10:05 crc kubenswrapper[4786]: I0313 15:10:05.867410 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-6976f" Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.874074 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dstmd" event={"ID":"5c65dc28-4b5f-4159-af2e-83b4bffec120","Type":"ContainerStarted","Data":"1b51da4763c02f25d599f6dc520d2ab50eeaef907d3622210d56a718bcd746a7"} Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.875824 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerStarted","Data":"5d0ece2f96c0f5c52230c9499b4a43daae11a80e1afef5aa38276c8e95800c3b"} Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.877191 4786 generic.go:334] "Generic (PLEG): container finished" podID="c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6" containerID="578e078fb9dc7058be2e6d83fcb4e9c238c8fafe8b3d58ec509bfc6c62da8c81" exitCode=0 Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.877243 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b45zm" event={"ID":"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6","Type":"ContainerDied","Data":"578e078fb9dc7058be2e6d83fcb4e9c238c8fafe8b3d58ec509bfc6c62da8c81"} Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.878812 4786 generic.go:334] "Generic (PLEG): container finished" podID="04a298d0-6d11-47c4-9438-488c016e3d49" containerID="ed59cbe7cff647c7e76c9cfb0ba14f64e7e4afead1e63ac3ba6efce4d75211b6" exitCode=0 Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.878834 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwfx9" 
event={"ID":"04a298d0-6d11-47c4-9438-488c016e3d49","Type":"ContainerDied","Data":"ed59cbe7cff647c7e76c9cfb0ba14f64e7e4afead1e63ac3ba6efce4d75211b6"} Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.892934 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dstmd" podStartSLOduration=2.397338337 podStartE2EDuration="5.892920786s" podCreationTimestamp="2026-03-13 15:10:01 +0000 UTC" firstStartedPulling="2026-03-13 15:10:02.803667579 +0000 UTC m=+432.966879390" lastFinishedPulling="2026-03-13 15:10:06.299250028 +0000 UTC m=+436.462461839" observedRunningTime="2026-03-13 15:10:06.890254522 +0000 UTC m=+437.053466333" watchObservedRunningTime="2026-03-13 15:10:06.892920786 +0000 UTC m=+437.056132597" Mar 13 15:10:06 crc kubenswrapper[4786]: I0313 15:10:06.949264 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vg8fj" podStartSLOduration=2.225817616 podStartE2EDuration="5.949249028s" podCreationTimestamp="2026-03-13 15:10:01 +0000 UTC" firstStartedPulling="2026-03-13 15:10:02.807783347 +0000 UTC m=+432.970995158" lastFinishedPulling="2026-03-13 15:10:06.531214759 +0000 UTC m=+436.694426570" observedRunningTime="2026-03-13 15:10:06.947775013 +0000 UTC m=+437.110986814" watchObservedRunningTime="2026-03-13 15:10:06.949249028 +0000 UTC m=+437.112460839" Mar 13 15:10:07 crc kubenswrapper[4786]: I0313 15:10:07.868376 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:10:07 crc kubenswrapper[4786]: I0313 15:10:07.868707 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:10:07 crc kubenswrapper[4786]: I0313 15:10:07.885241 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b45zm" event={"ID":"c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6","Type":"ContainerStarted","Data":"cd0ccf5579e312e750a31a2fa0f84fc35106ac22b7502aec65c49d2c83554b4d"} Mar 13 15:10:07 crc kubenswrapper[4786]: I0313 15:10:07.888341 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwfx9" event={"ID":"04a298d0-6d11-47c4-9438-488c016e3d49","Type":"ContainerStarted","Data":"3d10d5fd22a7ec801fa11d9c0e5c8cd6e0ca316035d9c9ac95c07a5fd47cb671"} Mar 13 15:10:07 crc kubenswrapper[4786]: I0313 15:10:07.933963 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b45zm" podStartSLOduration=2.4658914530000002 podStartE2EDuration="4.933941836s" podCreationTimestamp="2026-03-13 15:10:03 +0000 UTC" firstStartedPulling="2026-03-13 15:10:04.839732776 +0000 UTC m=+435.002944587" lastFinishedPulling="2026-03-13 15:10:07.307783159 +0000 UTC m=+437.470994970" observedRunningTime="2026-03-13 15:10:07.913782882 +0000 UTC m=+438.076994693" watchObservedRunningTime="2026-03-13 15:10:07.933941836 +0000 UTC m=+438.097153667" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.503457 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.503868 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.553721 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.576173 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qwfx9" podStartSLOduration=5.962105388 podStartE2EDuration="8.576151557s" podCreationTimestamp="2026-03-13 15:10:03 +0000 UTC" firstStartedPulling="2026-03-13 15:10:04.843015935 +0000 UTC m=+435.006227746" lastFinishedPulling="2026-03-13 15:10:07.457062104 +0000 UTC m=+437.620273915" observedRunningTime="2026-03-13 15:10:07.936355684 +0000 UTC m=+438.099567515" watchObservedRunningTime="2026-03-13 15:10:11.576151557 +0000 UTC m=+441.739363378" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.678147 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.678399 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:11 crc kubenswrapper[4786]: I0313 15:10:11.976153 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 15:10:12 crc kubenswrapper[4786]: I0313 15:10:12.721307 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dstmd" podUID="5c65dc28-4b5f-4159-af2e-83b4bffec120" containerName="registry-server" probeResult="failure" output=< Mar 13 15:10:12 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 15:10:12 crc kubenswrapper[4786]: > Mar 13 15:10:13 crc kubenswrapper[4786]: I0313 15:10:13.877613 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:13 crc kubenswrapper[4786]: I0313 15:10:13.877726 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:13 crc kubenswrapper[4786]: I0313 15:10:13.932520 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:13 crc kubenswrapper[4786]: I0313 15:10:13.992237 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b45zm" Mar 13 15:10:14 crc kubenswrapper[4786]: I0313 15:10:14.072575 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:14 crc kubenswrapper[4786]: I0313 15:10:14.072631 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:14 crc kubenswrapper[4786]: I0313 15:10:14.119794 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:14 crc kubenswrapper[4786]: I0313 15:10:14.978516 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qwfx9" Mar 13 15:10:20 crc kubenswrapper[4786]: I0313 15:10:20.565354 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" podUID="c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" containerName="registry" containerID="cri-o://6c55db000e74d08a764730984d25dc63942d8af978bb0cf22fa3a7e1ccb4a2db" gracePeriod=30 Mar 13 15:10:20 crc kubenswrapper[4786]: I0313 15:10:20.975247 4786 generic.go:334] "Generic (PLEG): container finished" podID="c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" containerID="6c55db000e74d08a764730984d25dc63942d8af978bb0cf22fa3a7e1ccb4a2db" exitCode=0 Mar 13 15:10:20 crc kubenswrapper[4786]: I0313 15:10:20.975421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" event={"ID":"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d","Type":"ContainerDied","Data":"6c55db000e74d08a764730984d25dc63942d8af978bb0cf22fa3a7e1ccb4a2db"} Mar 13 15:10:20 crc kubenswrapper[4786]: I0313 15:10:20.975804 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" event={"ID":"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d","Type":"ContainerDied","Data":"ce275b3c95baf9262e6476605c5f81f37f89bfa1626d3251bf4c13e02f056887"} Mar 13 15:10:20 crc kubenswrapper[4786]: I0313 15:10:20.975831 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce275b3c95baf9262e6476605c5f81f37f89bfa1626d3251bf4c13e02f056887" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.004508 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057294 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-tls\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057368 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-installation-pull-secrets\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057423 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-bound-sa-token\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" 
(UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057453 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szt5s\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-kube-api-access-szt5s\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057487 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-certificates\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057520 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-ca-trust-extracted\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057690 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.057735 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-trusted-ca\") pod \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\" (UID: \"c5f5e5d9-5a24-4552-bec7-8f0227e07e9d\") " Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.058509 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.058785 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.062589 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.064632 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.064782 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-kube-api-access-szt5s" (OuterVolumeSpecName: "kube-api-access-szt5s") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "kube-api-access-szt5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.064950 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.069454 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.084113 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" (UID: "c5f5e5d9-5a24-4552-bec7-8f0227e07e9d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.159788 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.159904 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szt5s\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-kube-api-access-szt5s\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.159949 4786 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.159975 4786 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.159999 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.160024 4786 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.160047 4786 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:21 crc 
kubenswrapper[4786]: I0313 15:10:21.750960 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.820070 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dstmd" Mar 13 15:10:21 crc kubenswrapper[4786]: I0313 15:10:21.982007 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ffbml" Mar 13 15:10:22 crc kubenswrapper[4786]: I0313 15:10:22.021598 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ffbml"] Mar 13 15:10:22 crc kubenswrapper[4786]: I0313 15:10:22.028706 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ffbml"] Mar 13 15:10:22 crc kubenswrapper[4786]: I0313 15:10:22.559716 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" path="/var/lib/kubelet/pods/c5f5e5d9-5a24-4552-bec7-8f0227e07e9d/volumes" Mar 13 15:10:37 crc kubenswrapper[4786]: I0313 15:10:37.869124 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:10:37 crc kubenswrapper[4786]: I0313 15:10:37.869734 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:11:07 crc kubenswrapper[4786]: I0313 15:11:07.868608 4786 
patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:11:07 crc kubenswrapper[4786]: I0313 15:11:07.869324 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:11:07 crc kubenswrapper[4786]: I0313 15:11:07.869385 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:11:07 crc kubenswrapper[4786]: I0313 15:11:07.870067 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8f95de79bd567add7c54012ca9cf8f384df817fb4c5a400948b699024ddf22f"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:11:07 crc kubenswrapper[4786]: I0313 15:11:07.870134 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://b8f95de79bd567add7c54012ca9cf8f384df817fb4c5a400948b699024ddf22f" gracePeriod=600 Mar 13 15:11:08 crc kubenswrapper[4786]: I0313 15:11:08.314190 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="b8f95de79bd567add7c54012ca9cf8f384df817fb4c5a400948b699024ddf22f" exitCode=0 Mar 13 15:11:08 crc 
kubenswrapper[4786]: I0313 15:11:08.314277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"b8f95de79bd567add7c54012ca9cf8f384df817fb4c5a400948b699024ddf22f"} Mar 13 15:11:08 crc kubenswrapper[4786]: I0313 15:11:08.314496 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"62d0b2ce1eb336ac4fba66f12731e23d24aaf160b69d1b42e5a0e33b26f90040"} Mar 13 15:11:08 crc kubenswrapper[4786]: I0313 15:11:08.314556 4786 scope.go:117] "RemoveContainer" containerID="203d3ac414580642373cb7ef4cb0395c0ecaa256b30ebd3d7c9b5254811af513" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.150167 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556912-zp2mk"] Mar 13 15:12:00 crc kubenswrapper[4786]: E0313 15:12:00.151354 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" containerName="registry" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.151388 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" containerName="registry" Mar 13 15:12:00 crc kubenswrapper[4786]: E0313 15:12:00.151443 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc69663e-8a65-4de4-972f-92b8abe2f715" containerName="oc" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.151462 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc69663e-8a65-4de4-972f-92b8abe2f715" containerName="oc" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.151714 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc69663e-8a65-4de4-972f-92b8abe2f715" containerName="oc" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 
15:12:00.151761 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f5e5d9-5a24-4552-bec7-8f0227e07e9d" containerName="registry" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.152578 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.158083 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-zp2mk"] Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.172564 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.173046 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.173224 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.267308 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px854\" (UniqueName: \"kubernetes.io/projected/54310c14-883f-4071-a914-907a4749d55d-kube-api-access-px854\") pod \"auto-csr-approver-29556912-zp2mk\" (UID: \"54310c14-883f-4071-a914-907a4749d55d\") " pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.368736 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px854\" (UniqueName: \"kubernetes.io/projected/54310c14-883f-4071-a914-907a4749d55d-kube-api-access-px854\") pod \"auto-csr-approver-29556912-zp2mk\" (UID: \"54310c14-883f-4071-a914-907a4749d55d\") " pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.404106 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-px854\" (UniqueName: \"kubernetes.io/projected/54310c14-883f-4071-a914-907a4749d55d-kube-api-access-px854\") pod \"auto-csr-approver-29556912-zp2mk\" (UID: \"54310c14-883f-4071-a914-907a4749d55d\") " pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.481249 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.727218 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-zp2mk"] Mar 13 15:12:00 crc kubenswrapper[4786]: I0313 15:12:00.750012 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:12:01 crc kubenswrapper[4786]: I0313 15:12:01.672327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" event={"ID":"54310c14-883f-4071-a914-907a4749d55d","Type":"ContainerStarted","Data":"7738c3fac2874187af58903d84c9de16e2bc7e552874c31106295c348d68b41c"} Mar 13 15:12:03 crc kubenswrapper[4786]: I0313 15:12:03.686620 4786 generic.go:334] "Generic (PLEG): container finished" podID="54310c14-883f-4071-a914-907a4749d55d" containerID="0c586000742afd995ab94f45fac2bafe638c77eb3b70e8f334797b831010644c" exitCode=0 Mar 13 15:12:03 crc kubenswrapper[4786]: I0313 15:12:03.687045 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" event={"ID":"54310c14-883f-4071-a914-907a4749d55d","Type":"ContainerDied","Data":"0c586000742afd995ab94f45fac2bafe638c77eb3b70e8f334797b831010644c"} Mar 13 15:12:04 crc kubenswrapper[4786]: I0313 15:12:04.942430 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:05 crc kubenswrapper[4786]: I0313 15:12:05.126693 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px854\" (UniqueName: \"kubernetes.io/projected/54310c14-883f-4071-a914-907a4749d55d-kube-api-access-px854\") pod \"54310c14-883f-4071-a914-907a4749d55d\" (UID: \"54310c14-883f-4071-a914-907a4749d55d\") " Mar 13 15:12:05 crc kubenswrapper[4786]: I0313 15:12:05.136298 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54310c14-883f-4071-a914-907a4749d55d-kube-api-access-px854" (OuterVolumeSpecName: "kube-api-access-px854") pod "54310c14-883f-4071-a914-907a4749d55d" (UID: "54310c14-883f-4071-a914-907a4749d55d"). InnerVolumeSpecName "kube-api-access-px854". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:12:05 crc kubenswrapper[4786]: I0313 15:12:05.228038 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px854\" (UniqueName: \"kubernetes.io/projected/54310c14-883f-4071-a914-907a4749d55d-kube-api-access-px854\") on node \"crc\" DevicePath \"\"" Mar 13 15:12:05 crc kubenswrapper[4786]: I0313 15:12:05.700927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" event={"ID":"54310c14-883f-4071-a914-907a4749d55d","Type":"ContainerDied","Data":"7738c3fac2874187af58903d84c9de16e2bc7e552874c31106295c348d68b41c"} Mar 13 15:12:05 crc kubenswrapper[4786]: I0313 15:12:05.700974 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7738c3fac2874187af58903d84c9de16e2bc7e552874c31106295c348d68b41c" Mar 13 15:12:05 crc kubenswrapper[4786]: I0313 15:12:05.700981 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-zp2mk" Mar 13 15:12:06 crc kubenswrapper[4786]: I0313 15:12:06.009668 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-bz7fd"] Mar 13 15:12:06 crc kubenswrapper[4786]: I0313 15:12:06.013953 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-bz7fd"] Mar 13 15:12:06 crc kubenswrapper[4786]: I0313 15:12:06.562997 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eda6a73-a8cf-406d-ab33-394ec1982f4a" path="/var/lib/kubelet/pods/1eda6a73-a8cf-406d-ab33-394ec1982f4a/volumes" Mar 13 15:12:51 crc kubenswrapper[4786]: I0313 15:12:51.837800 4786 scope.go:117] "RemoveContainer" containerID="6c55db000e74d08a764730984d25dc63942d8af978bb0cf22fa3a7e1ccb4a2db" Mar 13 15:13:37 crc kubenswrapper[4786]: I0313 15:13:37.869107 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:13:37 crc kubenswrapper[4786]: I0313 15:13:37.869709 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:13:51 crc kubenswrapper[4786]: I0313 15:13:51.878848 4786 scope.go:117] "RemoveContainer" containerID="f0706fa676c676803b709d131c215b589e47c181d768675fc2d4eff10a128f08" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.138782 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556914-qrwh4"] Mar 13 15:14:00 crc kubenswrapper[4786]: 
E0313 15:14:00.140095 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54310c14-883f-4071-a914-907a4749d55d" containerName="oc" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.140138 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="54310c14-883f-4071-a914-907a4749d55d" containerName="oc" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.140336 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="54310c14-883f-4071-a914-907a4749d55d" containerName="oc" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.141170 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-qrwh4" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.142793 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.144226 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.144403 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.147825 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-qrwh4"] Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.301167 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jws\" (UniqueName: \"kubernetes.io/projected/d731ba13-dc0b-479a-8d77-ac4679c41d7c-kube-api-access-v2jws\") pod \"auto-csr-approver-29556914-qrwh4\" (UID: \"d731ba13-dc0b-479a-8d77-ac4679c41d7c\") " pod="openshift-infra/auto-csr-approver-29556914-qrwh4" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.401877 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v2jws\" (UniqueName: \"kubernetes.io/projected/d731ba13-dc0b-479a-8d77-ac4679c41d7c-kube-api-access-v2jws\") pod \"auto-csr-approver-29556914-qrwh4\" (UID: \"d731ba13-dc0b-479a-8d77-ac4679c41d7c\") " pod="openshift-infra/auto-csr-approver-29556914-qrwh4" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.421691 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jws\" (UniqueName: \"kubernetes.io/projected/d731ba13-dc0b-479a-8d77-ac4679c41d7c-kube-api-access-v2jws\") pod \"auto-csr-approver-29556914-qrwh4\" (UID: \"d731ba13-dc0b-479a-8d77-ac4679c41d7c\") " pod="openshift-infra/auto-csr-approver-29556914-qrwh4" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.462825 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-qrwh4" Mar 13 15:14:00 crc kubenswrapper[4786]: I0313 15:14:00.638819 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-qrwh4"] Mar 13 15:14:01 crc kubenswrapper[4786]: I0313 15:14:01.523226 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-qrwh4" event={"ID":"d731ba13-dc0b-479a-8d77-ac4679c41d7c","Type":"ContainerStarted","Data":"fd2c34111130bbde591963c6b3c21808639eb460e50314694f334485b42bc530"} Mar 13 15:14:02 crc kubenswrapper[4786]: I0313 15:14:02.529133 4786 generic.go:334] "Generic (PLEG): container finished" podID="d731ba13-dc0b-479a-8d77-ac4679c41d7c" containerID="5b8883b5f73859d99542e8bc8deaecc6b973a654f66a87d6256d4bb174fdcd51" exitCode=0 Mar 13 15:14:02 crc kubenswrapper[4786]: I0313 15:14:02.529174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-qrwh4" event={"ID":"d731ba13-dc0b-479a-8d77-ac4679c41d7c","Type":"ContainerDied","Data":"5b8883b5f73859d99542e8bc8deaecc6b973a654f66a87d6256d4bb174fdcd51"} Mar 13 15:14:03 crc kubenswrapper[4786]: I0313 
15:14:03.726422 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-qrwh4" Mar 13 15:14:03 crc kubenswrapper[4786]: I0313 15:14:03.847054 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2jws\" (UniqueName: \"kubernetes.io/projected/d731ba13-dc0b-479a-8d77-ac4679c41d7c-kube-api-access-v2jws\") pod \"d731ba13-dc0b-479a-8d77-ac4679c41d7c\" (UID: \"d731ba13-dc0b-479a-8d77-ac4679c41d7c\") " Mar 13 15:14:03 crc kubenswrapper[4786]: I0313 15:14:03.853158 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d731ba13-dc0b-479a-8d77-ac4679c41d7c-kube-api-access-v2jws" (OuterVolumeSpecName: "kube-api-access-v2jws") pod "d731ba13-dc0b-479a-8d77-ac4679c41d7c" (UID: "d731ba13-dc0b-479a-8d77-ac4679c41d7c"). InnerVolumeSpecName "kube-api-access-v2jws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:14:03 crc kubenswrapper[4786]: I0313 15:14:03.948357 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2jws\" (UniqueName: \"kubernetes.io/projected/d731ba13-dc0b-479a-8d77-ac4679c41d7c-kube-api-access-v2jws\") on node \"crc\" DevicePath \"\"" Mar 13 15:14:04 crc kubenswrapper[4786]: I0313 15:14:04.540275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-qrwh4" event={"ID":"d731ba13-dc0b-479a-8d77-ac4679c41d7c","Type":"ContainerDied","Data":"fd2c34111130bbde591963c6b3c21808639eb460e50314694f334485b42bc530"} Mar 13 15:14:04 crc kubenswrapper[4786]: I0313 15:14:04.540644 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd2c34111130bbde591963c6b3c21808639eb460e50314694f334485b42bc530" Mar 13 15:14:04 crc kubenswrapper[4786]: I0313 15:14:04.540326 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-qrwh4"
Mar 13 15:14:04 crc kubenswrapper[4786]: I0313 15:14:04.780495 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-ckrmj"]
Mar 13 15:14:04 crc kubenswrapper[4786]: I0313 15:14:04.783790 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-ckrmj"]
Mar 13 15:14:06 crc kubenswrapper[4786]: I0313 15:14:06.558982 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73373245-c6fb-46a6-855e-321909bf5959" path="/var/lib/kubelet/pods/73373245-c6fb-46a6-855e-321909bf5959/volumes"
Mar 13 15:14:07 crc kubenswrapper[4786]: I0313 15:14:07.869026 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:14:07 crc kubenswrapper[4786]: I0313 15:14:07.869104 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:14:37 crc kubenswrapper[4786]: I0313 15:14:37.868327 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:14:37 crc kubenswrapper[4786]: I0313 15:14:37.869080 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:14:37 crc kubenswrapper[4786]: I0313 15:14:37.869139 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:14:37 crc kubenswrapper[4786]: I0313 15:14:37.870072 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62d0b2ce1eb336ac4fba66f12731e23d24aaf160b69d1b42e5a0e33b26f90040"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:14:37 crc kubenswrapper[4786]: I0313 15:14:37.870155 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://62d0b2ce1eb336ac4fba66f12731e23d24aaf160b69d1b42e5a0e33b26f90040" gracePeriod=600
Mar 13 15:14:38 crc kubenswrapper[4786]: I0313 15:14:38.738955 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="62d0b2ce1eb336ac4fba66f12731e23d24aaf160b69d1b42e5a0e33b26f90040" exitCode=0
Mar 13 15:14:38 crc kubenswrapper[4786]: I0313 15:14:38.739010 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"62d0b2ce1eb336ac4fba66f12731e23d24aaf160b69d1b42e5a0e33b26f90040"}
Mar 13 15:14:38 crc kubenswrapper[4786]: I0313 15:14:38.739626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"dad5dab593ccac2d22182d28c8abfa4af5554be94f9545ed96143d1052cb64d4"}
Mar 13 15:14:38 crc kubenswrapper[4786]: I0313 15:14:38.739657 4786 scope.go:117] "RemoveContainer" containerID="b8f95de79bd567add7c54012ca9cf8f384df817fb4c5a400948b699024ddf22f"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.139770 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"]
Mar 13 15:15:00 crc kubenswrapper[4786]: E0313 15:15:00.140432 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d731ba13-dc0b-479a-8d77-ac4679c41d7c" containerName="oc"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.140444 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d731ba13-dc0b-479a-8d77-ac4679c41d7c" containerName="oc"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.140700 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d731ba13-dc0b-479a-8d77-ac4679c41d7c" containerName="oc"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.141081 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.143553 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.145394 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.152624 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"]
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.281491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c965d5e-882a-4c8d-8290-ecddba18b208-secret-volume\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.281590 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c965d5e-882a-4c8d-8290-ecddba18b208-config-volume\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.281741 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n9pt\" (UniqueName: \"kubernetes.io/projected/2c965d5e-882a-4c8d-8290-ecddba18b208-kube-api-access-6n9pt\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.383277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c965d5e-882a-4c8d-8290-ecddba18b208-config-volume\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.383366 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n9pt\" (UniqueName: \"kubernetes.io/projected/2c965d5e-882a-4c8d-8290-ecddba18b208-kube-api-access-6n9pt\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.383428 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c965d5e-882a-4c8d-8290-ecddba18b208-secret-volume\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.385090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c965d5e-882a-4c8d-8290-ecddba18b208-config-volume\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.394163 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c965d5e-882a-4c8d-8290-ecddba18b208-secret-volume\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.402982 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n9pt\" (UniqueName: \"kubernetes.io/projected/2c965d5e-882a-4c8d-8290-ecddba18b208-kube-api-access-6n9pt\") pod \"collect-profiles-29556915-b5f5d\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.465687 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.660935 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"]
Mar 13 15:15:00 crc kubenswrapper[4786]: W0313 15:15:00.664553 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c965d5e_882a_4c8d_8290_ecddba18b208.slice/crio-b8dfd31385e75d58c4b724020f58fff8d4164254096bb7fbfd6907af7089a0f5 WatchSource:0}: Error finding container b8dfd31385e75d58c4b724020f58fff8d4164254096bb7fbfd6907af7089a0f5: Status 404 returned error can't find the container with id b8dfd31385e75d58c4b724020f58fff8d4164254096bb7fbfd6907af7089a0f5
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.877974 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d" event={"ID":"2c965d5e-882a-4c8d-8290-ecddba18b208","Type":"ContainerStarted","Data":"d753256315f1310bb28f0691628ce24ffb13984ac11fd1d3c1cff5812092c02e"}
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.878381 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d" event={"ID":"2c965d5e-882a-4c8d-8290-ecddba18b208","Type":"ContainerStarted","Data":"b8dfd31385e75d58c4b724020f58fff8d4164254096bb7fbfd6907af7089a0f5"}
Mar 13 15:15:00 crc kubenswrapper[4786]: I0313 15:15:00.898393 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d" podStartSLOduration=0.898378073 podStartE2EDuration="898.378073ms" podCreationTimestamp="2026-03-13 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:15:00.895280874 +0000 UTC m=+731.058492695" watchObservedRunningTime="2026-03-13 15:15:00.898378073 +0000 UTC m=+731.061589904"
Mar 13 15:15:01 crc kubenswrapper[4786]: I0313 15:15:01.884581 4786 generic.go:334] "Generic (PLEG): container finished" podID="2c965d5e-882a-4c8d-8290-ecddba18b208" containerID="d753256315f1310bb28f0691628ce24ffb13984ac11fd1d3c1cff5812092c02e" exitCode=0
Mar 13 15:15:01 crc kubenswrapper[4786]: I0313 15:15:01.884629 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d" event={"ID":"2c965d5e-882a-4c8d-8290-ecddba18b208","Type":"ContainerDied","Data":"d753256315f1310bb28f0691628ce24ffb13984ac11fd1d3c1cff5812092c02e"}
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.086014 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.224758 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c965d5e-882a-4c8d-8290-ecddba18b208-config-volume\") pod \"2c965d5e-882a-4c8d-8290-ecddba18b208\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") "
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.224847 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n9pt\" (UniqueName: \"kubernetes.io/projected/2c965d5e-882a-4c8d-8290-ecddba18b208-kube-api-access-6n9pt\") pod \"2c965d5e-882a-4c8d-8290-ecddba18b208\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") "
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.224907 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c965d5e-882a-4c8d-8290-ecddba18b208-secret-volume\") pod \"2c965d5e-882a-4c8d-8290-ecddba18b208\" (UID: \"2c965d5e-882a-4c8d-8290-ecddba18b208\") "
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.225294 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c965d5e-882a-4c8d-8290-ecddba18b208-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c965d5e-882a-4c8d-8290-ecddba18b208" (UID: "2c965d5e-882a-4c8d-8290-ecddba18b208"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.230208 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c965d5e-882a-4c8d-8290-ecddba18b208-kube-api-access-6n9pt" (OuterVolumeSpecName: "kube-api-access-6n9pt") pod "2c965d5e-882a-4c8d-8290-ecddba18b208" (UID: "2c965d5e-882a-4c8d-8290-ecddba18b208"). InnerVolumeSpecName "kube-api-access-6n9pt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.230241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c965d5e-882a-4c8d-8290-ecddba18b208-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c965d5e-882a-4c8d-8290-ecddba18b208" (UID: "2c965d5e-882a-4c8d-8290-ecddba18b208"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.326595 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c965d5e-882a-4c8d-8290-ecddba18b208-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.326639 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c965d5e-882a-4c8d-8290-ecddba18b208-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.326651 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n9pt\" (UniqueName: \"kubernetes.io/projected/2c965d5e-882a-4c8d-8290-ecddba18b208-kube-api-access-6n9pt\") on node \"crc\" DevicePath \"\""
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.900037 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d" event={"ID":"2c965d5e-882a-4c8d-8290-ecddba18b208","Type":"ContainerDied","Data":"b8dfd31385e75d58c4b724020f58fff8d4164254096bb7fbfd6907af7089a0f5"}
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.900077 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8dfd31385e75d58c4b724020f58fff8d4164254096bb7fbfd6907af7089a0f5"
Mar 13 15:15:03 crc kubenswrapper[4786]: I0313 15:15:03.900207 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"
Mar 13 15:15:51 crc kubenswrapper[4786]: I0313 15:15:51.952187 4786 scope.go:117] "RemoveContainer" containerID="b064e301025f633fca08b2c6c6d6715dd2fe77ad1d9b0ad6e22d7c4854ff4f2b"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.142635 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556916-lz92k"]
Mar 13 15:16:00 crc kubenswrapper[4786]: E0313 15:16:00.143737 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c965d5e-882a-4c8d-8290-ecddba18b208" containerName="collect-profiles"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.143765 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c965d5e-882a-4c8d-8290-ecddba18b208" containerName="collect-profiles"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.144017 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c965d5e-882a-4c8d-8290-ecddba18b208" containerName="collect-profiles"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.144653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.145897 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-lz92k"]
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.150646 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.150725 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.151264 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.207082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h22rg\" (UniqueName: \"kubernetes.io/projected/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad-kube-api-access-h22rg\") pod \"auto-csr-approver-29556916-lz92k\" (UID: \"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad\") " pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.309211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h22rg\" (UniqueName: \"kubernetes.io/projected/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad-kube-api-access-h22rg\") pod \"auto-csr-approver-29556916-lz92k\" (UID: \"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad\") " pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.330793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22rg\" (UniqueName: \"kubernetes.io/projected/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad-kube-api-access-h22rg\") pod \"auto-csr-approver-29556916-lz92k\" (UID: \"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad\") " pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.464979 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:00 crc kubenswrapper[4786]: I0313 15:16:00.696999 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-lz92k"]
Mar 13 15:16:01 crc kubenswrapper[4786]: I0313 15:16:01.398514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-lz92k" event={"ID":"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad","Type":"ContainerStarted","Data":"8a97bb8bc019b8d4e26ba341aaca91bb8aa875a35176beb670bc49449c8c1b78"}
Mar 13 15:16:02 crc kubenswrapper[4786]: I0313 15:16:02.407512 4786 generic.go:334] "Generic (PLEG): container finished" podID="8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad" containerID="ee23c4bdb46d75f37c05e047df4a8e16d06f52e8695adb4dd2ff95b6d53ae9e2" exitCode=0
Mar 13 15:16:02 crc kubenswrapper[4786]: I0313 15:16:02.407610 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-lz92k" event={"ID":"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad","Type":"ContainerDied","Data":"ee23c4bdb46d75f37c05e047df4a8e16d06f52e8695adb4dd2ff95b6d53ae9e2"}
Mar 13 15:16:03 crc kubenswrapper[4786]: I0313 15:16:03.620622 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:03 crc kubenswrapper[4786]: I0313 15:16:03.649028 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h22rg\" (UniqueName: \"kubernetes.io/projected/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad-kube-api-access-h22rg\") pod \"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad\" (UID: \"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad\") "
Mar 13 15:16:03 crc kubenswrapper[4786]: I0313 15:16:03.654995 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad-kube-api-access-h22rg" (OuterVolumeSpecName: "kube-api-access-h22rg") pod "8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad" (UID: "8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad"). InnerVolumeSpecName "kube-api-access-h22rg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:16:03 crc kubenswrapper[4786]: I0313 15:16:03.752115 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h22rg\" (UniqueName: \"kubernetes.io/projected/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad-kube-api-access-h22rg\") on node \"crc\" DevicePath \"\""
Mar 13 15:16:04 crc kubenswrapper[4786]: I0313 15:16:04.419568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-lz92k" event={"ID":"8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad","Type":"ContainerDied","Data":"8a97bb8bc019b8d4e26ba341aaca91bb8aa875a35176beb670bc49449c8c1b78"}
Mar 13 15:16:04 crc kubenswrapper[4786]: I0313 15:16:04.419613 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a97bb8bc019b8d4e26ba341aaca91bb8aa875a35176beb670bc49449c8c1b78"
Mar 13 15:16:04 crc kubenswrapper[4786]: I0313 15:16:04.419631 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-lz92k"
Mar 13 15:16:04 crc kubenswrapper[4786]: I0313 15:16:04.677155 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-6976f"]
Mar 13 15:16:04 crc kubenswrapper[4786]: I0313 15:16:04.680601 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-6976f"]
Mar 13 15:16:06 crc kubenswrapper[4786]: I0313 15:16:06.567953 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc69663e-8a65-4de4-972f-92b8abe2f715" path="/var/lib/kubelet/pods/fc69663e-8a65-4de4-972f-92b8abe2f715/volumes"
Mar 13 15:16:18 crc kubenswrapper[4786]: I0313 15:16:18.320258 4786 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.358916 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7b6g9"]
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.359877 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-controller" containerID="cri-o://844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.359934 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="northd" containerID="cri-o://99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.359997 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-acl-logging" containerID="cri-o://9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.360027 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="nbdb" containerID="cri-o://23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.360005 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.360022 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="sbdb" containerID="cri-o://da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.361326 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-node" containerID="cri-o://390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.394346 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" containerID="cri-o://c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d" gracePeriod=30
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.539300 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6a64e5_e5ca_401a_9653_e0419f9f46c4.slice/crio-c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6a64e5_e5ca_401a_9653_e0419f9f46c4.slice/crio-23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.639128 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovnkube-controller/3.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.641483 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovn-acl-logging/0.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.641830 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovn-controller/0.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642206 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d" exitCode=0
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642232 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5" exitCode=0
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642240 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58" exitCode=0
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642247 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae" exitCode=0
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642254 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4" exitCode=0
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642260 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d" exitCode=143
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642266 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9" exitCode=143
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642340 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642354 4786 scope.go:117] "RemoveContainer" containerID="4efe201d148fcbf7cb14236bfc27c1ff698f6f363db0fe642aa1d37d77471497"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642386 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.642395 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.644054 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/2.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.645242 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/1.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.645281 4786 generic.go:334] "Generic (PLEG): container finished" podID="930a5a92-be71-4866-aa6f-95a98647bc33" containerID="459051b7219350f8c61d97d648b668c00d787af9c9750e356aae1c8c21b74b3a" exitCode=2
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.645306 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerDied","Data":"459051b7219350f8c61d97d648b668c00d787af9c9750e356aae1c8c21b74b3a"}
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.645801 4786 scope.go:117] "RemoveContainer" containerID="459051b7219350f8c61d97d648b668c00d787af9c9750e356aae1c8c21b74b3a"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.708702 4786 scope.go:117] "RemoveContainer" containerID="de35f5128060f814edc295ed0a6a11acb9b534f38749165df712730db4990383"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.719309 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovn-acl-logging/0.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.719758 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovn-controller/0.log"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.720659 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.773827 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mb27"]
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774123 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774150 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774163 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774172 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774186 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774194 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774208 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774216 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774228 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="sbdb"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774237 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="sbdb"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774249 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774256 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774269 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="nbdb"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774279 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="nbdb"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774290 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="northd"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774298 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="northd"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774311 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-node"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774318 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-node"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774332 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-acl-logging"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774340 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-acl-logging"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774354 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kubecfg-setup"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774361 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kubecfg-setup"
Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774372 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad" containerName="oc"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774379 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad" containerName="oc"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774510 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774524 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774534 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="sbdb"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774546 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="northd"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774558 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-acl-logging"
Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774568 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="kube-rbac-proxy-node" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774575 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="nbdb" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774587 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovn-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774593 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774603 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774610 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad" containerName="oc" Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774727 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774737 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: E0313 15:16:40.774750 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.774758 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" 
containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.775448 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.775478 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerName="ovnkube-controller" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.778395 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-openvswitch\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835358 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-systemd-units\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-node-log\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835439 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-systemd\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: 
\"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-script-lib\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835484 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-log-socket\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835518 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-etc-openvswitch\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835469 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835540 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-netd\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835481 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835563 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-env-overrides\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835581 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-var-lib-openvswitch\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835514 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-node-log" (OuterVolumeSpecName: "node-log") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835557 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-log-socket" (OuterVolumeSpecName: "log-socket") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835601 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835658 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835676 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-slash\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835694 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835711 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77lq5\" (UniqueName: \"kubernetes.io/projected/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-kube-api-access-77lq5\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835736 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-bin\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835764 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-ovn-kubernetes\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 
15:16:40.835789 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-kubelet\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835716 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835731 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-slash" (OuterVolumeSpecName: "host-slash") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835811 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835900 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835877 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-netns\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835924 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835940 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-config\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835876 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.835971 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovn-node-metrics-cert\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836103 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-ovn\") pod \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\" (UID: \"0c6a64e5-e5ca-401a-9653-e0419f9f46c4\") " Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836197 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-var-lib-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836358 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-ovnkube-script-lib\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836399 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-node-log\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836424 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836431 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-cni-bin\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836517 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-cni-netd\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836560 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-log-socket\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836610 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-ovn\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836672 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-run-netns\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836719 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-slash\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836782 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-systemd-units\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836822 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-env-overrides\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836873 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcj22\" (UniqueName: \"kubernetes.io/projected/47665274-ba43-4f47-8d21-ad401c38d1ac-kube-api-access-zcj22\") pod 
\"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.836975 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-systemd\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837003 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837025 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837058 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-etc-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-kubelet\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837110 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-ovnkube-config\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837140 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47665274-ba43-4f47-8d21-ad401c38d1ac-ovn-node-metrics-cert\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837218 4786 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837233 4786 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837246 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837259 4786 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837273 4786 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837284 4786 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837296 4786 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837309 4786 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837321 4786 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837334 4786 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837348 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837360 4786 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837385 4786 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837398 4786 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837413 4786 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837425 4786 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.837505 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.841956 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.842299 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-kube-api-access-77lq5" (OuterVolumeSpecName: "kube-api-access-77lq5") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "kube-api-access-77lq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.850396 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0c6a64e5-e5ca-401a-9653-e0419f9f46c4" (UID: "0c6a64e5-e5ca-401a-9653-e0419f9f46c4"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcj22\" (UniqueName: \"kubernetes.io/projected/47665274-ba43-4f47-8d21-ad401c38d1ac-kube-api-access-zcj22\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-systemd\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938513 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938529 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mb27\" (UID: 
\"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938554 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-etc-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938607 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-etc-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-kubelet\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938677 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mb27\" (UID: 
\"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938684 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-ovnkube-config\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-kubelet\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938657 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938780 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47665274-ba43-4f47-8d21-ad401c38d1ac-ovn-node-metrics-cert\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938849 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-var-lib-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938932 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-ovnkube-script-lib\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-node-log\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-cni-bin\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-cni-netd\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939142 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-log-socket\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 
15:16:40.939170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-var-lib-openvswitch\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.938663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-systemd\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939217 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-ovn\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939281 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-run-ovn\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-ovnkube-config\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-log-socket\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-cni-bin\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939583 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-node-log\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939728 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-cni-netd\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939732 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-run-netns\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939973 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-slash\") pod \"ovnkube-node-9mb27\" (UID: 
\"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.939771 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-run-netns\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940166 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-host-slash\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940278 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-systemd-units\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940359 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-env-overrides\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940495 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77lq5\" (UniqueName: \"kubernetes.io/projected/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-kube-api-access-77lq5\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940561 4786 reconciler_common.go:293] "Volume detached for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47665274-ba43-4f47-8d21-ad401c38d1ac-systemd-units\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940687 4786 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.940742 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c6a64e5-e5ca-401a-9653-e0419f9f46c4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.941322 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-env-overrides\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.941875 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47665274-ba43-4f47-8d21-ad401c38d1ac-ovnkube-script-lib\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.944489 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/47665274-ba43-4f47-8d21-ad401c38d1ac-ovn-node-metrics-cert\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:40 crc kubenswrapper[4786]: I0313 15:16:40.956160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcj22\" (UniqueName: \"kubernetes.io/projected/47665274-ba43-4f47-8d21-ad401c38d1ac-kube-api-access-zcj22\") pod \"ovnkube-node-9mb27\" (UID: \"47665274-ba43-4f47-8d21-ad401c38d1ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.104124 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:41 crc kubenswrapper[4786]: W0313 15:16:41.124607 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47665274_ba43_4f47_8d21_ad401c38d1ac.slice/crio-259c58f80e52d3158ed550ac39e3248ba0c52b05852d2e2c59ece4592fc03c55 WatchSource:0}: Error finding container 259c58f80e52d3158ed550ac39e3248ba0c52b05852d2e2c59ece4592fc03c55: Status 404 returned error can't find the container with id 259c58f80e52d3158ed550ac39e3248ba0c52b05852d2e2c59ece4592fc03c55 Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.652241 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mvpcz_930a5a92-be71-4866-aa6f-95a98647bc33/kube-multus/2.log" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.652435 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mvpcz" event={"ID":"930a5a92-be71-4866-aa6f-95a98647bc33","Type":"ContainerStarted","Data":"f7c971953741de41fb4dc2cc8bc78b3b342359eafcc6effda0b86a9457f7f16b"} Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.656032 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovn-acl-logging/0.log" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.656593 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7b6g9_0c6a64e5-e5ca-401a-9653-e0419f9f46c4/ovn-controller/0.log" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.656957 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" containerID="99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990" exitCode=0 Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.657006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990"} Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.657045 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" event={"ID":"0c6a64e5-e5ca-401a-9653-e0419f9f46c4","Type":"ContainerDied","Data":"7d571718649b373f903e968935f37898a3c1fa9c475267b78810fab24c60dc67"} Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.657053 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7b6g9" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.657060 4786 scope.go:117] "RemoveContainer" containerID="c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.658286 4786 generic.go:334] "Generic (PLEG): container finished" podID="47665274-ba43-4f47-8d21-ad401c38d1ac" containerID="82d370ba8031715d718532ed3134c581a3fa103775a77e54338e7f01b916e88b" exitCode=0 Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.658321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerDied","Data":"82d370ba8031715d718532ed3134c581a3fa103775a77e54338e7f01b916e88b"} Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.658346 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"259c58f80e52d3158ed550ac39e3248ba0c52b05852d2e2c59ece4592fc03c55"} Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.678938 4786 scope.go:117] "RemoveContainer" containerID="da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.720950 4786 scope.go:117] "RemoveContainer" containerID="23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.753807 4786 scope.go:117] "RemoveContainer" containerID="99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.753997 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7b6g9"] Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.760609 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-7b6g9"] Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.775390 4786 scope.go:117] "RemoveContainer" containerID="4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.801491 4786 scope.go:117] "RemoveContainer" containerID="390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.816525 4786 scope.go:117] "RemoveContainer" containerID="9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.850206 4786 scope.go:117] "RemoveContainer" containerID="844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.863826 4786 scope.go:117] "RemoveContainer" containerID="0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.878465 4786 scope.go:117] "RemoveContainer" containerID="c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.878814 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d\": container with ID starting with c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d not found: ID does not exist" containerID="c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.879423 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d"} err="failed to get container status \"c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d\": rpc error: code = NotFound desc = could not find container 
\"c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d\": container with ID starting with c89946507c12db7f015a230723f252016996b58efde28027d4f8c4743faa9a1d not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.879453 4786 scope.go:117] "RemoveContainer" containerID="da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.879918 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\": container with ID starting with da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5 not found: ID does not exist" containerID="da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.880047 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5"} err="failed to get container status \"da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\": rpc error: code = NotFound desc = could not find container \"da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5\": container with ID starting with da3c5f0b399e7aaa0f20489eb45bd7233bf956f638558b70231b9732a71122d5 not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.880061 4786 scope.go:117] "RemoveContainer" containerID="23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.880897 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\": container with ID starting with 23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58 not found: ID does not exist" 
containerID="23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.880932 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58"} err="failed to get container status \"23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\": rpc error: code = NotFound desc = could not find container \"23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58\": container with ID starting with 23517c0b6806f732b9b74c0f28a7d592fe49ce3ff887889d7c75424cc9c4ff58 not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.880972 4786 scope.go:117] "RemoveContainer" containerID="99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.881231 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\": container with ID starting with 99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990 not found: ID does not exist" containerID="99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.881253 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990"} err="failed to get container status \"99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\": rpc error: code = NotFound desc = could not find container \"99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990\": container with ID starting with 99e64165af94aaa43ea0f7d4427e03e55079e256eb1cf1dc6686bc70e3305990 not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.881268 4786 scope.go:117] 
"RemoveContainer" containerID="4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.881503 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\": container with ID starting with 4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae not found: ID does not exist" containerID="4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.881533 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae"} err="failed to get container status \"4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\": rpc error: code = NotFound desc = could not find container \"4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae\": container with ID starting with 4520eefc89b8815c0063eb5176078b972896418583fb5730c76c84b6ced7ebae not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.881550 4786 scope.go:117] "RemoveContainer" containerID="390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.881764 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\": container with ID starting with 390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4 not found: ID does not exist" containerID="390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.881788 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4"} err="failed to get container status \"390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\": rpc error: code = NotFound desc = could not find container \"390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4\": container with ID starting with 390598d17946d931e1abcd966a7580ef77c505f1bffecb0f3765aef875ec0da4 not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.881808 4786 scope.go:117] "RemoveContainer" containerID="9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.882050 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\": container with ID starting with 9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d not found: ID does not exist" containerID="9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.882073 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d"} err="failed to get container status \"9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\": rpc error: code = NotFound desc = could not find container \"9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d\": container with ID starting with 9240e3a1326d209bc5a56f95636e96ec7e38e37bf1b0e4cc334fe12088b22a4d not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.882089 4786 scope.go:117] "RemoveContainer" containerID="844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.882326 4786 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\": container with ID starting with 844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9 not found: ID does not exist" containerID="844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.882351 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9"} err="failed to get container status \"844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\": rpc error: code = NotFound desc = could not find container \"844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9\": container with ID starting with 844aa743a6242e922471f32af114cfddfc7f9a421acc5f6af76f4322b403adb9 not found: ID does not exist" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.882367 4786 scope.go:117] "RemoveContainer" containerID="0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128" Mar 13 15:16:41 crc kubenswrapper[4786]: E0313 15:16:41.882595 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\": container with ID starting with 0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128 not found: ID does not exist" containerID="0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128" Mar 13 15:16:41 crc kubenswrapper[4786]: I0313 15:16:41.882633 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128"} err="failed to get container status \"0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\": rpc error: code = NotFound desc = could not find container 
\"0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128\": container with ID starting with 0b1376f8db3f0c33bdff0357f4de2ff440284d947ca5aa13cf28eac3be5ca128 not found: ID does not exist" Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.559492 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6a64e5-e5ca-401a-9653-e0419f9f46c4" path="/var/lib/kubelet/pods/0c6a64e5-e5ca-401a-9653-e0419f9f46c4/volumes" Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.672505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"2c7dc304e83bcb959d3f5e8d7dcef79b2612327fe444816183ce3f776119b098"} Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.672552 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"ac11d03968d9d934273549aad39c0183c8fc5a4641cdadfeff2acff79327a8ba"} Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.672568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"beab3e9f2db16c979c5ceb8f16696ede2b8b7abf55ef7604c701df2bdf1a4673"} Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.672581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"fa2866589a1ad7f5419bb24cc32f8c9917307e154e7b6ae43a020a1cdd846b16"} Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.672597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" 
event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"977ebf6e18a5598991ae83c483aa17d889fd140d59c5fa82be2c05b39c944f7d"} Mar 13 15:16:42 crc kubenswrapper[4786]: I0313 15:16:42.672608 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"0667faf199ee0760edb4df25c09d20a9c6ae42a3c37f2ba17341f33ad33a3ae4"} Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.752351 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jbw4r"] Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.753184 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.755711 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.755936 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.755946 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.756331 4786 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-kbgsj" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.776227 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwj7j\" (UniqueName: \"kubernetes.io/projected/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-kube-api-access-qwj7j\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.776314 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-node-mnt\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.776337 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-crc-storage\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.877809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwj7j\" (UniqueName: \"kubernetes.io/projected/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-kube-api-access-qwj7j\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.878071 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-node-mnt\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.878132 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-crc-storage\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.878525 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-node-mnt\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.879712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-crc-storage\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:43 crc kubenswrapper[4786]: I0313 15:16:43.912832 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwj7j\" (UniqueName: \"kubernetes.io/projected/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-kube-api-access-qwj7j\") pod \"crc-storage-crc-jbw4r\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:44 crc kubenswrapper[4786]: I0313 15:16:44.078476 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:44 crc kubenswrapper[4786]: E0313 15:16:44.106107 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(157ee0ce9b1004b09ffc2afad68a936e4cd0c96acfb878e786cdc9bb09259174): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:16:44 crc kubenswrapper[4786]: E0313 15:16:44.106288 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(157ee0ce9b1004b09ffc2afad68a936e4cd0c96acfb878e786cdc9bb09259174): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:44 crc kubenswrapper[4786]: E0313 15:16:44.106362 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(157ee0ce9b1004b09ffc2afad68a936e4cd0c96acfb878e786cdc9bb09259174): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:44 crc kubenswrapper[4786]: E0313 15:16:44.106431 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-jbw4r_crc-storage(5c1f7e2a-a7a0-463c-ad77-07674d37c7a2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-jbw4r_crc-storage(5c1f7e2a-a7a0-463c-ad77-07674d37c7a2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(157ee0ce9b1004b09ffc2afad68a936e4cd0c96acfb878e786cdc9bb09259174): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-jbw4r" podUID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" Mar 13 15:16:45 crc kubenswrapper[4786]: I0313 15:16:45.697642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"9f8e25f41d6ee298db296059c72cf282bad70c402263949bfae4e82775b0c96d"} Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.711159 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" event={"ID":"47665274-ba43-4f47-8d21-ad401c38d1ac","Type":"ContainerStarted","Data":"29af6a11df0dd48bc35c7f5497792b820475eb4163d9f71e1d7b85756b640e41"} Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.711600 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.735469 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" podStartSLOduration=7.735450221 podStartE2EDuration="7.735450221s" podCreationTimestamp="2026-03-13 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:16:47.734611569 +0000 UTC m=+837.897823400" watchObservedRunningTime="2026-03-13 15:16:47.735450221 +0000 UTC m=+837.898662042" Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.739944 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.900274 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jbw4r"] Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.900432 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:47 crc kubenswrapper[4786]: I0313 15:16:47.901101 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:47 crc kubenswrapper[4786]: E0313 15:16:47.930464 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(fd75b16a15a8796bb1f5dfd4e744a76f71d1a41aef6bae14136cce7aa1c5783f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 15:16:47 crc kubenswrapper[4786]: E0313 15:16:47.930533 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(fd75b16a15a8796bb1f5dfd4e744a76f71d1a41aef6bae14136cce7aa1c5783f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:47 crc kubenswrapper[4786]: E0313 15:16:47.930561 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(fd75b16a15a8796bb1f5dfd4e744a76f71d1a41aef6bae14136cce7aa1c5783f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:47 crc kubenswrapper[4786]: E0313 15:16:47.930612 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-jbw4r_crc-storage(5c1f7e2a-a7a0-463c-ad77-07674d37c7a2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-jbw4r_crc-storage(5c1f7e2a-a7a0-463c-ad77-07674d37c7a2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-jbw4r_crc-storage_5c1f7e2a-a7a0-463c-ad77-07674d37c7a2_0(fd75b16a15a8796bb1f5dfd4e744a76f71d1a41aef6bae14136cce7aa1c5783f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-jbw4r" podUID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" Mar 13 15:16:48 crc kubenswrapper[4786]: I0313 15:16:48.718680 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:48 crc kubenswrapper[4786]: I0313 15:16:48.718726 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:48 crc kubenswrapper[4786]: I0313 15:16:48.795891 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:16:52 crc kubenswrapper[4786]: I0313 15:16:52.016540 4786 scope.go:117] "RemoveContainer" containerID="01af5470493edc97e626069e9859171986db6f4535e71cd638f9fbf5158bf999" Mar 13 15:16:58 crc kubenswrapper[4786]: I0313 15:16:58.551519 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:58 crc kubenswrapper[4786]: I0313 15:16:58.552080 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:16:58 crc kubenswrapper[4786]: I0313 15:16:58.755727 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jbw4r"] Mar 13 15:16:58 crc kubenswrapper[4786]: I0313 15:16:58.774365 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jbw4r" event={"ID":"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2","Type":"ContainerStarted","Data":"66e2d86f11e8642daac15e944446b51b5616209a6c20e048c8e99418c2638912"} Mar 13 15:17:02 crc kubenswrapper[4786]: I0313 15:17:02.801747 4786 generic.go:334] "Generic (PLEG): container finished" podID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" containerID="7ff3ba39c44e7b17afc354eac43d6001c39d03f38c1740a77085c938ee044094" exitCode=0 Mar 13 15:17:02 crc kubenswrapper[4786]: I0313 15:17:02.801825 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jbw4r" event={"ID":"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2","Type":"ContainerDied","Data":"7ff3ba39c44e7b17afc354eac43d6001c39d03f38c1740a77085c938ee044094"} Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.024507 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.150974 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-crc-storage\") pod \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.151088 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-node-mnt\") pod \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.151113 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwj7j\" (UniqueName: \"kubernetes.io/projected/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-kube-api-access-qwj7j\") pod \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\" (UID: \"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2\") " Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.151188 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" (UID: "5c1f7e2a-a7a0-463c-ad77-07674d37c7a2"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.152226 4786 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.161035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-kube-api-access-qwj7j" (OuterVolumeSpecName: "kube-api-access-qwj7j") pod "5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" (UID: "5c1f7e2a-a7a0-463c-ad77-07674d37c7a2"). InnerVolumeSpecName "kube-api-access-qwj7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.169633 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" (UID: "5c1f7e2a-a7a0-463c-ad77-07674d37c7a2"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.253463 4786 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.253514 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwj7j\" (UniqueName: \"kubernetes.io/projected/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2-kube-api-access-qwj7j\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.819309 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jbw4r" event={"ID":"5c1f7e2a-a7a0-463c-ad77-07674d37c7a2","Type":"ContainerDied","Data":"66e2d86f11e8642daac15e944446b51b5616209a6c20e048c8e99418c2638912"} Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.819365 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e2d86f11e8642daac15e944446b51b5616209a6c20e048c8e99418c2638912" Mar 13 15:17:04 crc kubenswrapper[4786]: I0313 15:17:04.819428 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jbw4r" Mar 13 15:17:07 crc kubenswrapper[4786]: I0313 15:17:07.868509 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:17:07 crc kubenswrapper[4786]: I0313 15:17:07.868889 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.533416 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m"] Mar 13 15:17:10 crc kubenswrapper[4786]: E0313 15:17:10.535106 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" containerName="storage" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.535174 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" containerName="storage" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.535362 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" containerName="storage" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.536194 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.539946 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m"] Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.541737 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.637135 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv9wq\" (UniqueName: \"kubernetes.io/projected/e1b88d4e-85fd-4163-987c-d60b5653637b-kube-api-access-sv9wq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.637203 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.637303 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: 
I0313 15:17:10.738587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.738638 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.738758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv9wq\" (UniqueName: \"kubernetes.io/projected/e1b88d4e-85fd-4163-987c-d60b5653637b-kube-api-access-sv9wq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.739193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.739210 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.765117 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv9wq\" (UniqueName: \"kubernetes.io/projected/e1b88d4e-85fd-4163-987c-d60b5653637b-kube-api-access-sv9wq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:10 crc kubenswrapper[4786]: I0313 15:17:10.894195 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" Mar 13 15:17:11 crc kubenswrapper[4786]: I0313 15:17:11.127504 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mb27" Mar 13 15:17:11 crc kubenswrapper[4786]: I0313 15:17:11.304408 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m"] Mar 13 15:17:11 crc kubenswrapper[4786]: I0313 15:17:11.864707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" event={"ID":"e1b88d4e-85fd-4163-987c-d60b5653637b","Type":"ContainerStarted","Data":"6e1cd24939833db2bc029c1e44f3d09afd84059fab6db59922a8cb326e0318c3"} Mar 13 15:17:11 crc kubenswrapper[4786]: I0313 15:17:11.865046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" 
event={"ID":"e1b88d4e-85fd-4163-987c-d60b5653637b","Type":"ContainerStarted","Data":"6175a96b54a7cb50d2a207264c51011daae7abb6afb3948002a42a57fb6cd121"}
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.228346 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zxmjf"]
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.229385 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.245974 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxmjf"]
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.256034 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-catalog-content\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.256175 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-utilities\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.256353 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n5dc\" (UniqueName: \"kubernetes.io/projected/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-kube-api-access-8n5dc\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.358097 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-catalog-content\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.358148 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-utilities\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.358207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n5dc\" (UniqueName: \"kubernetes.io/projected/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-kube-api-access-8n5dc\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.358797 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-catalog-content\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.358850 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-utilities\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.378932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n5dc\" (UniqueName: \"kubernetes.io/projected/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-kube-api-access-8n5dc\") pod \"redhat-operators-zxmjf\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.544142 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.776162 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxmjf"]
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.871582 4786 generic.go:334] "Generic (PLEG): container finished" podID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerID="6e1cd24939833db2bc029c1e44f3d09afd84059fab6db59922a8cb326e0318c3" exitCode=0
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.871652 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" event={"ID":"e1b88d4e-85fd-4163-987c-d60b5653637b","Type":"ContainerDied","Data":"6e1cd24939833db2bc029c1e44f3d09afd84059fab6db59922a8cb326e0318c3"}
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.873383 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 15:17:12 crc kubenswrapper[4786]: I0313 15:17:12.875941 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerStarted","Data":"05d05a846a8e59af288f8a654272307c9bf677c265770ac414cc5f2f6b5ec5f9"}
Mar 13 15:17:13 crc kubenswrapper[4786]: I0313 15:17:13.884937 4786 generic.go:334] "Generic (PLEG): container finished" podID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerID="97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9" exitCode=0
Mar 13 15:17:13 crc kubenswrapper[4786]: I0313 15:17:13.885066 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerDied","Data":"97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9"}
Mar 13 15:17:14 crc kubenswrapper[4786]: I0313 15:17:14.893114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerStarted","Data":"3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647"}
Mar 13 15:17:14 crc kubenswrapper[4786]: I0313 15:17:14.896176 4786 generic.go:334] "Generic (PLEG): container finished" podID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerID="d0d708210f4aabdfeac8e4d27eae6dc90b821da509353bb7611a1a155f7509f5" exitCode=0
Mar 13 15:17:14 crc kubenswrapper[4786]: I0313 15:17:14.896223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" event={"ID":"e1b88d4e-85fd-4163-987c-d60b5653637b","Type":"ContainerDied","Data":"d0d708210f4aabdfeac8e4d27eae6dc90b821da509353bb7611a1a155f7509f5"}
Mar 13 15:17:15 crc kubenswrapper[4786]: I0313 15:17:15.903189 4786 generic.go:334] "Generic (PLEG): container finished" podID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerID="3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647" exitCode=0
Mar 13 15:17:15 crc kubenswrapper[4786]: I0313 15:17:15.903314 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerDied","Data":"3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647"}
Mar 13 15:17:15 crc kubenswrapper[4786]: I0313 15:17:15.906536 4786 generic.go:334] "Generic (PLEG): container finished" podID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerID="aa21c094b241d35e2ee91a67fefabbbd1b428fcafe3ac3ffaaef208d22c2ad6d" exitCode=0
Mar 13 15:17:15 crc kubenswrapper[4786]: I0313 15:17:15.906590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" event={"ID":"e1b88d4e-85fd-4163-987c-d60b5653637b","Type":"ContainerDied","Data":"aa21c094b241d35e2ee91a67fefabbbd1b428fcafe3ac3ffaaef208d22c2ad6d"}
Mar 13 15:17:16 crc kubenswrapper[4786]: I0313 15:17:16.915674 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerStarted","Data":"f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544"}
Mar 13 15:17:16 crc kubenswrapper[4786]: I0313 15:17:16.942176 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxmjf" podStartSLOduration=2.283511886 podStartE2EDuration="4.94215373s" podCreationTimestamp="2026-03-13 15:17:12 +0000 UTC" firstStartedPulling="2026-03-13 15:17:13.88698147 +0000 UTC m=+864.050193331" lastFinishedPulling="2026-03-13 15:17:16.545623324 +0000 UTC m=+866.708835175" observedRunningTime="2026-03-13 15:17:16.935648235 +0000 UTC m=+867.098860096" watchObservedRunningTime="2026-03-13 15:17:16.94215373 +0000 UTC m=+867.105365581"
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.163708 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m"
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.221091 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-util\") pod \"e1b88d4e-85fd-4163-987c-d60b5653637b\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") "
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.221171 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-bundle\") pod \"e1b88d4e-85fd-4163-987c-d60b5653637b\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") "
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.221216 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv9wq\" (UniqueName: \"kubernetes.io/projected/e1b88d4e-85fd-4163-987c-d60b5653637b-kube-api-access-sv9wq\") pod \"e1b88d4e-85fd-4163-987c-d60b5653637b\" (UID: \"e1b88d4e-85fd-4163-987c-d60b5653637b\") "
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.221707 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-bundle" (OuterVolumeSpecName: "bundle") pod "e1b88d4e-85fd-4163-987c-d60b5653637b" (UID: "e1b88d4e-85fd-4163-987c-d60b5653637b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.227453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b88d4e-85fd-4163-987c-d60b5653637b-kube-api-access-sv9wq" (OuterVolumeSpecName: "kube-api-access-sv9wq") pod "e1b88d4e-85fd-4163-987c-d60b5653637b" (UID: "e1b88d4e-85fd-4163-987c-d60b5653637b"). InnerVolumeSpecName "kube-api-access-sv9wq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.231698 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-util" (OuterVolumeSpecName: "util") pod "e1b88d4e-85fd-4163-987c-d60b5653637b" (UID: "e1b88d4e-85fd-4163-987c-d60b5653637b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.322827 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.322890 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv9wq\" (UniqueName: \"kubernetes.io/projected/e1b88d4e-85fd-4163-987c-d60b5653637b-kube-api-access-sv9wq\") on node \"crc\" DevicePath \"\""
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.322901 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1b88d4e-85fd-4163-987c-d60b5653637b-util\") on node \"crc\" DevicePath \"\""
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.922655 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m"
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.922640 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m" event={"ID":"e1b88d4e-85fd-4163-987c-d60b5653637b","Type":"ContainerDied","Data":"6175a96b54a7cb50d2a207264c51011daae7abb6afb3948002a42a57fb6cd121"}
Mar 13 15:17:17 crc kubenswrapper[4786]: I0313 15:17:17.922716 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6175a96b54a7cb50d2a207264c51011daae7abb6afb3948002a42a57fb6cd121"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.891680 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"]
Mar 13 15:17:21 crc kubenswrapper[4786]: E0313 15:17:21.892498 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="extract"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.892513 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="extract"
Mar 13 15:17:21 crc kubenswrapper[4786]: E0313 15:17:21.892525 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="util"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.892532 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="util"
Mar 13 15:17:21 crc kubenswrapper[4786]: E0313 15:17:21.892549 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="pull"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.892558 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="pull"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.892675 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b88d4e-85fd-4163-987c-d60b5653637b" containerName="extract"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.893150 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.894616 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9l5hm"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.894722 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.895751 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.904889 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"]
Mar 13 15:17:21 crc kubenswrapper[4786]: I0313 15:17:21.980118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzsmk\" (UniqueName: \"kubernetes.io/projected/7a77aa5e-e467-4edf-8db1-6961b154d940-kube-api-access-gzsmk\") pod \"nmstate-operator-796d4cfff4-p4cvd\" (UID: \"7a77aa5e-e467-4edf-8db1-6961b154d940\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.081320 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzsmk\" (UniqueName: \"kubernetes.io/projected/7a77aa5e-e467-4edf-8db1-6961b154d940-kube-api-access-gzsmk\") pod \"nmstate-operator-796d4cfff4-p4cvd\" (UID: \"7a77aa5e-e467-4edf-8db1-6961b154d940\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.099385 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzsmk\" (UniqueName: \"kubernetes.io/projected/7a77aa5e-e467-4edf-8db1-6961b154d940-kube-api-access-gzsmk\") pod \"nmstate-operator-796d4cfff4-p4cvd\" (UID: \"7a77aa5e-e467-4edf-8db1-6961b154d940\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.208807 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.413587 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd"]
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.544584 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.544647 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxmjf"
Mar 13 15:17:22 crc kubenswrapper[4786]: I0313 15:17:22.948326 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd" event={"ID":"7a77aa5e-e467-4edf-8db1-6961b154d940","Type":"ContainerStarted","Data":"4166067c98c5a23f6115602b92d7a420e176e48e6ebb898e812a5b6ee076ace5"}
Mar 13 15:17:23 crc kubenswrapper[4786]: I0313 15:17:23.595743 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zxmjf" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:17:23 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:17:23 crc kubenswrapper[4786]: >
Mar 13 15:17:25 crc kubenswrapper[4786]: I0313 15:17:25.967422 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd" event={"ID":"7a77aa5e-e467-4edf-8db1-6961b154d940","Type":"ContainerStarted","Data":"5f3b848cc5ef9e2335493a79564ee1bff15305fa62f490c075ff8e9e02b873e1"}
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.545061 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-p4cvd" podStartSLOduration=6.417498044 podStartE2EDuration="9.545041212s" podCreationTimestamp="2026-03-13 15:17:21 +0000 UTC" firstStartedPulling="2026-03-13 15:17:22.423586945 +0000 UTC m=+872.586798756" lastFinishedPulling="2026-03-13 15:17:25.551130103 +0000 UTC m=+875.714341924" observedRunningTime="2026-03-13 15:17:25.990335668 +0000 UTC m=+876.153547539" watchObservedRunningTime="2026-03-13 15:17:30.545041212 +0000 UTC m=+880.708253023"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.549029 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.550190 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.553803 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tnctm"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.554083 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.583429 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.584808 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.585638 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.587498 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33d6a569-116c-4200-b6c4-24c75bfbda77-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.587539 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfln6\" (UniqueName: \"kubernetes.io/projected/33d6a569-116c-4200-b6c4-24c75bfbda77-kube-api-access-dfln6\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.594494 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.604379 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5hfgg"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.605361 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.663061 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.663691 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.666005 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.666208 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rggqr"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.667333 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.687513 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688444 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tsx\" (UniqueName: \"kubernetes.io/projected/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-kube-api-access-m9tsx\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688505 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e993546a-4062-41d9-870c-36f2a467be39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688532 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlq75\" (UniqueName: \"kubernetes.io/projected/9d81cf1e-8913-4975-b2f6-ce71220f51ad-kube-api-access-rlq75\") pod \"nmstate-metrics-9b8c8685d-zxmxw\" (UID: \"9d81cf1e-8913-4975-b2f6-ce71220f51ad\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688570 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-ovs-socket\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688604 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33d6a569-116c-4200-b6c4-24c75bfbda77-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688628 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfln6\" (UniqueName: \"kubernetes.io/projected/33d6a569-116c-4200-b6c4-24c75bfbda77-kube-api-access-dfln6\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-nmstate-lock\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688674 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstvp\" (UniqueName: \"kubernetes.io/projected/e993546a-4062-41d9-870c-36f2a467be39-kube-api-access-sstvp\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688690 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-dbus-socket\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: E0313 15:17:30.688758 4786 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.688804 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e993546a-4062-41d9-870c-36f2a467be39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: E0313 15:17:30.688829 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d6a569-116c-4200-b6c4-24c75bfbda77-tls-key-pair podName:33d6a569-116c-4200-b6c4-24c75bfbda77 nodeName:}" failed. No retries permitted until 2026-03-13 15:17:31.188807397 +0000 UTC m=+881.352019218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/33d6a569-116c-4200-b6c4-24c75bfbda77-tls-key-pair") pod "nmstate-webhook-5f558f5558-mq9sp" (UID: "33d6a569-116c-4200-b6c4-24c75bfbda77") : secret "openshift-nmstate-webhook" not found
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.734878 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfln6\" (UniqueName: \"kubernetes.io/projected/33d6a569-116c-4200-b6c4-24c75bfbda77-kube-api-access-dfln6\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.789793 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-nmstate-lock\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.789869 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-dbus-socket\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.789896 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstvp\" (UniqueName: \"kubernetes.io/projected/e993546a-4062-41d9-870c-36f2a467be39-kube-api-access-sstvp\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.789942 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e993546a-4062-41d9-870c-36f2a467be39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.789977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tsx\" (UniqueName: \"kubernetes.io/projected/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-kube-api-access-m9tsx\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.790002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e993546a-4062-41d9-870c-36f2a467be39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.790026 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlq75\" (UniqueName: \"kubernetes.io/projected/9d81cf1e-8913-4975-b2f6-ce71220f51ad-kube-api-access-rlq75\") pod \"nmstate-metrics-9b8c8685d-zxmxw\" (UID: \"9d81cf1e-8913-4975-b2f6-ce71220f51ad\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.790080 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-ovs-socket\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.790139 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-nmstate-lock\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.790159 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-ovs-socket\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: E0313 15:17:30.790222 4786 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 13 15:17:30 crc kubenswrapper[4786]: E0313 15:17:30.790275 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e993546a-4062-41d9-870c-36f2a467be39-plugin-serving-cert podName:e993546a-4062-41d9-870c-36f2a467be39 nodeName:}" failed. No retries permitted until 2026-03-13 15:17:31.290259172 +0000 UTC m=+881.453471053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e993546a-4062-41d9-870c-36f2a467be39-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-g5bjq" (UID: "e993546a-4062-41d9-870c-36f2a467be39") : secret "plugin-serving-cert" not found
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.790452 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-dbus-socket\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.791313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e993546a-4062-41d9-870c-36f2a467be39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.812591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9tsx\" (UniqueName: \"kubernetes.io/projected/0bd50cd9-6517-4b70-8bf3-8e3469793cd3-kube-api-access-m9tsx\") pod \"nmstate-handler-5hfgg\" (UID: \"0bd50cd9-6517-4b70-8bf3-8e3469793cd3\") " pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.813822 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlq75\" (UniqueName: \"kubernetes.io/projected/9d81cf1e-8913-4975-b2f6-ce71220f51ad-kube-api-access-rlq75\") pod \"nmstate-metrics-9b8c8685d-zxmxw\" (UID: \"9d81cf1e-8913-4975-b2f6-ce71220f51ad\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.819932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstvp\" (UniqueName: \"kubernetes.io/projected/e993546a-4062-41d9-870c-36f2a467be39-kube-api-access-sstvp\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.873788 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6884d88fb8-qxvw4"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.874416 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.898082 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6884d88fb8-qxvw4"]
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.902322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.920619 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993486 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-trusted-ca-bundle\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/895be3c4-9809-48be-8d08-10547582cde2-console-serving-cert\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993611 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/895be3c4-9809-48be-8d08-10547582cde2-console-oauth-config\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993654 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-console-config\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993701 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgkt\" (UniqueName: \"kubernetes.io/projected/895be3c4-9809-48be-8d08-10547582cde2-kube-api-access-brgkt\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993747 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-service-ca\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:30 crc kubenswrapper[4786]: I0313 15:17:30.993768 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-oauth-serving-cert\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.097972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-console-config\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.098324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brgkt\" (UniqueName: \"kubernetes.io/projected/895be3c4-9809-48be-8d08-10547582cde2-kube-api-access-brgkt\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.098366 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-service-ca\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.098386 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-oauth-serving-cert\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.098417 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-trusted-ca-bundle\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.098458 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/895be3c4-9809-48be-8d08-10547582cde2-console-serving-cert\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.098486 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/895be3c4-9809-48be-8d08-10547582cde2-console-oauth-config\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.100537 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-console-config\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.101459 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-service-ca\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.102186 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-trusted-ca-bundle\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.102825 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/895be3c4-9809-48be-8d08-10547582cde2-oauth-serving-cert\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.106833 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/895be3c4-9809-48be-8d08-10547582cde2-console-serving-cert\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.106829 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/895be3c4-9809-48be-8d08-10547582cde2-console-oauth-config\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.116633 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brgkt\" (UniqueName: \"kubernetes.io/projected/895be3c4-9809-48be-8d08-10547582cde2-kube-api-access-brgkt\") pod \"console-6884d88fb8-qxvw4\" (UID: \"895be3c4-9809-48be-8d08-10547582cde2\") " pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.183839 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw"] Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.190151 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6884d88fb8-qxvw4" Mar 13 15:17:31 crc kubenswrapper[4786]: W0313 15:17:31.190882 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d81cf1e_8913_4975_b2f6_ce71220f51ad.slice/crio-d5fe75a1084e8eb42850473bdc3182fae878bf562ac496818460afbe5d54ddfe WatchSource:0}: Error finding container d5fe75a1084e8eb42850473bdc3182fae878bf562ac496818460afbe5d54ddfe: Status 404 returned error can't find the container with id d5fe75a1084e8eb42850473bdc3182fae878bf562ac496818460afbe5d54ddfe Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.199235 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33d6a569-116c-4200-b6c4-24c75bfbda77-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.202405 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/33d6a569-116c-4200-b6c4-24c75bfbda77-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mq9sp\" (UID: \"33d6a569-116c-4200-b6c4-24c75bfbda77\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.300797 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e993546a-4062-41d9-870c-36f2a467be39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.303832 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e993546a-4062-41d9-870c-36f2a467be39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-g5bjq\" (UID: \"e993546a-4062-41d9-870c-36f2a467be39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.355634 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6884d88fb8-qxvw4"] Mar 13 15:17:31 crc kubenswrapper[4786]: W0313 15:17:31.360185 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod895be3c4_9809_48be_8d08_10547582cde2.slice/crio-d90104af9709552f810dda74ab29fcec7add1646b5717d03a3a5764b741791a9 WatchSource:0}: Error finding container d90104af9709552f810dda74ab29fcec7add1646b5717d03a3a5764b741791a9: Status 404 returned error can't find the container with id d90104af9709552f810dda74ab29fcec7add1646b5717d03a3a5764b741791a9 Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.467173 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.575817 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq" Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.745329 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq"] Mar 13 15:17:31 crc kubenswrapper[4786]: W0313 15:17:31.883398 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d6a569_116c_4200_b6c4_24c75bfbda77.slice/crio-7838e3a0dbe909e65e6df39d9ad2279e8f595efd28a4fa4053459c8dfbe8cf84 WatchSource:0}: Error finding container 7838e3a0dbe909e65e6df39d9ad2279e8f595efd28a4fa4053459c8dfbe8cf84: Status 404 returned error can't find the container with id 7838e3a0dbe909e65e6df39d9ad2279e8f595efd28a4fa4053459c8dfbe8cf84 Mar 13 15:17:31 crc kubenswrapper[4786]: I0313 15:17:31.903392 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"] Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.020208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" event={"ID":"33d6a569-116c-4200-b6c4-24c75bfbda77","Type":"ContainerStarted","Data":"7838e3a0dbe909e65e6df39d9ad2279e8f595efd28a4fa4053459c8dfbe8cf84"} Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.021888 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5hfgg" event={"ID":"0bd50cd9-6517-4b70-8bf3-8e3469793cd3","Type":"ContainerStarted","Data":"7f4bee41da91067677f71f24615903c23a2514a9a39684e0f5ac9b48803987fa"} Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.023160 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw" 
event={"ID":"9d81cf1e-8913-4975-b2f6-ce71220f51ad","Type":"ContainerStarted","Data":"d5fe75a1084e8eb42850473bdc3182fae878bf562ac496818460afbe5d54ddfe"} Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.025255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6884d88fb8-qxvw4" event={"ID":"895be3c4-9809-48be-8d08-10547582cde2","Type":"ContainerStarted","Data":"47a9f16f251c170005fc43e89c14c271bce6d41d7d3350149547fa11cb95c2db"} Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.025301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6884d88fb8-qxvw4" event={"ID":"895be3c4-9809-48be-8d08-10547582cde2","Type":"ContainerStarted","Data":"d90104af9709552f810dda74ab29fcec7add1646b5717d03a3a5764b741791a9"} Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.026675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq" event={"ID":"e993546a-4062-41d9-870c-36f2a467be39","Type":"ContainerStarted","Data":"b00f820122e8ce49620a5237a3435b71e772424dbe8d5c839dee86d995445f28"} Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.046681 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6884d88fb8-qxvw4" podStartSLOduration=2.046660919 podStartE2EDuration="2.046660919s" podCreationTimestamp="2026-03-13 15:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:17:32.046562087 +0000 UTC m=+882.209773938" watchObservedRunningTime="2026-03-13 15:17:32.046660919 +0000 UTC m=+882.209872750" Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.610474 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxmjf" Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.657732 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxmjf" Mar 13 15:17:32 crc kubenswrapper[4786]: I0313 15:17:32.845910 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxmjf"] Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.041181 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zxmjf" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="registry-server" containerID="cri-o://f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544" gracePeriod=2 Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.352125 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmjf" Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.442483 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-catalog-content\") pod \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.442592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-utilities\") pod \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.442631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n5dc\" (UniqueName: \"kubernetes.io/projected/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-kube-api-access-8n5dc\") pod \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\" (UID: \"12b9005b-1458-4ac6-84e0-8a3f2a3ff680\") " Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.444370 4786 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-utilities" (OuterVolumeSpecName: "utilities") pod "12b9005b-1458-4ac6-84e0-8a3f2a3ff680" (UID: "12b9005b-1458-4ac6-84e0-8a3f2a3ff680"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.448211 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-kube-api-access-8n5dc" (OuterVolumeSpecName: "kube-api-access-8n5dc") pod "12b9005b-1458-4ac6-84e0-8a3f2a3ff680" (UID: "12b9005b-1458-4ac6-84e0-8a3f2a3ff680"). InnerVolumeSpecName "kube-api-access-8n5dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.543728 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.543770 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n5dc\" (UniqueName: \"kubernetes.io/projected/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-kube-api-access-8n5dc\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.581431 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12b9005b-1458-4ac6-84e0-8a3f2a3ff680" (UID: "12b9005b-1458-4ac6-84e0-8a3f2a3ff680"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:17:34 crc kubenswrapper[4786]: I0313 15:17:34.645388 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12b9005b-1458-4ac6-84e0-8a3f2a3ff680-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.048244 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5hfgg" event={"ID":"0bd50cd9-6517-4b70-8bf3-8e3469793cd3","Type":"ContainerStarted","Data":"b038d52f595d5b634627c8e7f56e0df8ec896e249e7f782ced1179c1957d923c"} Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.049060 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5hfgg" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.051843 4786 generic.go:334] "Generic (PLEG): container finished" podID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerID="f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544" exitCode=0 Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.051918 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zxmjf" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.051912 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerDied","Data":"f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544"} Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.052030 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxmjf" event={"ID":"12b9005b-1458-4ac6-84e0-8a3f2a3ff680","Type":"ContainerDied","Data":"05d05a846a8e59af288f8a654272307c9bf677c265770ac414cc5f2f6b5ec5f9"} Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.052053 4786 scope.go:117] "RemoveContainer" containerID="f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.053562 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw" event={"ID":"9d81cf1e-8913-4975-b2f6-ce71220f51ad","Type":"ContainerStarted","Data":"771e97d894c20e00bab918ef1c0ce651d451be4edbd98a4d2bf06347a88e8464"} Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.058883 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq" event={"ID":"e993546a-4062-41d9-870c-36f2a467be39","Type":"ContainerStarted","Data":"feb2e7fa8bbab21770bda4cacc87c3d482f915368eda29affbc9f63c16f4ba55"} Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.060471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" event={"ID":"33d6a569-116c-4200-b6c4-24c75bfbda77","Type":"ContainerStarted","Data":"32a40ad8c9265bfdc8b2daee46f604ee5472cbc642b2e260b77a860228474207"} Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.061315 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.067812 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5hfgg" podStartSLOduration=1.86987247 podStartE2EDuration="5.067793028s" podCreationTimestamp="2026-03-13 15:17:30 +0000 UTC" firstStartedPulling="2026-03-13 15:17:31.011961738 +0000 UTC m=+881.175173549" lastFinishedPulling="2026-03-13 15:17:34.209882296 +0000 UTC m=+884.373094107" observedRunningTime="2026-03-13 15:17:35.06629853 +0000 UTC m=+885.229510341" watchObservedRunningTime="2026-03-13 15:17:35.067793028 +0000 UTC m=+885.231004849" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.069871 4786 scope.go:117] "RemoveContainer" containerID="3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.083188 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-g5bjq" podStartSLOduration=2.647693356 podStartE2EDuration="5.083167376s" podCreationTimestamp="2026-03-13 15:17:30 +0000 UTC" firstStartedPulling="2026-03-13 15:17:31.750100331 +0000 UTC m=+881.913312142" lastFinishedPulling="2026-03-13 15:17:34.185574351 +0000 UTC m=+884.348786162" observedRunningTime="2026-03-13 15:17:35.079897294 +0000 UTC m=+885.243109145" watchObservedRunningTime="2026-03-13 15:17:35.083167376 +0000 UTC m=+885.246379187" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.109586 4786 scope.go:117] "RemoveContainer" containerID="97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.112152 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp" podStartSLOduration=2.8262387110000002 podStartE2EDuration="5.112131919s" podCreationTimestamp="2026-03-13 15:17:30 
+0000 UTC" firstStartedPulling="2026-03-13 15:17:31.899680023 +0000 UTC m=+882.062891844" lastFinishedPulling="2026-03-13 15:17:34.185573231 +0000 UTC m=+884.348785052" observedRunningTime="2026-03-13 15:17:35.101617523 +0000 UTC m=+885.264829334" watchObservedRunningTime="2026-03-13 15:17:35.112131919 +0000 UTC m=+885.275343750" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.128439 4786 scope.go:117] "RemoveContainer" containerID="f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544" Mar 13 15:17:35 crc kubenswrapper[4786]: E0313 15:17:35.129047 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544\": container with ID starting with f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544 not found: ID does not exist" containerID="f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.129077 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544"} err="failed to get container status \"f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544\": rpc error: code = NotFound desc = could not find container \"f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544\": container with ID starting with f90dcacd519c11f9da9d045f4e2aea5adff2ea195144b5aa47b7db8987d07544 not found: ID does not exist" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.129098 4786 scope.go:117] "RemoveContainer" containerID="3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647" Mar 13 15:17:35 crc kubenswrapper[4786]: E0313 15:17:35.129461 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647\": container with ID starting with 3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647 not found: ID does not exist" containerID="3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.129484 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647"} err="failed to get container status \"3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647\": rpc error: code = NotFound desc = could not find container \"3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647\": container with ID starting with 3c37a2742d9d6b150d84294f626bf02fca73a80265ab9e9188e274af3dbdb647 not found: ID does not exist" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.129496 4786 scope.go:117] "RemoveContainer" containerID="97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9" Mar 13 15:17:35 crc kubenswrapper[4786]: E0313 15:17:35.130227 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9\": container with ID starting with 97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9 not found: ID does not exist" containerID="97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9" Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.130247 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9"} err="failed to get container status \"97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9\": rpc error: code = NotFound desc = could not find container \"97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9\": container with ID 
starting with 97caa2504a11d0448e264dca5790f69257a1937dae4586245b8d6ffcc6f203a9 not found: ID does not exist"
Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.132332 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxmjf"]
Mar 13 15:17:35 crc kubenswrapper[4786]: I0313 15:17:35.137794 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zxmjf"]
Mar 13 15:17:36 crc kubenswrapper[4786]: I0313 15:17:36.558882 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" path="/var/lib/kubelet/pods/12b9005b-1458-4ac6-84e0-8a3f2a3ff680/volumes"
Mar 13 15:17:37 crc kubenswrapper[4786]: I0313 15:17:37.881781 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:17:37 crc kubenswrapper[4786]: I0313 15:17:37.882217 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:17:38 crc kubenswrapper[4786]: I0313 15:17:38.088521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw" event={"ID":"9d81cf1e-8913-4975-b2f6-ce71220f51ad","Type":"ContainerStarted","Data":"5cdea7c0f55060d7907c17b22a8639e869cb6ecd79a671d1307dbb62ef8a174c"}
Mar 13 15:17:38 crc kubenswrapper[4786]: I0313 15:17:38.121965 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-zxmxw" podStartSLOduration=1.987381691 podStartE2EDuration="8.12193664s" podCreationTimestamp="2026-03-13 15:17:30 +0000 UTC" firstStartedPulling="2026-03-13 15:17:31.197108549 +0000 UTC m=+881.360320360" lastFinishedPulling="2026-03-13 15:17:37.331663478 +0000 UTC m=+887.494875309" observedRunningTime="2026-03-13 15:17:38.115290512 +0000 UTC m=+888.278502323" watchObservedRunningTime="2026-03-13 15:17:38.12193664 +0000 UTC m=+888.285148461"
Mar 13 15:17:40 crc kubenswrapper[4786]: I0313 15:17:40.960404 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5hfgg"
Mar 13 15:17:41 crc kubenswrapper[4786]: I0313 15:17:41.192283 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:41 crc kubenswrapper[4786]: I0313 15:17:41.193106 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:41 crc kubenswrapper[4786]: I0313 15:17:41.195684 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:42 crc kubenswrapper[4786]: I0313 15:17:42.121115 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6884d88fb8-qxvw4"
Mar 13 15:17:42 crc kubenswrapper[4786]: I0313 15:17:42.180974 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2tqr8"]
Mar 13 15:17:51 crc kubenswrapper[4786]: I0313 15:17:51.475563 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mq9sp"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.138018 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556918-kdwgv"]
Mar 13 15:18:00 crc kubenswrapper[4786]: E0313 15:18:00.139011 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="extract-utilities"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.139063 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="extract-utilities"
Mar 13 15:18:00 crc kubenswrapper[4786]: E0313 15:18:00.139075 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="extract-content"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.139081 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="extract-content"
Mar 13 15:18:00 crc kubenswrapper[4786]: E0313 15:18:00.139092 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="registry-server"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.139098 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="registry-server"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.139188 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b9005b-1458-4ac6-84e0-8a3f2a3ff680" containerName="registry-server"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.139647 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.142833 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.146813 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-kdwgv"]
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.148223 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.148640 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.205223 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl928\" (UniqueName: \"kubernetes.io/projected/7b51b465-3b1c-48de-8f8c-672f96eb07c9-kube-api-access-gl928\") pod \"auto-csr-approver-29556918-kdwgv\" (UID: \"7b51b465-3b1c-48de-8f8c-672f96eb07c9\") " pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.306151 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl928\" (UniqueName: \"kubernetes.io/projected/7b51b465-3b1c-48de-8f8c-672f96eb07c9-kube-api-access-gl928\") pod \"auto-csr-approver-29556918-kdwgv\" (UID: \"7b51b465-3b1c-48de-8f8c-672f96eb07c9\") " pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.325375 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl928\" (UniqueName: \"kubernetes.io/projected/7b51b465-3b1c-48de-8f8c-672f96eb07c9-kube-api-access-gl928\") pod \"auto-csr-approver-29556918-kdwgv\" (UID: \"7b51b465-3b1c-48de-8f8c-672f96eb07c9\") " pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.514249 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:00 crc kubenswrapper[4786]: I0313 15:18:00.943444 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-kdwgv"]
Mar 13 15:18:01 crc kubenswrapper[4786]: I0313 15:18:01.233791 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-kdwgv" event={"ID":"7b51b465-3b1c-48de-8f8c-672f96eb07c9","Type":"ContainerStarted","Data":"df4767ce65199873ca13eb7721c1c6d844adee3da2e0956e2b3fe4510d285542"}
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.251512 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b51b465-3b1c-48de-8f8c-672f96eb07c9" containerID="120ce3515ded4e6a9af9413eac717dc3b1eb01ded30c3453e2ae4e06dc5b62ba" exitCode=0
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.251576 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-kdwgv" event={"ID":"7b51b465-3b1c-48de-8f8c-672f96eb07c9","Type":"ContainerDied","Data":"120ce3515ded4e6a9af9413eac717dc3b1eb01ded30c3453e2ae4e06dc5b62ba"}
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.507258 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"]
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.508457 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.510374 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.517831 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"]
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.646459 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.646592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.646625 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsqzh\" (UniqueName: \"kubernetes.io/projected/cdd29b42-059a-45b5-9ae0-9d1b25879f87-kube-api-access-hsqzh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.748103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.748145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsqzh\" (UniqueName: \"kubernetes.io/projected/cdd29b42-059a-45b5-9ae0-9d1b25879f87-kube-api-access-hsqzh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.748188 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.748502 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.748552 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.770919 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsqzh\" (UniqueName: \"kubernetes.io/projected/cdd29b42-059a-45b5-9ae0-9d1b25879f87-kube-api-access-hsqzh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:03 crc kubenswrapper[4786]: I0313 15:18:03.825723 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:04 crc kubenswrapper[4786]: I0313 15:18:04.209795 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"]
Mar 13 15:18:04 crc kubenswrapper[4786]: I0313 15:18:04.259899 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh" event={"ID":"cdd29b42-059a-45b5-9ae0-9d1b25879f87","Type":"ContainerStarted","Data":"798b3073ad1cc38be69ed44a665b3bcfb2c821ea63bbe84dd45c15624d3ea1fa"}
Mar 13 15:18:04 crc kubenswrapper[4786]: I0313 15:18:04.464793 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:04 crc kubenswrapper[4786]: I0313 15:18:04.660984 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl928\" (UniqueName: \"kubernetes.io/projected/7b51b465-3b1c-48de-8f8c-672f96eb07c9-kube-api-access-gl928\") pod \"7b51b465-3b1c-48de-8f8c-672f96eb07c9\" (UID: \"7b51b465-3b1c-48de-8f8c-672f96eb07c9\") "
Mar 13 15:18:04 crc kubenswrapper[4786]: I0313 15:18:04.667553 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b51b465-3b1c-48de-8f8c-672f96eb07c9-kube-api-access-gl928" (OuterVolumeSpecName: "kube-api-access-gl928") pod "7b51b465-3b1c-48de-8f8c-672f96eb07c9" (UID: "7b51b465-3b1c-48de-8f8c-672f96eb07c9"). InnerVolumeSpecName "kube-api-access-gl928". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:18:04 crc kubenswrapper[4786]: I0313 15:18:04.762676 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl928\" (UniqueName: \"kubernetes.io/projected/7b51b465-3b1c-48de-8f8c-672f96eb07c9-kube-api-access-gl928\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.268770 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerID="45d6ce4b45ea9f0a05bb1efdad6f55e1d9f1f8f2ee669daa7ce36a87c0147fb6" exitCode=0
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.268869 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh" event={"ID":"cdd29b42-059a-45b5-9ae0-9d1b25879f87","Type":"ContainerDied","Data":"45d6ce4b45ea9f0a05bb1efdad6f55e1d9f1f8f2ee669daa7ce36a87c0147fb6"}
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.270344 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-kdwgv" event={"ID":"7b51b465-3b1c-48de-8f8c-672f96eb07c9","Type":"ContainerDied","Data":"df4767ce65199873ca13eb7721c1c6d844adee3da2e0956e2b3fe4510d285542"}
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.270364 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4767ce65199873ca13eb7721c1c6d844adee3da2e0956e2b3fe4510d285542"
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.270380 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-kdwgv"
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.520247 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-zp2mk"]
Mar 13 15:18:05 crc kubenswrapper[4786]: I0313 15:18:05.529521 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-zp2mk"]
Mar 13 15:18:06 crc kubenswrapper[4786]: I0313 15:18:06.560082 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54310c14-883f-4071-a914-907a4749d55d" path="/var/lib/kubelet/pods/54310c14-883f-4071-a914-907a4749d55d/volumes"
Mar 13 15:18:07 crc kubenswrapper[4786]: I0313 15:18:07.226104 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-2tqr8" podUID="0c6ae29c-e743-4193-bce1-22b4c5732f45" containerName="console" containerID="cri-o://a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241" gracePeriod=15
Mar 13 15:18:07 crc kubenswrapper[4786]: I0313 15:18:07.868462 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:18:07 crc kubenswrapper[4786]: I0313 15:18:07.868799 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:18:07 crc kubenswrapper[4786]: I0313 15:18:07.868839 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:18:07 crc kubenswrapper[4786]: I0313 15:18:07.869492 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dad5dab593ccac2d22182d28c8abfa4af5554be94f9545ed96143d1052cb64d4"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:18:07 crc kubenswrapper[4786]: I0313 15:18:07.869544 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://dad5dab593ccac2d22182d28c8abfa4af5554be94f9545ed96143d1052cb64d4" gracePeriod=600
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.145256 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2tqr8_0c6ae29c-e743-4193-bce1-22b4c5732f45/console/0.log"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.145750 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.241227 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-trusted-ca-bundle\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.241273 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-serving-cert\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.241291 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-config\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.241308 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-oauth-serving-cert\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.241337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-oauth-config\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.241919 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpvc\" (UniqueName: \"kubernetes.io/projected/0c6ae29c-e743-4193-bce1-22b4c5732f45-kube-api-access-4hpvc\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242066 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-service-ca\") pod \"0c6ae29c-e743-4193-bce1-22b4c5732f45\" (UID: \"0c6ae29c-e743-4193-bce1-22b4c5732f45\") "
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242191 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242261 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-config" (OuterVolumeSpecName: "console-config") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242476 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-service-ca" (OuterVolumeSpecName: "service-ca") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242573 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242743 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242788 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242814 4786 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.242837 4786 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0c6ae29c-e743-4193-bce1-22b4c5732f45-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.246281 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.246445 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.254004 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6ae29c-e743-4193-bce1-22b4c5732f45-kube-api-access-4hpvc" (OuterVolumeSpecName: "kube-api-access-4hpvc") pod "0c6ae29c-e743-4193-bce1-22b4c5732f45" (UID: "0c6ae29c-e743-4193-bce1-22b4c5732f45"). InnerVolumeSpecName "kube-api-access-4hpvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.289996 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerID="2320fca42242c154c5f36c97ebb994c4678e23c98325896af11a53c788e67384" exitCode=0
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.290078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh" event={"ID":"cdd29b42-059a-45b5-9ae0-9d1b25879f87","Type":"ContainerDied","Data":"2320fca42242c154c5f36c97ebb994c4678e23c98325896af11a53c788e67384"}
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.291802 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2tqr8_0c6ae29c-e743-4193-bce1-22b4c5732f45/console/0.log"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.291829 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c6ae29c-e743-4193-bce1-22b4c5732f45" containerID="a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241" exitCode=2
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.291901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2tqr8" event={"ID":"0c6ae29c-e743-4193-bce1-22b4c5732f45","Type":"ContainerDied","Data":"a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241"}
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.291942 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2tqr8" event={"ID":"0c6ae29c-e743-4193-bce1-22b4c5732f45","Type":"ContainerDied","Data":"e3992521c77cdd988b51c4b4bae2b756da30ca49d8714baed94fdade6bd8c8c9"}
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.291941 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2tqr8"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.291969 4786 scope.go:117] "RemoveContainer" containerID="a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.296068 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="dad5dab593ccac2d22182d28c8abfa4af5554be94f9545ed96143d1052cb64d4" exitCode=0
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.296128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"dad5dab593ccac2d22182d28c8abfa4af5554be94f9545ed96143d1052cb64d4"}
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.296200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"a926826d2fa94d740a03a1a08b36f6e48f1ce5a8cc37acd7e0fef98af56e6473"}
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.318814 4786 scope.go:117] "RemoveContainer" containerID="a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241"
Mar 13 15:18:08 crc kubenswrapper[4786]: E0313 15:18:08.319313 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241\": container with ID starting with a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241 not found: ID does not exist" containerID="a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.319359 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241"} err="failed to get container status \"a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241\": rpc error: code = NotFound desc = could not find container \"a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241\": container with ID starting with a96d9477f31365d1b187924becd6e33c06d7b56185365280bc050875b7c84241 not found: ID does not exist"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.319382 4786 scope.go:117] "RemoveContainer" containerID="62d0b2ce1eb336ac4fba66f12731e23d24aaf160b69d1b42e5a0e33b26f90040"
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.345728 4786 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.345765 4786 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0c6ae29c-e743-4193-bce1-22b4c5732f45-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.345778 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpvc\" (UniqueName: \"kubernetes.io/projected/0c6ae29c-e743-4193-bce1-22b4c5732f45-kube-api-access-4hpvc\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.353179 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2tqr8"]
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.359100 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2tqr8"]
Mar 13 15:18:08 crc kubenswrapper[4786]: I0313 15:18:08.559067 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6ae29c-e743-4193-bce1-22b4c5732f45" path="/var/lib/kubelet/pods/0c6ae29c-e743-4193-bce1-22b4c5732f45/volumes"
Mar 13 15:18:09 crc kubenswrapper[4786]: I0313 15:18:09.309419 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerID="adb2fcd6b37d493934f23a6bf879803d1b19bc19321fa313d85ea5195eeb2dce" exitCode=0
Mar 13 15:18:09 crc kubenswrapper[4786]: I0313 15:18:09.309485 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh" event={"ID":"cdd29b42-059a-45b5-9ae0-9d1b25879f87","Type":"ContainerDied","Data":"adb2fcd6b37d493934f23a6bf879803d1b19bc19321fa313d85ea5195eeb2dce"}
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.599154 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.674409 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-util\") pod \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") "
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.674524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsqzh\" (UniqueName: \"kubernetes.io/projected/cdd29b42-059a-45b5-9ae0-9d1b25879f87-kube-api-access-hsqzh\") pod \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") "
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.674570 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-bundle\") pod \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\" (UID: \"cdd29b42-059a-45b5-9ae0-9d1b25879f87\") "
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.675916 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-bundle" (OuterVolumeSpecName: "bundle") pod "cdd29b42-059a-45b5-9ae0-9d1b25879f87" (UID: "cdd29b42-059a-45b5-9ae0-9d1b25879f87"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.680209 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd29b42-059a-45b5-9ae0-9d1b25879f87-kube-api-access-hsqzh" (OuterVolumeSpecName: "kube-api-access-hsqzh") pod "cdd29b42-059a-45b5-9ae0-9d1b25879f87" (UID: "cdd29b42-059a-45b5-9ae0-9d1b25879f87"). InnerVolumeSpecName "kube-api-access-hsqzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.689498 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-util" (OuterVolumeSpecName: "util") pod "cdd29b42-059a-45b5-9ae0-9d1b25879f87" (UID: "cdd29b42-059a-45b5-9ae0-9d1b25879f87"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.789404 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsqzh\" (UniqueName: \"kubernetes.io/projected/cdd29b42-059a-45b5-9ae0-9d1b25879f87-kube-api-access-hsqzh\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.789445 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:10 crc kubenswrapper[4786]: I0313 15:18:10.789458 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdd29b42-059a-45b5-9ae0-9d1b25879f87-util\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:11 crc kubenswrapper[4786]: I0313 15:18:11.326649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh" event={"ID":"cdd29b42-059a-45b5-9ae0-9d1b25879f87","Type":"ContainerDied","Data":"798b3073ad1cc38be69ed44a665b3bcfb2c821ea63bbe84dd45c15624d3ea1fa"}
Mar 13 15:18:11 crc kubenswrapper[4786]: I0313 15:18:11.326691 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798b3073ad1cc38be69ed44a665b3bcfb2c821ea63bbe84dd45c15624d3ea1fa"
Mar 13 15:18:11 crc kubenswrapper[4786]: I0313 15:18:11.326736 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.910574 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw"]
Mar 13 15:18:18 crc kubenswrapper[4786]: E0313 15:18:18.911411 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="pull"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911426 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="pull"
Mar 13 15:18:18 crc kubenswrapper[4786]: E0313 15:18:18.911449 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="util"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911456 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="util"
Mar 13 15:18:18 crc kubenswrapper[4786]: E0313 15:18:18.911471 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="extract"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911489 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="extract"
Mar 13 15:18:18 crc kubenswrapper[4786]: E0313 15:18:18.911500 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b51b465-3b1c-48de-8f8c-672f96eb07c9" containerName="oc"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911506 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b51b465-3b1c-48de-8f8c-672f96eb07c9" containerName="oc"
Mar 13 15:18:18 crc kubenswrapper[4786]: E0313 15:18:18.911516 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6ae29c-e743-4193-bce1-22b4c5732f45" containerName="console"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911525 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6ae29c-e743-4193-bce1-22b4c5732f45" containerName="console"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911640 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6ae29c-e743-4193-bce1-22b4c5732f45" containerName="console"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911655 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd29b42-059a-45b5-9ae0-9d1b25879f87" containerName="extract"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.911671 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b51b465-3b1c-48de-8f8c-672f96eb07c9" containerName="oc"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.912142 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.919195 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.919376 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fs9gf"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.919473 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.920048 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.931693 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw"]
Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.934091 4786 
reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.987416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/76502e53-99f8-40ce-a930-f34b84f5718f-webhook-cert\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.987497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/76502e53-99f8-40ce-a930-f34b84f5718f-apiservice-cert\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:18 crc kubenswrapper[4786]: I0313 15:18:18.987601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twdkz\" (UniqueName: \"kubernetes.io/projected/76502e53-99f8-40ce-a930-f34b84f5718f-kube-api-access-twdkz\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.088915 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/76502e53-99f8-40ce-a930-f34b84f5718f-webhook-cert\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 
15:18:19.088972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/76502e53-99f8-40ce-a930-f34b84f5718f-apiservice-cert\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.089020 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twdkz\" (UniqueName: \"kubernetes.io/projected/76502e53-99f8-40ce-a930-f34b84f5718f-kube-api-access-twdkz\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.108757 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/76502e53-99f8-40ce-a930-f34b84f5718f-webhook-cert\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.108791 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twdkz\" (UniqueName: \"kubernetes.io/projected/76502e53-99f8-40ce-a930-f34b84f5718f-kube-api-access-twdkz\") pod \"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.110252 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/76502e53-99f8-40ce-a930-f34b84f5718f-apiservice-cert\") pod 
\"metallb-operator-controller-manager-6db9d94745-cm6rw\" (UID: \"76502e53-99f8-40ce-a930-f34b84f5718f\") " pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.139020 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-55488488b6-jbcct"] Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.140211 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.141580 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bbd64" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.142149 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.145867 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.153507 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55488488b6-jbcct"] Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.235446 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.290982 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjb6n\" (UniqueName: \"kubernetes.io/projected/180f6fad-fff1-4831-8332-027c32b35d7d-kube-api-access-vjb6n\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.291091 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180f6fad-fff1-4831-8332-027c32b35d7d-apiservice-cert\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.291129 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180f6fad-fff1-4831-8332-027c32b35d7d-webhook-cert\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.391738 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjb6n\" (UniqueName: \"kubernetes.io/projected/180f6fad-fff1-4831-8332-027c32b35d7d-kube-api-access-vjb6n\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.391801 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180f6fad-fff1-4831-8332-027c32b35d7d-apiservice-cert\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.391821 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180f6fad-fff1-4831-8332-027c32b35d7d-webhook-cert\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.396543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/180f6fad-fff1-4831-8332-027c32b35d7d-apiservice-cert\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.396543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/180f6fad-fff1-4831-8332-027c32b35d7d-webhook-cert\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: \"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.422960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjb6n\" (UniqueName: \"kubernetes.io/projected/180f6fad-fff1-4831-8332-027c32b35d7d-kube-api-access-vjb6n\") pod \"metallb-operator-webhook-server-55488488b6-jbcct\" (UID: 
\"180f6fad-fff1-4831-8332-027c32b35d7d\") " pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.456873 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.706481 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw"] Mar 13 15:18:19 crc kubenswrapper[4786]: I0313 15:18:19.929537 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55488488b6-jbcct"] Mar 13 15:18:20 crc kubenswrapper[4786]: I0313 15:18:20.388898 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" event={"ID":"76502e53-99f8-40ce-a930-f34b84f5718f","Type":"ContainerStarted","Data":"bde4e7883570d4115cdb0075072eb5069fda44e3eb2a4fd4dda54cebe247e31d"} Mar 13 15:18:20 crc kubenswrapper[4786]: I0313 15:18:20.389943 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" event={"ID":"180f6fad-fff1-4831-8332-027c32b35d7d","Type":"ContainerStarted","Data":"d7ca12411c63edf9e353d7af3bfc4314c7bb0a5cacfc20ab068686d5d74c8d90"} Mar 13 15:18:24 crc kubenswrapper[4786]: I0313 15:18:24.418296 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" event={"ID":"76502e53-99f8-40ce-a930-f34b84f5718f","Type":"ContainerStarted","Data":"41a858317d093ab480499d168ef16a7608876e3d0ca9c5797f69081756537c65"} Mar 13 15:18:24 crc kubenswrapper[4786]: I0313 15:18:24.418844 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:24 crc kubenswrapper[4786]: I0313 
15:18:24.441226 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" podStartSLOduration=2.604601604 podStartE2EDuration="6.441210491s" podCreationTimestamp="2026-03-13 15:18:18 +0000 UTC" firstStartedPulling="2026-03-13 15:18:19.722027259 +0000 UTC m=+929.885239070" lastFinishedPulling="2026-03-13 15:18:23.558636146 +0000 UTC m=+933.721847957" observedRunningTime="2026-03-13 15:18:24.440257247 +0000 UTC m=+934.603469068" watchObservedRunningTime="2026-03-13 15:18:24.441210491 +0000 UTC m=+934.604422302" Mar 13 15:18:25 crc kubenswrapper[4786]: I0313 15:18:25.426296 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" event={"ID":"180f6fad-fff1-4831-8332-027c32b35d7d","Type":"ContainerStarted","Data":"181534b13a8f2cce4789d639df8882d3e92905fcbb879d7503bd6abd155c90f6"} Mar 13 15:18:25 crc kubenswrapper[4786]: I0313 15:18:25.445737 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" podStartSLOduration=1.405039148 podStartE2EDuration="6.445715819s" podCreationTimestamp="2026-03-13 15:18:19 +0000 UTC" firstStartedPulling="2026-03-13 15:18:19.931240779 +0000 UTC m=+930.094452590" lastFinishedPulling="2026-03-13 15:18:24.97191745 +0000 UTC m=+935.135129261" observedRunningTime="2026-03-13 15:18:25.440384554 +0000 UTC m=+935.603596365" watchObservedRunningTime="2026-03-13 15:18:25.445715819 +0000 UTC m=+935.608927640" Mar 13 15:18:26 crc kubenswrapper[4786]: I0313 15:18:26.431561 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" Mar 13 15:18:39 crc kubenswrapper[4786]: I0313 15:18:39.462948 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-55488488b6-jbcct" 
Mar 13 15:18:52 crc kubenswrapper[4786]: I0313 15:18:52.099344 4786 scope.go:117] "RemoveContainer" containerID="0c586000742afd995ab94f45fac2bafe638c77eb3b70e8f334797b831010644c" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.239070 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6db9d94745-cm6rw" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.871191 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8tcjh"] Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.873923 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.877016 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.877331 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.877516 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-klmqh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.887932 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"] Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.888603 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.894178 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.902191 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"] Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdc20c1e-eff7-4478-a06a-05dacc2f169a-metrics-certs\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914289 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqbqr\" (UniqueName: \"kubernetes.io/projected/bdc20c1e-eff7-4478-a06a-05dacc2f169a-kube-api-access-fqbqr\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914317 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rm77\" (UniqueName: \"kubernetes.io/projected/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-kube-api-access-8rm77\") pod \"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: \"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914353 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: 
\"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914387 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-startup\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914413 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-sockets\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914434 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-conf\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914466 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-metrics\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.914490 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-reloader\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:18:59 crc 
kubenswrapper[4786]: I0313 15:18:59.956889 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4lzrc"] Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.957895 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4lzrc" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.963240 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.963256 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.963357 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nchpn" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.963400 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.970156 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-c4kjj"] Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.971017 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-c4kjj" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.973977 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 15:18:59 crc kubenswrapper[4786]: I0313 15:18:59.984802 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-c4kjj"] Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015269 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-metrics-certs\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727p9\" (UniqueName: \"kubernetes.io/projected/d6114a7a-189a-4212-9669-4addfc43c839-kube-api-access-727p9\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015341 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-startup\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015376 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-sockets\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015403 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-conf\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015433 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d6114a7a-189a-4212-9669-4addfc43c839-metallb-excludel2\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-metrics-certs\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015476 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-metrics\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-cert\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-reloader\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015531 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015551 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdc20c1e-eff7-4478-a06a-05dacc2f169a-metrics-certs\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015566 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mkw\" (UniqueName: \"kubernetes.io/projected/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-kube-api-access-c2mkw\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqbqr\" (UniqueName: \"kubernetes.io/projected/bdc20c1e-eff7-4478-a06a-05dacc2f169a-kube-api-access-fqbqr\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015605 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rm77\" (UniqueName: \"kubernetes.io/projected/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-kube-api-access-8rm77\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: \"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.015648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: \"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.015761 4786 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.015816 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-cert podName:9ebdbb3b-8b33-42de-8150-f13a385b6bb6 nodeName:}" failed. No retries permitted until 2026-03-13 15:19:00.515796588 +0000 UTC m=+970.679008399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-cert") pod "frr-k8s-webhook-server-bcc4b6f68-gmz8b" (UID: "9ebdbb3b-8b33-42de-8150-f13a385b6bb6") : secret "frr-k8s-webhook-server-cert" not found Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.016801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-startup\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.017031 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-sockets\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.017207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-frr-conf\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.017406 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-metrics\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.017594 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bdc20c1e-eff7-4478-a06a-05dacc2f169a-reloader\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh" 
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.024663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdc20c1e-eff7-4478-a06a-05dacc2f169a-metrics-certs\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.034450 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rm77\" (UniqueName: \"kubernetes.io/projected/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-kube-api-access-8rm77\") pod \"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: \"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.038820 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqbqr\" (UniqueName: \"kubernetes.io/projected/bdc20c1e-eff7-4478-a06a-05dacc2f169a-kube-api-access-fqbqr\") pod \"frr-k8s-8tcjh\" (UID: \"bdc20c1e-eff7-4478-a06a-05dacc2f169a\") " pod="metallb-system/frr-k8s-8tcjh"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.116827 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-metrics-certs\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.116918 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727p9\" (UniqueName: \"kubernetes.io/projected/d6114a7a-189a-4212-9669-4addfc43c839-kube-api-access-727p9\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.116966 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d6114a7a-189a-4212-9669-4addfc43c839-metallb-excludel2\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.116993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-metrics-certs\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.117021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-cert\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.117047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.117062 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mkw\" (UniqueName: \"kubernetes.io/projected/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-kube-api-access-c2mkw\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.117962 4786 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.118013 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-metrics-certs podName:a2108fd5-0d1c-4e6c-8b30-adea0a1545ac nodeName:}" failed. No retries permitted until 2026-03-13 15:19:00.617996162 +0000 UTC m=+970.781207973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-metrics-certs") pod "controller-7bb4cc7c98-c4kjj" (UID: "a2108fd5-0d1c-4e6c-8b30-adea0a1545ac") : secret "controller-certs-secret" not found
Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.118434 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.118606 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist podName:d6114a7a-189a-4212-9669-4addfc43c839 nodeName:}" failed. No retries permitted until 2026-03-13 15:19:00.618565446 +0000 UTC m=+970.781777467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist") pod "speaker-4lzrc" (UID: "d6114a7a-189a-4212-9669-4addfc43c839") : secret "metallb-memberlist" not found
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.119000 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d6114a7a-189a-4212-9669-4addfc43c839-metallb-excludel2\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.121394 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-metrics-certs\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.121607 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.131230 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-cert\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.137823 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mkw\" (UniqueName: \"kubernetes.io/projected/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-kube-api-access-c2mkw\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.140143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727p9\" (UniqueName: \"kubernetes.io/projected/d6114a7a-189a-4212-9669-4addfc43c839-kube-api-access-727p9\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.195845 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8tcjh"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.521706 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: \"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.530301 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ebdbb3b-8b33-42de-8150-f13a385b6bb6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gmz8b\" (UID: \"9ebdbb3b-8b33-42de-8150-f13a385b6bb6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.623398 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.623648 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.623700 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-metrics-certs\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: E0313 15:19:00.623782 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist podName:d6114a7a-189a-4212-9669-4addfc43c839 nodeName:}" failed. No retries permitted until 2026-03-13 15:19:01.62375343 +0000 UTC m=+971.786965241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist") pod "speaker-4lzrc" (UID: "d6114a7a-189a-4212-9669-4addfc43c839") : secret "metallb-memberlist" not found
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.627293 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2108fd5-0d1c-4e6c-8b30-adea0a1545ac-metrics-certs\") pod \"controller-7bb4cc7c98-c4kjj\" (UID: \"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac\") " pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.660645 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"0cc82bfd2666d7b831fac446c19c83f04c2d415f54dd752bec4a6fa93fae86e3"}
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.817730 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"
Mar 13 15:19:00 crc kubenswrapper[4786]: I0313 15:19:00.888307 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.138257 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-c4kjj"]
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.309228 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"]
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.638003 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.644834 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d6114a7a-189a-4212-9669-4addfc43c839-memberlist\") pod \"speaker-4lzrc\" (UID: \"d6114a7a-189a-4212-9669-4addfc43c839\") " pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.667804 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-c4kjj" event={"ID":"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac","Type":"ContainerStarted","Data":"e0ccb7e838e18ab72274a2f78742abd22db5cdf7ba520573796d3ba55e22aaba"}
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.667872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-c4kjj" event={"ID":"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac","Type":"ContainerStarted","Data":"9e72d79f26d0b9d6c70316d381210243ddafe1d05717d3025de771bfc220f8bf"}
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.667886 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-c4kjj" event={"ID":"a2108fd5-0d1c-4e6c-8b30-adea0a1545ac","Type":"ContainerStarted","Data":"e82298bc82c85cb337ce9d84173b532354131090c0281601fc8064658bada4a2"}
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.668200 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-c4kjj"
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.669206 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" event={"ID":"9ebdbb3b-8b33-42de-8150-f13a385b6bb6","Type":"ContainerStarted","Data":"99c299a17cb1bf9990cd928c2950a06ddc1f66ebd578b2a6cf5d343ad9b591e2"}
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.697148 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-c4kjj" podStartSLOduration=2.697124149 podStartE2EDuration="2.697124149s" podCreationTimestamp="2026-03-13 15:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:19:01.692676697 +0000 UTC m=+971.855888518" watchObservedRunningTime="2026-03-13 15:19:01.697124149 +0000 UTC m=+971.860335960"
Mar 13 15:19:01 crc kubenswrapper[4786]: I0313 15:19:01.773270 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:01 crc kubenswrapper[4786]: W0313 15:19:01.789280 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6114a7a_189a_4212_9669_4addfc43c839.slice/crio-d3ccbd80f521acde157eb45ad5d3510098219ac36cc0c34c97860007aade0914 WatchSource:0}: Error finding container d3ccbd80f521acde157eb45ad5d3510098219ac36cc0c34c97860007aade0914: Status 404 returned error can't find the container with id d3ccbd80f521acde157eb45ad5d3510098219ac36cc0c34c97860007aade0914
Mar 13 15:19:02 crc kubenswrapper[4786]: I0313 15:19:02.686297 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4lzrc" event={"ID":"d6114a7a-189a-4212-9669-4addfc43c839","Type":"ContainerStarted","Data":"79296e0f6a6dd1888376f74612298033faafa86217e1ef9bedeec99bbd8b528a"}
Mar 13 15:19:02 crc kubenswrapper[4786]: I0313 15:19:02.686912 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4lzrc" event={"ID":"d6114a7a-189a-4212-9669-4addfc43c839","Type":"ContainerStarted","Data":"3432b6307a941ebdc70564173cfa69c173ae2751e4b0f71b6e75328637895bb9"}
Mar 13 15:19:02 crc kubenswrapper[4786]: I0313 15:19:02.686934 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4lzrc" event={"ID":"d6114a7a-189a-4212-9669-4addfc43c839","Type":"ContainerStarted","Data":"d3ccbd80f521acde157eb45ad5d3510098219ac36cc0c34c97860007aade0914"}
Mar 13 15:19:02 crc kubenswrapper[4786]: I0313 15:19:02.687275 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4lzrc"
Mar 13 15:19:02 crc kubenswrapper[4786]: I0313 15:19:02.707030 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4lzrc" podStartSLOduration=3.706995549 podStartE2EDuration="3.706995549s" podCreationTimestamp="2026-03-13 15:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:19:02.704848805 +0000 UTC m=+972.868060626" watchObservedRunningTime="2026-03-13 15:19:02.706995549 +0000 UTC m=+972.870207360"
Mar 13 15:19:08 crc kubenswrapper[4786]: I0313 15:19:08.729037 4786 generic.go:334] "Generic (PLEG): container finished" podID="bdc20c1e-eff7-4478-a06a-05dacc2f169a" containerID="5d0967537d8a21e452ba5f7a639be2d32dbe36a17f1bdae4c47a2792b57f2a94" exitCode=0
Mar 13 15:19:08 crc kubenswrapper[4786]: I0313 15:19:08.729154 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerDied","Data":"5d0967537d8a21e452ba5f7a639be2d32dbe36a17f1bdae4c47a2792b57f2a94"}
Mar 13 15:19:08 crc kubenswrapper[4786]: I0313 15:19:08.731007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" event={"ID":"9ebdbb3b-8b33-42de-8150-f13a385b6bb6","Type":"ContainerStarted","Data":"7351321e3b1636a51780431574b06f3c5d6bb2234cd013cfcc73ebfa1835f416"}
Mar 13 15:19:08 crc kubenswrapper[4786]: I0313 15:19:08.731172 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b"
Mar 13 15:19:08 crc kubenswrapper[4786]: I0313 15:19:08.768912 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" podStartSLOduration=3.197337389 podStartE2EDuration="9.768888864s" podCreationTimestamp="2026-03-13 15:18:59 +0000 UTC" firstStartedPulling="2026-03-13 15:19:01.321162963 +0000 UTC m=+971.484374784" lastFinishedPulling="2026-03-13 15:19:07.892714448 +0000 UTC m=+978.055926259" observedRunningTime="2026-03-13 15:19:08.764986465 +0000 UTC m=+978.928198276" watchObservedRunningTime="2026-03-13 15:19:08.768888864 +0000 UTC m=+978.932100695"
Mar 13 15:19:09 crc kubenswrapper[4786]: I0313 15:19:09.740811 4786 generic.go:334] "Generic (PLEG): container finished" podID="bdc20c1e-eff7-4478-a06a-05dacc2f169a" containerID="f4cae8bb07a7b64f75c6b9e693b303a59f1111c461004d9e6e29331c147b4822" exitCode=0
Mar 13 15:19:09 crc kubenswrapper[4786]: I0313 15:19:09.740843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerDied","Data":"f4cae8bb07a7b64f75c6b9e693b303a59f1111c461004d9e6e29331c147b4822"}
Mar 13 15:19:10 crc kubenswrapper[4786]: I0313 15:19:10.749574 4786 generic.go:334] "Generic (PLEG): container finished" podID="bdc20c1e-eff7-4478-a06a-05dacc2f169a" containerID="85e59e1f4b41a8e86ee70eb5b4a8d63e0be7afed4621248ee249d71557750cba" exitCode=0
Mar 13 15:19:10 crc kubenswrapper[4786]: I0313 15:19:10.749661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerDied","Data":"85e59e1f4b41a8e86ee70eb5b4a8d63e0be7afed4621248ee249d71557750cba"}
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.760220 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"aa09eb3318dcc67e206162b2a179fe9511d21cc74801c1c61ba16883d7c9cb80"}
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.760560 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"03f5ac9cb717f93886fa29dea961205526ca5a05eb94590a16520f06a26bfbd5"}
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.760574 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"7feec2964efe5a41defaf35c285dffffdc2cf37156603824bb48867e3d51cabf"}
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.760585 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"cdecdca33b83e7b68b6a0e403183a9ec1a9ad7e36f6f5f181ea530e10f6fb0e3"}
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.760597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"40428833912f9f982bf22e93295c02b44a19e785a1290fbd6474478612b99738"}
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.851897 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ls9hn"]
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.853239 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.866929 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls9hn"]
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.902197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-catalog-content\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.902290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnn5\" (UniqueName: \"kubernetes.io/projected/98952f75-10e8-4a5b-a279-09f751103745-kube-api-access-cgnn5\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:11 crc kubenswrapper[4786]: I0313 15:19:11.902427 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-utilities\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.003522 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnn5\" (UniqueName: \"kubernetes.io/projected/98952f75-10e8-4a5b-a279-09f751103745-kube-api-access-cgnn5\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.003625 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-utilities\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.003696 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-catalog-content\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.004304 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-utilities\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.004388 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-catalog-content\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.050988 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnn5\" (UniqueName: \"kubernetes.io/projected/98952f75-10e8-4a5b-a279-09f751103745-kube-api-access-cgnn5\") pod \"redhat-marketplace-ls9hn\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.180173 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls9hn"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.392729 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls9hn"]
Mar 13 15:19:12 crc kubenswrapper[4786]: W0313 15:19:12.404040 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98952f75_10e8_4a5b_a279_09f751103745.slice/crio-75f16cb30aa457253762bd5f2b79413e7eacabf407870fa0e490ae085235ff02 WatchSource:0}: Error finding container 75f16cb30aa457253762bd5f2b79413e7eacabf407870fa0e490ae085235ff02: Status 404 returned error can't find the container with id 75f16cb30aa457253762bd5f2b79413e7eacabf407870fa0e490ae085235ff02
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.768753 4786 generic.go:334] "Generic (PLEG): container finished" podID="98952f75-10e8-4a5b-a279-09f751103745" containerID="fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e" exitCode=0
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.768951 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerDied","Data":"fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e"}
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.770046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerStarted","Data":"75f16cb30aa457253762bd5f2b79413e7eacabf407870fa0e490ae085235ff02"}
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.775303 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8tcjh" event={"ID":"bdc20c1e-eff7-4478-a06a-05dacc2f169a","Type":"ContainerStarted","Data":"a5c8f5056219824374ff05279c678db0839c90ef1545269b070da435371b5940"}
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.775483 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8tcjh"
Mar 13 15:19:12 crc kubenswrapper[4786]: I0313 15:19:12.809686 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8tcjh" podStartSLOduration=6.255502744 podStartE2EDuration="13.809654323s" podCreationTimestamp="2026-03-13 15:18:59 +0000 UTC" firstStartedPulling="2026-03-13 15:19:00.315788033 +0000 UTC m=+970.478999844" lastFinishedPulling="2026-03-13 15:19:07.869939612 +0000 UTC m=+978.033151423" observedRunningTime="2026-03-13 15:19:12.807372856 +0000 UTC m=+982.970584677" watchObservedRunningTime="2026-03-13 15:19:12.809654323 +0000 UTC m=+982.972866134"
Mar 13 15:19:13 crc kubenswrapper[4786]: I0313 15:19:13.788025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerStarted","Data":"dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a"}
Mar 13 15:19:14 crc kubenswrapper[4786]: I0313 15:19:14.796947 4786 generic.go:334] "Generic (PLEG): container finished" podID="98952f75-10e8-4a5b-a279-09f751103745" containerID="dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a" exitCode=0
Mar 13 15:19:14 crc kubenswrapper[4786]: I0313 15:19:14.796999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerDied","Data":"dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a"}
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.196831 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8tcjh"
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.260725 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8tcjh"
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.805918 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerStarted","Data":"0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6"}
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.835059 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pr8nc"]
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.836544 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.850295 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pr8nc"]
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.876449 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ls9hn" podStartSLOduration=2.314134325 podStartE2EDuration="4.876433861s" podCreationTimestamp="2026-03-13 15:19:11 +0000 UTC" firstStartedPulling="2026-03-13 15:19:12.770165124 +0000 UTC m=+982.933376935" lastFinishedPulling="2026-03-13 15:19:15.33246464 +0000 UTC m=+985.495676471" observedRunningTime="2026-03-13 15:19:15.873540628 +0000 UTC m=+986.036752449" watchObservedRunningTime="2026-03-13 15:19:15.876433861 +0000 UTC m=+986.039645702"
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.949758 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-catalog-content\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.949968 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-utilities\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:15 crc kubenswrapper[4786]: I0313 15:19:15.950047 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbnl5\" (UniqueName: \"kubernetes.io/projected/c48e218d-f916-4d1b-9b51-3f2a373c98f3-kube-api-access-hbnl5\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.051137 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-utilities\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.051210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbnl5\" (UniqueName: \"kubernetes.io/projected/c48e218d-f916-4d1b-9b51-3f2a373c98f3-kube-api-access-hbnl5\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.051272 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-catalog-content\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.051665 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-utilities\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.051768 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-catalog-content\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.077801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbnl5\" (UniqueName: \"kubernetes.io/projected/c48e218d-f916-4d1b-9b51-3f2a373c98f3-kube-api-access-hbnl5\") pod \"community-operators-pr8nc\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.157378 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pr8nc"
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.414630 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pr8nc"]
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.813086 4786 generic.go:334] "Generic (PLEG): container finished" podID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerID="5510b7fcf3a8b9e32451559f503768eca27d1172cc7eecf7add0a34690eb5098" exitCode=0
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.814327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8nc" event={"ID":"c48e218d-f916-4d1b-9b51-3f2a373c98f3","Type":"ContainerDied","Data":"5510b7fcf3a8b9e32451559f503768eca27d1172cc7eecf7add0a34690eb5098"}
Mar 13 15:19:16 crc kubenswrapper[4786]: I0313 15:19:16.814391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8nc" event={"ID":"c48e218d-f916-4d1b-9b51-3f2a373c98f3","Type":"ContainerStarted","Data":"2e1a14dcd858cfd2fd405093b9258055b9c0f07be80812b4c78d13cacdd38ddb"}
Mar 13 15:19:17 crc kubenswrapper[4786]: I0313 15:19:17.821777 4786 generic.go:334] "Generic (PLEG): container finished" podID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerID="27bb11171d88888ee9eeb661f6ad77c0e4d42dd2ea699ce474e1313f377b6f15" exitCode=0
Mar 13 15:19:17 crc kubenswrapper[4786]: I0313 15:19:17.821830 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8nc" event={"ID":"c48e218d-f916-4d1b-9b51-3f2a373c98f3","Type":"ContainerDied","Data":"27bb11171d88888ee9eeb661f6ad77c0e4d42dd2ea699ce474e1313f377b6f15"}
Mar 13 15:19:18 crc kubenswrapper[4786]: I0313 15:19:18.828378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8nc"
event={"ID":"c48e218d-f916-4d1b-9b51-3f2a373c98f3","Type":"ContainerStarted","Data":"ee6a5d585ed17692280affdb6a910d0b19d5905356fa6fff9906657d6b2be363"} Mar 13 15:19:18 crc kubenswrapper[4786]: I0313 15:19:18.843138 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pr8nc" podStartSLOduration=2.317557211 podStartE2EDuration="3.843119877s" podCreationTimestamp="2026-03-13 15:19:15 +0000 UTC" firstStartedPulling="2026-03-13 15:19:16.814814602 +0000 UTC m=+986.978026413" lastFinishedPulling="2026-03-13 15:19:18.340377258 +0000 UTC m=+988.503589079" observedRunningTime="2026-03-13 15:19:18.842590054 +0000 UTC m=+989.005801875" watchObservedRunningTime="2026-03-13 15:19:18.843119877 +0000 UTC m=+989.006331688" Mar 13 15:19:20 crc kubenswrapper[4786]: I0313 15:19:20.199044 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8tcjh" Mar 13 15:19:20 crc kubenswrapper[4786]: I0313 15:19:20.822963 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gmz8b" Mar 13 15:19:20 crc kubenswrapper[4786]: I0313 15:19:20.892960 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-c4kjj" Mar 13 15:19:21 crc kubenswrapper[4786]: I0313 15:19:21.777499 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4lzrc" Mar 13 15:19:22 crc kubenswrapper[4786]: I0313 15:19:22.180941 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ls9hn" Mar 13 15:19:22 crc kubenswrapper[4786]: I0313 15:19:22.181220 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ls9hn" Mar 13 15:19:22 crc kubenswrapper[4786]: I0313 15:19:22.221361 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ls9hn" Mar 13 15:19:22 crc kubenswrapper[4786]: I0313 15:19:22.930718 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ls9hn" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.106777 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q"] Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.107921 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.109896 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.120155 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q"] Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.241452 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.241522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28s9t\" (UniqueName: \"kubernetes.io/projected/f8c00607-24ac-4811-9a56-92304be2396e-kube-api-access-28s9t\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.241845 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.343416 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.343491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.343518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28s9t\" (UniqueName: \"kubernetes.io/projected/f8c00607-24ac-4811-9a56-92304be2396e-kube-api-access-28s9t\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 
15:19:23.343941 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.344108 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.372347 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28s9t\" (UniqueName: \"kubernetes.io/projected/f8c00607-24ac-4811-9a56-92304be2396e-kube-api-access-28s9t\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.426156 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.622105 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q"] Mar 13 15:19:23 crc kubenswrapper[4786]: I0313 15:19:23.858004 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" event={"ID":"f8c00607-24ac-4811-9a56-92304be2396e","Type":"ContainerStarted","Data":"6e67ad27c037f13f2090e09ea0e61606e4aa19cc2549f6f59eb0dd3dfd8b6b21"} Mar 13 15:19:25 crc kubenswrapper[4786]: I0313 15:19:25.870594 4786 generic.go:334] "Generic (PLEG): container finished" podID="f8c00607-24ac-4811-9a56-92304be2396e" containerID="17ff6fb631b01dca0d0127987cb95f4fab80a6a42332ae1af5f9f8da8c2c730e" exitCode=0 Mar 13 15:19:25 crc kubenswrapper[4786]: I0313 15:19:25.870654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" event={"ID":"f8c00607-24ac-4811-9a56-92304be2396e","Type":"ContainerDied","Data":"17ff6fb631b01dca0d0127987cb95f4fab80a6a42332ae1af5f9f8da8c2c730e"} Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.158023 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pr8nc" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.158077 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pr8nc" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.209363 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pr8nc" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.463214 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ls9hn"] Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.463476 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ls9hn" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="registry-server" containerID="cri-o://0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6" gracePeriod=2 Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.817336 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls9hn" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.879371 4786 generic.go:334] "Generic (PLEG): container finished" podID="98952f75-10e8-4a5b-a279-09f751103745" containerID="0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6" exitCode=0 Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.879444 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls9hn" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.879444 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerDied","Data":"0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6"} Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.879500 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls9hn" event={"ID":"98952f75-10e8-4a5b-a279-09f751103745","Type":"ContainerDied","Data":"75f16cb30aa457253762bd5f2b79413e7eacabf407870fa0e490ae085235ff02"} Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.879524 4786 scope.go:117] "RemoveContainer" containerID="0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.901238 4786 scope.go:117] "RemoveContainer" 
containerID="dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.920695 4786 scope.go:117] "RemoveContainer" containerID="fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.925381 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pr8nc" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.949583 4786 scope.go:117] "RemoveContainer" containerID="0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6" Mar 13 15:19:26 crc kubenswrapper[4786]: E0313 15:19:26.950150 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6\": container with ID starting with 0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6 not found: ID does not exist" containerID="0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.950200 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6"} err="failed to get container status \"0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6\": rpc error: code = NotFound desc = could not find container \"0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6\": container with ID starting with 0e29b548ccdfae4ed82c9550bf9e54eaeea6cff918cd409634e665b10ca107f6 not found: ID does not exist" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.950243 4786 scope.go:117] "RemoveContainer" containerID="dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a" Mar 13 15:19:26 crc kubenswrapper[4786]: E0313 15:19:26.950618 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a\": container with ID starting with dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a not found: ID does not exist" containerID="dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.950644 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a"} err="failed to get container status \"dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a\": rpc error: code = NotFound desc = could not find container \"dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a\": container with ID starting with dcce6700f1b0315050bd18c9f1d245528dbeb70319c94391915ef6e8373d075a not found: ID does not exist" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.950661 4786 scope.go:117] "RemoveContainer" containerID="fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e" Mar 13 15:19:26 crc kubenswrapper[4786]: E0313 15:19:26.951046 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e\": container with ID starting with fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e not found: ID does not exist" containerID="fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.951083 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e"} err="failed to get container status \"fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e\": rpc error: code = NotFound desc = could not find container 
\"fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e\": container with ID starting with fd9d862aab86b81c9de35d4d1e15ce0cfcee04cff755b1add57a32359e13583e not found: ID does not exist" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.990974 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-utilities\") pod \"98952f75-10e8-4a5b-a279-09f751103745\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.991069 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnn5\" (UniqueName: \"kubernetes.io/projected/98952f75-10e8-4a5b-a279-09f751103745-kube-api-access-cgnn5\") pod \"98952f75-10e8-4a5b-a279-09f751103745\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.991115 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-catalog-content\") pod \"98952f75-10e8-4a5b-a279-09f751103745\" (UID: \"98952f75-10e8-4a5b-a279-09f751103745\") " Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.992342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-utilities" (OuterVolumeSpecName: "utilities") pod "98952f75-10e8-4a5b-a279-09f751103745" (UID: "98952f75-10e8-4a5b-a279-09f751103745"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:26 crc kubenswrapper[4786]: I0313 15:19:26.996617 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98952f75-10e8-4a5b-a279-09f751103745-kube-api-access-cgnn5" (OuterVolumeSpecName: "kube-api-access-cgnn5") pod "98952f75-10e8-4a5b-a279-09f751103745" (UID: "98952f75-10e8-4a5b-a279-09f751103745"). InnerVolumeSpecName "kube-api-access-cgnn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:27 crc kubenswrapper[4786]: I0313 15:19:27.022018 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98952f75-10e8-4a5b-a279-09f751103745" (UID: "98952f75-10e8-4a5b-a279-09f751103745"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:27 crc kubenswrapper[4786]: I0313 15:19:27.092744 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:27 crc kubenswrapper[4786]: I0313 15:19:27.092788 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnn5\" (UniqueName: \"kubernetes.io/projected/98952f75-10e8-4a5b-a279-09f751103745-kube-api-access-cgnn5\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:27 crc kubenswrapper[4786]: I0313 15:19:27.092807 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98952f75-10e8-4a5b-a279-09f751103745-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:27 crc kubenswrapper[4786]: I0313 15:19:27.212063 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls9hn"] Mar 13 15:19:27 crc kubenswrapper[4786]: I0313 
15:19:27.216056 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls9hn"] Mar 13 15:19:28 crc kubenswrapper[4786]: I0313 15:19:28.570510 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98952f75-10e8-4a5b-a279-09f751103745" path="/var/lib/kubelet/pods/98952f75-10e8-4a5b-a279-09f751103745/volumes" Mar 13 15:19:29 crc kubenswrapper[4786]: I0313 15:19:29.682237 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pr8nc"] Mar 13 15:19:29 crc kubenswrapper[4786]: I0313 15:19:29.682879 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pr8nc" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="registry-server" containerID="cri-o://ee6a5d585ed17692280affdb6a910d0b19d5905356fa6fff9906657d6b2be363" gracePeriod=2 Mar 13 15:19:29 crc kubenswrapper[4786]: I0313 15:19:29.914603 4786 generic.go:334] "Generic (PLEG): container finished" podID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerID="ee6a5d585ed17692280affdb6a910d0b19d5905356fa6fff9906657d6b2be363" exitCode=0 Mar 13 15:19:29 crc kubenswrapper[4786]: I0313 15:19:29.914675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8nc" event={"ID":"c48e218d-f916-4d1b-9b51-3f2a373c98f3","Type":"ContainerDied","Data":"ee6a5d585ed17692280affdb6a910d0b19d5905356fa6fff9906657d6b2be363"} Mar 13 15:19:29 crc kubenswrapper[4786]: I0313 15:19:29.916329 4786 generic.go:334] "Generic (PLEG): container finished" podID="f8c00607-24ac-4811-9a56-92304be2396e" containerID="fc3ac5a3969785278f6c246ec6413c1846b4c521c58fda610b580ec6b7be43d3" exitCode=0 Mar 13 15:19:29 crc kubenswrapper[4786]: I0313 15:19:29.916357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" 
event={"ID":"f8c00607-24ac-4811-9a56-92304be2396e","Type":"ContainerDied","Data":"fc3ac5a3969785278f6c246ec6413c1846b4c521c58fda610b580ec6b7be43d3"} Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.102890 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pr8nc" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.233033 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-catalog-content\") pod \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.233084 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-utilities\") pod \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.233122 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbnl5\" (UniqueName: \"kubernetes.io/projected/c48e218d-f916-4d1b-9b51-3f2a373c98f3-kube-api-access-hbnl5\") pod \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\" (UID: \"c48e218d-f916-4d1b-9b51-3f2a373c98f3\") " Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.233930 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-utilities" (OuterVolumeSpecName: "utilities") pod "c48e218d-f916-4d1b-9b51-3f2a373c98f3" (UID: "c48e218d-f916-4d1b-9b51-3f2a373c98f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.242087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48e218d-f916-4d1b-9b51-3f2a373c98f3-kube-api-access-hbnl5" (OuterVolumeSpecName: "kube-api-access-hbnl5") pod "c48e218d-f916-4d1b-9b51-3f2a373c98f3" (UID: "c48e218d-f916-4d1b-9b51-3f2a373c98f3"). InnerVolumeSpecName "kube-api-access-hbnl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.286456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c48e218d-f916-4d1b-9b51-3f2a373c98f3" (UID: "c48e218d-f916-4d1b-9b51-3f2a373c98f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.334708 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbnl5\" (UniqueName: \"kubernetes.io/projected/c48e218d-f916-4d1b-9b51-3f2a373c98f3-kube-api-access-hbnl5\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.334755 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.334767 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48e218d-f916-4d1b-9b51-3f2a373c98f3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.927060 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pr8nc" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.927047 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pr8nc" event={"ID":"c48e218d-f916-4d1b-9b51-3f2a373c98f3","Type":"ContainerDied","Data":"2e1a14dcd858cfd2fd405093b9258055b9c0f07be80812b4c78d13cacdd38ddb"} Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.927250 4786 scope.go:117] "RemoveContainer" containerID="ee6a5d585ed17692280affdb6a910d0b19d5905356fa6fff9906657d6b2be363" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.931745 4786 generic.go:334] "Generic (PLEG): container finished" podID="f8c00607-24ac-4811-9a56-92304be2396e" containerID="8e1675cf82c8ef544537f3362303153feb695886fa1e9147a8727353f01e6657" exitCode=0 Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.931802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" event={"ID":"f8c00607-24ac-4811-9a56-92304be2396e","Type":"ContainerDied","Data":"8e1675cf82c8ef544537f3362303153feb695886fa1e9147a8727353f01e6657"} Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.943805 4786 scope.go:117] "RemoveContainer" containerID="27bb11171d88888ee9eeb661f6ad77c0e4d42dd2ea699ce474e1313f377b6f15" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.973939 4786 scope.go:117] "RemoveContainer" containerID="5510b7fcf3a8b9e32451559f503768eca27d1172cc7eecf7add0a34690eb5098" Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.976660 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pr8nc"] Mar 13 15:19:30 crc kubenswrapper[4786]: I0313 15:19:30.985037 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pr8nc"] Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.254466 4786 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.362898 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-bundle\") pod \"f8c00607-24ac-4811-9a56-92304be2396e\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.362956 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-util\") pod \"f8c00607-24ac-4811-9a56-92304be2396e\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.363000 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28s9t\" (UniqueName: \"kubernetes.io/projected/f8c00607-24ac-4811-9a56-92304be2396e-kube-api-access-28s9t\") pod \"f8c00607-24ac-4811-9a56-92304be2396e\" (UID: \"f8c00607-24ac-4811-9a56-92304be2396e\") " Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.364087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-bundle" (OuterVolumeSpecName: "bundle") pod "f8c00607-24ac-4811-9a56-92304be2396e" (UID: "f8c00607-24ac-4811-9a56-92304be2396e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.372036 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c00607-24ac-4811-9a56-92304be2396e-kube-api-access-28s9t" (OuterVolumeSpecName: "kube-api-access-28s9t") pod "f8c00607-24ac-4811-9a56-92304be2396e" (UID: "f8c00607-24ac-4811-9a56-92304be2396e"). InnerVolumeSpecName "kube-api-access-28s9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.373823 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-util" (OuterVolumeSpecName: "util") pod "f8c00607-24ac-4811-9a56-92304be2396e" (UID: "f8c00607-24ac-4811-9a56-92304be2396e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.464873 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.464949 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8c00607-24ac-4811-9a56-92304be2396e-util\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.464960 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28s9t\" (UniqueName: \"kubernetes.io/projected/f8c00607-24ac-4811-9a56-92304be2396e-kube-api-access-28s9t\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.559702 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" path="/var/lib/kubelet/pods/c48e218d-f916-4d1b-9b51-3f2a373c98f3/volumes" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.954040 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" event={"ID":"f8c00607-24ac-4811-9a56-92304be2396e","Type":"ContainerDied","Data":"6e67ad27c037f13f2090e09ea0e61606e4aa19cc2549f6f59eb0dd3dfd8b6b21"} Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.954431 4786 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6e67ad27c037f13f2090e09ea0e61606e4aa19cc2549f6f59eb0dd3dfd8b6b21" Mar 13 15:19:32 crc kubenswrapper[4786]: I0313 15:19:32.954083 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072268 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wk4sg"] Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072526 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="registry-server" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072540 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="registry-server" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072555 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="extract-utilities" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072562 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="extract-utilities" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072574 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="extract-utilities" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072581 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="extract-utilities" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072589 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="pull" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072596 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="pull" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072605 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="extract" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072612 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="extract" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072618 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="util" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072624 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="util" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072633 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="registry-server" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072639 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="registry-server" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072650 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="extract-content" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072657 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="extract-content" Mar 13 15:19:33 crc kubenswrapper[4786]: E0313 15:19:33.072663 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="extract-content" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072669 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" 
containerName="extract-content" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072777 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="98952f75-10e8-4a5b-a279-09f751103745" containerName="registry-server" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072787 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48e218d-f916-4d1b-9b51-3f2a373c98f3" containerName="registry-server" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.072796 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8c00607-24ac-4811-9a56-92304be2396e" containerName="extract" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.073615 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.121936 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wk4sg"] Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.174986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-utilities\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.175259 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6hx2\" (UniqueName: \"kubernetes.io/projected/5cd94b8d-62f2-496b-929a-f81b4f14aded-kube-api-access-d6hx2\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.175405 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-catalog-content\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.276867 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-utilities\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.277405 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6hx2\" (UniqueName: \"kubernetes.io/projected/5cd94b8d-62f2-496b-929a-f81b4f14aded-kube-api-access-d6hx2\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.277564 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-catalog-content\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.277477 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-utilities\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.277931 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-catalog-content\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.302964 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6hx2\" (UniqueName: \"kubernetes.io/projected/5cd94b8d-62f2-496b-929a-f81b4f14aded-kube-api-access-d6hx2\") pod \"certified-operators-wk4sg\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.428041 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.762594 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wk4sg"] Mar 13 15:19:33 crc kubenswrapper[4786]: I0313 15:19:33.961221 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk4sg" event={"ID":"5cd94b8d-62f2-496b-929a-f81b4f14aded","Type":"ContainerStarted","Data":"9a47cc8e989a3e4eb1c5082d87e7b90808a8bcb3cbb657ce4ab6fee72975f1fe"} Mar 13 15:19:34 crc kubenswrapper[4786]: I0313 15:19:34.967949 4786 generic.go:334] "Generic (PLEG): container finished" podID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerID="1198a163d70627064cd88c22ff7c31139b16c7f66b3ceecb2c0c871e96a43ba5" exitCode=0 Mar 13 15:19:34 crc kubenswrapper[4786]: I0313 15:19:34.968090 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk4sg" event={"ID":"5cd94b8d-62f2-496b-929a-f81b4f14aded","Type":"ContainerDied","Data":"1198a163d70627064cd88c22ff7c31139b16c7f66b3ceecb2c0c871e96a43ba5"} Mar 13 15:19:35 crc kubenswrapper[4786]: I0313 15:19:35.891286 4786 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh"] Mar 13 15:19:35 crc kubenswrapper[4786]: I0313 15:19:35.892515 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:35 crc kubenswrapper[4786]: I0313 15:19:35.894361 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 13 15:19:35 crc kubenswrapper[4786]: I0313 15:19:35.894747 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 13 15:19:35 crc kubenswrapper[4786]: I0313 15:19:35.894817 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-vsmr9" Mar 13 15:19:35 crc kubenswrapper[4786]: I0313 15:19:35.906889 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh"] Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.031485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g8gc\" (UniqueName: \"kubernetes.io/projected/4e85db10-18c2-474b-b572-c12121a79449-kube-api-access-5g8gc\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sldh\" (UID: \"4e85db10-18c2-474b-b572-c12121a79449\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.031548 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e85db10-18c2-474b-b572-c12121a79449-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sldh\" (UID: \"4e85db10-18c2-474b-b572-c12121a79449\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.133188 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g8gc\" (UniqueName: \"kubernetes.io/projected/4e85db10-18c2-474b-b572-c12121a79449-kube-api-access-5g8gc\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sldh\" (UID: \"4e85db10-18c2-474b-b572-c12121a79449\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.133257 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e85db10-18c2-474b-b572-c12121a79449-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sldh\" (UID: \"4e85db10-18c2-474b-b572-c12121a79449\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.133793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4e85db10-18c2-474b-b572-c12121a79449-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sldh\" (UID: \"4e85db10-18c2-474b-b572-c12121a79449\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.158389 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g8gc\" (UniqueName: \"kubernetes.io/projected/4e85db10-18c2-474b-b572-c12121a79449-kube-api-access-5g8gc\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sldh\" (UID: \"4e85db10-18c2-474b-b572-c12121a79449\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.234934 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.448248 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh"] Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.981206 4786 generic.go:334] "Generic (PLEG): container finished" podID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerID="4ac1154c949d476c8b51a1b2f71748d450befb233d589270b7f9d8496ef557bf" exitCode=0 Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.981297 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk4sg" event={"ID":"5cd94b8d-62f2-496b-929a-f81b4f14aded","Type":"ContainerDied","Data":"4ac1154c949d476c8b51a1b2f71748d450befb233d589270b7f9d8496ef557bf"} Mar 13 15:19:36 crc kubenswrapper[4786]: I0313 15:19:36.982497 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" event={"ID":"4e85db10-18c2-474b-b572-c12121a79449","Type":"ContainerStarted","Data":"8f92ce1c20d503a8cf862b86cc4222a268bcd939bf8ed5e7311968dadd6ab0f6"} Mar 13 15:19:37 crc kubenswrapper[4786]: I0313 15:19:37.999210 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk4sg" event={"ID":"5cd94b8d-62f2-496b-929a-f81b4f14aded","Type":"ContainerStarted","Data":"32bf8d3baedb46dac164c172dcb5cfd77eb8c8f144f5590d09f8abfe30bd3bbb"} Mar 13 15:19:38 crc kubenswrapper[4786]: I0313 15:19:38.020139 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wk4sg" podStartSLOduration=2.633539754 podStartE2EDuration="5.020123673s" podCreationTimestamp="2026-03-13 15:19:33 +0000 UTC" firstStartedPulling="2026-03-13 15:19:34.969549706 +0000 UTC m=+1005.132761517" 
lastFinishedPulling="2026-03-13 15:19:37.356133625 +0000 UTC m=+1007.519345436" observedRunningTime="2026-03-13 15:19:38.018330598 +0000 UTC m=+1008.181542419" watchObservedRunningTime="2026-03-13 15:19:38.020123673 +0000 UTC m=+1008.183335484" Mar 13 15:19:41 crc kubenswrapper[4786]: I0313 15:19:41.047215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" event={"ID":"4e85db10-18c2-474b-b572-c12121a79449","Type":"ContainerStarted","Data":"309393af5296b30a5a172b881750d2fd52169adc3a5b42013999f1968f28d869"} Mar 13 15:19:41 crc kubenswrapper[4786]: I0313 15:19:41.084304 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sldh" podStartSLOduration=2.416148483 podStartE2EDuration="6.084287135s" podCreationTimestamp="2026-03-13 15:19:35 +0000 UTC" firstStartedPulling="2026-03-13 15:19:36.461630195 +0000 UTC m=+1006.624842006" lastFinishedPulling="2026-03-13 15:19:40.129768847 +0000 UTC m=+1010.292980658" observedRunningTime="2026-03-13 15:19:41.077302548 +0000 UTC m=+1011.240514359" watchObservedRunningTime="2026-03-13 15:19:41.084287135 +0000 UTC m=+1011.247498936" Mar 13 15:19:43 crc kubenswrapper[4786]: I0313 15:19:43.428896 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:43 crc kubenswrapper[4786]: I0313 15:19:43.429219 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:43 crc kubenswrapper[4786]: I0313 15:19:43.487298 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.116103 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.143177 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wqs9w"] Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.144108 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.152927 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.153234 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wpxqz" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.153412 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.170444 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wqs9w"] Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.248076 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2134099c-6c07-4d86-9aa5-b7360e8f3ea1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wqs9w\" (UID: \"2134099c-6c07-4d86-9aa5-b7360e8f3ea1\") " pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.248228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqnsk\" (UniqueName: \"kubernetes.io/projected/2134099c-6c07-4d86-9aa5-b7360e8f3ea1-kube-api-access-hqnsk\") pod \"cert-manager-webhook-6888856db4-wqs9w\" (UID: \"2134099c-6c07-4d86-9aa5-b7360e8f3ea1\") " pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" 
Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.349812 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2134099c-6c07-4d86-9aa5-b7360e8f3ea1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wqs9w\" (UID: \"2134099c-6c07-4d86-9aa5-b7360e8f3ea1\") " pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.349950 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqnsk\" (UniqueName: \"kubernetes.io/projected/2134099c-6c07-4d86-9aa5-b7360e8f3ea1-kube-api-access-hqnsk\") pod \"cert-manager-webhook-6888856db4-wqs9w\" (UID: \"2134099c-6c07-4d86-9aa5-b7360e8f3ea1\") " pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.367594 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2134099c-6c07-4d86-9aa5-b7360e8f3ea1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wqs9w\" (UID: \"2134099c-6c07-4d86-9aa5-b7360e8f3ea1\") " pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.371887 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqnsk\" (UniqueName: \"kubernetes.io/projected/2134099c-6c07-4d86-9aa5-b7360e8f3ea1-kube-api-access-hqnsk\") pod \"cert-manager-webhook-6888856db4-wqs9w\" (UID: \"2134099c-6c07-4d86-9aa5-b7360e8f3ea1\") " pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.460941 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.879055 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-pqlz5"] Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.880275 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.882311 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9nw97" Mar 13 15:19:44 crc kubenswrapper[4786]: I0313 15:19:44.890526 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-pqlz5"] Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.035269 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wqs9w"] Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.058654 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58xh\" (UniqueName: \"kubernetes.io/projected/9257f04b-d34e-4a81-b897-e563a66b6b59-kube-api-access-c58xh\") pod \"cert-manager-cainjector-5545bd876-pqlz5\" (UID: \"9257f04b-d34e-4a81-b897-e563a66b6b59\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.058770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9257f04b-d34e-4a81-b897-e563a66b6b59-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-pqlz5\" (UID: \"9257f04b-d34e-4a81-b897-e563a66b6b59\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.070838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" event={"ID":"2134099c-6c07-4d86-9aa5-b7360e8f3ea1","Type":"ContainerStarted","Data":"8125fcf8b264abcf80bbec34d03180a7303802b246b6953e6a0b0ca47840f97f"} Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.160688 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58xh\" (UniqueName: \"kubernetes.io/projected/9257f04b-d34e-4a81-b897-e563a66b6b59-kube-api-access-c58xh\") pod \"cert-manager-cainjector-5545bd876-pqlz5\" (UID: \"9257f04b-d34e-4a81-b897-e563a66b6b59\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.160764 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9257f04b-d34e-4a81-b897-e563a66b6b59-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-pqlz5\" (UID: \"9257f04b-d34e-4a81-b897-e563a66b6b59\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.180425 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58xh\" (UniqueName: \"kubernetes.io/projected/9257f04b-d34e-4a81-b897-e563a66b6b59-kube-api-access-c58xh\") pod \"cert-manager-cainjector-5545bd876-pqlz5\" (UID: \"9257f04b-d34e-4a81-b897-e563a66b6b59\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.184432 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9257f04b-d34e-4a81-b897-e563a66b6b59-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-pqlz5\" (UID: \"9257f04b-d34e-4a81-b897-e563a66b6b59\") " pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.196528 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" Mar 13 15:19:45 crc kubenswrapper[4786]: I0313 15:19:45.653533 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-pqlz5"] Mar 13 15:19:46 crc kubenswrapper[4786]: I0313 15:19:46.076975 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" event={"ID":"9257f04b-d34e-4a81-b897-e563a66b6b59","Type":"ContainerStarted","Data":"24a2c26b388694cf9ce90f70552d62f71d6fc9fd1c74a85d444b0899d8d7056c"} Mar 13 15:19:47 crc kubenswrapper[4786]: I0313 15:19:47.061897 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wk4sg"] Mar 13 15:19:47 crc kubenswrapper[4786]: I0313 15:19:47.062106 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wk4sg" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="registry-server" containerID="cri-o://32bf8d3baedb46dac164c172dcb5cfd77eb8c8f144f5590d09f8abfe30bd3bbb" gracePeriod=2 Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.098101 4786 generic.go:334] "Generic (PLEG): container finished" podID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerID="32bf8d3baedb46dac164c172dcb5cfd77eb8c8f144f5590d09f8abfe30bd3bbb" exitCode=0 Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.098358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk4sg" event={"ID":"5cd94b8d-62f2-496b-929a-f81b4f14aded","Type":"ContainerDied","Data":"32bf8d3baedb46dac164c172dcb5cfd77eb8c8f144f5590d09f8abfe30bd3bbb"} Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.292071 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.300177 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6hx2\" (UniqueName: \"kubernetes.io/projected/5cd94b8d-62f2-496b-929a-f81b4f14aded-kube-api-access-d6hx2\") pod \"5cd94b8d-62f2-496b-929a-f81b4f14aded\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.300253 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-utilities\") pod \"5cd94b8d-62f2-496b-929a-f81b4f14aded\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.300368 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-catalog-content\") pod \"5cd94b8d-62f2-496b-929a-f81b4f14aded\" (UID: \"5cd94b8d-62f2-496b-929a-f81b4f14aded\") " Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.301732 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-utilities" (OuterVolumeSpecName: "utilities") pod "5cd94b8d-62f2-496b-929a-f81b4f14aded" (UID: "5cd94b8d-62f2-496b-929a-f81b4f14aded"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.305501 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd94b8d-62f2-496b-929a-f81b4f14aded-kube-api-access-d6hx2" (OuterVolumeSpecName: "kube-api-access-d6hx2") pod "5cd94b8d-62f2-496b-929a-f81b4f14aded" (UID: "5cd94b8d-62f2-496b-929a-f81b4f14aded"). InnerVolumeSpecName "kube-api-access-d6hx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.359077 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cd94b8d-62f2-496b-929a-f81b4f14aded" (UID: "5cd94b8d-62f2-496b-929a-f81b4f14aded"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.402890 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.403195 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6hx2\" (UniqueName: \"kubernetes.io/projected/5cd94b8d-62f2-496b-929a-f81b4f14aded-kube-api-access-d6hx2\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:48 crc kubenswrapper[4786]: I0313 15:19:48.403216 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cd94b8d-62f2-496b-929a-f81b4f14aded-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.113289 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wk4sg" event={"ID":"5cd94b8d-62f2-496b-929a-f81b4f14aded","Type":"ContainerDied","Data":"9a47cc8e989a3e4eb1c5082d87e7b90808a8bcb3cbb657ce4ab6fee72975f1fe"} Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.113352 4786 scope.go:117] "RemoveContainer" containerID="32bf8d3baedb46dac164c172dcb5cfd77eb8c8f144f5590d09f8abfe30bd3bbb" Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.113411 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wk4sg" Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.132712 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wk4sg"] Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.138653 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wk4sg"] Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.156092 4786 scope.go:117] "RemoveContainer" containerID="4ac1154c949d476c8b51a1b2f71748d450befb233d589270b7f9d8496ef557bf" Mar 13 15:19:49 crc kubenswrapper[4786]: I0313 15:19:49.213356 4786 scope.go:117] "RemoveContainer" containerID="1198a163d70627064cd88c22ff7c31139b16c7f66b3ceecb2c0c871e96a43ba5" Mar 13 15:19:50 crc kubenswrapper[4786]: I0313 15:19:50.563035 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" path="/var/lib/kubelet/pods/5cd94b8d-62f2-496b-929a-f81b4f14aded/volumes" Mar 13 15:19:52 crc kubenswrapper[4786]: I0313 15:19:52.137784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" event={"ID":"9257f04b-d34e-4a81-b897-e563a66b6b59","Type":"ContainerStarted","Data":"82066a4a6b60822956e0d79e506f91b7a625f6bcf9560935d653322ad17a3136"} Mar 13 15:19:52 crc kubenswrapper[4786]: I0313 15:19:52.139338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" event={"ID":"2134099c-6c07-4d86-9aa5-b7360e8f3ea1","Type":"ContainerStarted","Data":"5823a49bfe8f70d636c767dd89c6e05320a5aaa39b8b4fa783bca510d26f004d"} Mar 13 15:19:52 crc kubenswrapper[4786]: I0313 15:19:52.139579 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:19:52 crc kubenswrapper[4786]: I0313 15:19:52.153921 4786 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-pqlz5" podStartSLOduration=2.169654203 podStartE2EDuration="8.153825349s" podCreationTimestamp="2026-03-13 15:19:44 +0000 UTC" firstStartedPulling="2026-03-13 15:19:45.666075802 +0000 UTC m=+1015.829287613" lastFinishedPulling="2026-03-13 15:19:51.650246948 +0000 UTC m=+1021.813458759" observedRunningTime="2026-03-13 15:19:52.151219753 +0000 UTC m=+1022.314431564" watchObservedRunningTime="2026-03-13 15:19:52.153825349 +0000 UTC m=+1022.317037200" Mar 13 15:19:52 crc kubenswrapper[4786]: I0313 15:19:52.188632 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" podStartSLOduration=1.60237395 podStartE2EDuration="8.188596048s" podCreationTimestamp="2026-03-13 15:19:44 +0000 UTC" firstStartedPulling="2026-03-13 15:19:45.047834801 +0000 UTC m=+1015.211046612" lastFinishedPulling="2026-03-13 15:19:51.634056899 +0000 UTC m=+1021.797268710" observedRunningTime="2026-03-13 15:19:52.177680162 +0000 UTC m=+1022.340891993" watchObservedRunningTime="2026-03-13 15:19:52.188596048 +0000 UTC m=+1022.351807879" Mar 13 15:19:59 crc kubenswrapper[4786]: I0313 15:19:59.463332 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-wqs9w" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.125937 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556920-t5l58"] Mar 13 15:20:00 crc kubenswrapper[4786]: E0313 15:20:00.126169 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="extract-utilities" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.126180 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="extract-utilities" Mar 13 15:20:00 crc kubenswrapper[4786]: E0313 15:20:00.126198 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="registry-server" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.126204 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="registry-server" Mar 13 15:20:00 crc kubenswrapper[4786]: E0313 15:20:00.126214 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="extract-content" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.126221 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="extract-content" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.126321 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd94b8d-62f2-496b-929a-f81b4f14aded" containerName="registry-server" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.126670 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.130167 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.130401 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.133113 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.133848 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-t5l58"] Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.182972 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26cfh\" (UniqueName: \"kubernetes.io/projected/0b4b8359-292c-4c01-85b0-6969cb325afa-kube-api-access-26cfh\") pod \"auto-csr-approver-29556920-t5l58\" (UID: \"0b4b8359-292c-4c01-85b0-6969cb325afa\") " pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.284435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26cfh\" (UniqueName: \"kubernetes.io/projected/0b4b8359-292c-4c01-85b0-6969cb325afa-kube-api-access-26cfh\") pod \"auto-csr-approver-29556920-t5l58\" (UID: \"0b4b8359-292c-4c01-85b0-6969cb325afa\") " pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.300785 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26cfh\" (UniqueName: \"kubernetes.io/projected/0b4b8359-292c-4c01-85b0-6969cb325afa-kube-api-access-26cfh\") pod \"auto-csr-approver-29556920-t5l58\" (UID: \"0b4b8359-292c-4c01-85b0-6969cb325afa\") " 
pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.494455 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:00 crc kubenswrapper[4786]: I0313 15:20:00.940048 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-t5l58"] Mar 13 15:20:01 crc kubenswrapper[4786]: I0313 15:20:01.195262 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-t5l58" event={"ID":"0b4b8359-292c-4c01-85b0-6969cb325afa","Type":"ContainerStarted","Data":"4c2ac6583d917951505c917e2415a84907a4628553d65e36a89dbe09f1b0343a"} Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.310686 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-sspj7"] Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.311853 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.313494 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dtxvh" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.320029 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-sspj7"] Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.513258 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjdf\" (UniqueName: \"kubernetes.io/projected/7d3cd12d-97ae-473f-ae6b-40837bbb9a5e-kube-api-access-rfjdf\") pod \"cert-manager-545d4d4674-sspj7\" (UID: \"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e\") " pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.513362 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3cd12d-97ae-473f-ae6b-40837bbb9a5e-bound-sa-token\") pod \"cert-manager-545d4d4674-sspj7\" (UID: \"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e\") " pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.614106 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjdf\" (UniqueName: \"kubernetes.io/projected/7d3cd12d-97ae-473f-ae6b-40837bbb9a5e-kube-api-access-rfjdf\") pod \"cert-manager-545d4d4674-sspj7\" (UID: \"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e\") " pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.614186 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3cd12d-97ae-473f-ae6b-40837bbb9a5e-bound-sa-token\") pod \"cert-manager-545d4d4674-sspj7\" (UID: 
\"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e\") " pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.634722 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjdf\" (UniqueName: \"kubernetes.io/projected/7d3cd12d-97ae-473f-ae6b-40837bbb9a5e-kube-api-access-rfjdf\") pod \"cert-manager-545d4d4674-sspj7\" (UID: \"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e\") " pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.635936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3cd12d-97ae-473f-ae6b-40837bbb9a5e-bound-sa-token\") pod \"cert-manager-545d4d4674-sspj7\" (UID: \"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e\") " pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:02 crc kubenswrapper[4786]: I0313 15:20:02.929170 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-sspj7" Mar 13 15:20:03 crc kubenswrapper[4786]: I0313 15:20:03.208810 4786 generic.go:334] "Generic (PLEG): container finished" podID="0b4b8359-292c-4c01-85b0-6969cb325afa" containerID="b75d9bc1c31b885ef36a2bda2dbd11ce09de40ca79c24eca1fb59504c54df56e" exitCode=0 Mar 13 15:20:03 crc kubenswrapper[4786]: I0313 15:20:03.209331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-t5l58" event={"ID":"0b4b8359-292c-4c01-85b0-6969cb325afa","Type":"ContainerDied","Data":"b75d9bc1c31b885ef36a2bda2dbd11ce09de40ca79c24eca1fb59504c54df56e"} Mar 13 15:20:03 crc kubenswrapper[4786]: I0313 15:20:03.344164 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-sspj7"] Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.217062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-sspj7" 
event={"ID":"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e","Type":"ContainerStarted","Data":"002769a2ed6b45a9f92e069e5dd74822c86b76bb88b3896a3c2b67f10f9a7a8f"} Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.217407 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-sspj7" event={"ID":"7d3cd12d-97ae-473f-ae6b-40837bbb9a5e","Type":"ContainerStarted","Data":"c25118215d55ffee9d96e9974d06250892b1af807466739d6c5c6034a4acb713"} Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.247505 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-sspj7" podStartSLOduration=2.247487011 podStartE2EDuration="2.247487011s" podCreationTimestamp="2026-03-13 15:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:20:04.246692121 +0000 UTC m=+1034.409903942" watchObservedRunningTime="2026-03-13 15:20:04.247487011 +0000 UTC m=+1034.410698822" Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.508766 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.557180 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26cfh\" (UniqueName: \"kubernetes.io/projected/0b4b8359-292c-4c01-85b0-6969cb325afa-kube-api-access-26cfh\") pod \"0b4b8359-292c-4c01-85b0-6969cb325afa\" (UID: \"0b4b8359-292c-4c01-85b0-6969cb325afa\") " Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.570116 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4b8359-292c-4c01-85b0-6969cb325afa-kube-api-access-26cfh" (OuterVolumeSpecName: "kube-api-access-26cfh") pod "0b4b8359-292c-4c01-85b0-6969cb325afa" (UID: "0b4b8359-292c-4c01-85b0-6969cb325afa"). 
InnerVolumeSpecName "kube-api-access-26cfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:20:04 crc kubenswrapper[4786]: I0313 15:20:04.657925 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26cfh\" (UniqueName: \"kubernetes.io/projected/0b4b8359-292c-4c01-85b0-6969cb325afa-kube-api-access-26cfh\") on node \"crc\" DevicePath \"\"" Mar 13 15:20:05 crc kubenswrapper[4786]: I0313 15:20:05.226262 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-t5l58" event={"ID":"0b4b8359-292c-4c01-85b0-6969cb325afa","Type":"ContainerDied","Data":"4c2ac6583d917951505c917e2415a84907a4628553d65e36a89dbe09f1b0343a"} Mar 13 15:20:05 crc kubenswrapper[4786]: I0313 15:20:05.226341 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c2ac6583d917951505c917e2415a84907a4628553d65e36a89dbe09f1b0343a" Mar 13 15:20:05 crc kubenswrapper[4786]: I0313 15:20:05.226281 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-t5l58" Mar 13 15:20:05 crc kubenswrapper[4786]: I0313 15:20:05.559599 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-qrwh4"] Mar 13 15:20:05 crc kubenswrapper[4786]: I0313 15:20:05.565190 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-qrwh4"] Mar 13 15:20:06 crc kubenswrapper[4786]: I0313 15:20:06.562655 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d731ba13-dc0b-479a-8d77-ac4679c41d7c" path="/var/lib/kubelet/pods/d731ba13-dc0b-479a-8d77-ac4679c41d7c/volumes" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.675625 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qqq28"] Mar 13 15:20:15 crc kubenswrapper[4786]: E0313 15:20:15.676417 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4b8359-292c-4c01-85b0-6969cb325afa" containerName="oc" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.676433 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4b8359-292c-4c01-85b0-6969cb325afa" containerName="oc" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.676573 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4b8359-292c-4c01-85b0-6969cb325afa" containerName="oc" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.677098 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.681795 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.686088 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-mn2js" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.686511 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qqq28"] Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.686703 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.807018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjs65\" (UniqueName: \"kubernetes.io/projected/3a032c51-0082-416e-ae5e-b4c5eb59ff33-kube-api-access-rjs65\") pod \"openstack-operator-index-qqq28\" (UID: \"3a032c51-0082-416e-ae5e-b4c5eb59ff33\") " pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.908629 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjs65\" (UniqueName: \"kubernetes.io/projected/3a032c51-0082-416e-ae5e-b4c5eb59ff33-kube-api-access-rjs65\") pod \"openstack-operator-index-qqq28\" (UID: \"3a032c51-0082-416e-ae5e-b4c5eb59ff33\") " pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.927518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjs65\" (UniqueName: \"kubernetes.io/projected/3a032c51-0082-416e-ae5e-b4c5eb59ff33-kube-api-access-rjs65\") pod \"openstack-operator-index-qqq28\" (UID: 
\"3a032c51-0082-416e-ae5e-b4c5eb59ff33\") " pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:15 crc kubenswrapper[4786]: I0313 15:20:15.994486 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:16 crc kubenswrapper[4786]: I0313 15:20:16.512662 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qqq28"] Mar 13 15:20:17 crc kubenswrapper[4786]: I0313 15:20:17.305171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qqq28" event={"ID":"3a032c51-0082-416e-ae5e-b4c5eb59ff33","Type":"ContainerStarted","Data":"4930834db094a465fa33490b3961deb8921c23809f1fdaf5c44df1624a7ed938"} Mar 13 15:20:18 crc kubenswrapper[4786]: I0313 15:20:18.314880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qqq28" event={"ID":"3a032c51-0082-416e-ae5e-b4c5eb59ff33","Type":"ContainerStarted","Data":"2331f32ecd7c96a42869883273d9e5ebb09f3691dfa0067114282a5b77e8e8ec"} Mar 13 15:20:18 crc kubenswrapper[4786]: I0313 15:20:18.337726 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qqq28" podStartSLOduration=2.35407673 podStartE2EDuration="3.337708466s" podCreationTimestamp="2026-03-13 15:20:15 +0000 UTC" firstStartedPulling="2026-03-13 15:20:16.53033674 +0000 UTC m=+1046.693548551" lastFinishedPulling="2026-03-13 15:20:17.513968476 +0000 UTC m=+1047.677180287" observedRunningTime="2026-03-13 15:20:18.334216448 +0000 UTC m=+1048.497428259" watchObservedRunningTime="2026-03-13 15:20:18.337708466 +0000 UTC m=+1048.500920277" Mar 13 15:20:25 crc kubenswrapper[4786]: I0313 15:20:25.995163 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:25 crc kubenswrapper[4786]: 
I0313 15:20:25.996329 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:26 crc kubenswrapper[4786]: I0313 15:20:26.044134 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:26 crc kubenswrapper[4786]: I0313 15:20:26.406127 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qqq28" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.496023 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96"] Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.498223 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.502065 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tjbqz" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.506190 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96"] Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.678485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.678543 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.678618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjb8\" (UniqueName: \"kubernetes.io/projected/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-kube-api-access-fjjb8\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.780470 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjb8\" (UniqueName: \"kubernetes.io/projected/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-kube-api-access-fjjb8\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.780599 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.780657 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-bundle\") pod 
\"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.781426 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-util\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.781535 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-bundle\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.809160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjb8\" (UniqueName: \"kubernetes.io/projected/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-kube-api-access-fjjb8\") pod \"a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:28 crc kubenswrapper[4786]: I0313 15:20:28.813436 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:29 crc kubenswrapper[4786]: I0313 15:20:29.226806 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96"] Mar 13 15:20:29 crc kubenswrapper[4786]: I0313 15:20:29.400412 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" event={"ID":"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6","Type":"ContainerStarted","Data":"d81e60246337b36435a4fd083e5b2e00b4bedc83333a0fdf67bd47ed3f1c9566"} Mar 13 15:20:30 crc kubenswrapper[4786]: I0313 15:20:30.411093 4786 generic.go:334] "Generic (PLEG): container finished" podID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerID="a628cf110d2e57c9e61f5f7ccda81157345aa7246606e0047eb2b13b895da2d4" exitCode=0 Mar 13 15:20:30 crc kubenswrapper[4786]: I0313 15:20:30.411132 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" event={"ID":"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6","Type":"ContainerDied","Data":"a628cf110d2e57c9e61f5f7ccda81157345aa7246606e0047eb2b13b895da2d4"} Mar 13 15:20:31 crc kubenswrapper[4786]: I0313 15:20:31.419818 4786 generic.go:334] "Generic (PLEG): container finished" podID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerID="75b1cb2ce2521900e6c3c693b1435eebf1c528c98f3b284cf7509c56dce19a95" exitCode=0 Mar 13 15:20:31 crc kubenswrapper[4786]: I0313 15:20:31.419904 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" event={"ID":"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6","Type":"ContainerDied","Data":"75b1cb2ce2521900e6c3c693b1435eebf1c528c98f3b284cf7509c56dce19a95"} Mar 13 15:20:32 crc kubenswrapper[4786]: I0313 15:20:32.428220 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerID="26623c3c3d627b530ce5a6027c1f6cb37c62cf925c152c02644fb3af1310875e" exitCode=0 Mar 13 15:20:32 crc kubenswrapper[4786]: I0313 15:20:32.428319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" event={"ID":"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6","Type":"ContainerDied","Data":"26623c3c3d627b530ce5a6027c1f6cb37c62cf925c152c02644fb3af1310875e"} Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.674924 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.849831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-bundle\") pod \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.849971 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-util\") pod \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.850100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjb8\" (UniqueName: \"kubernetes.io/projected/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-kube-api-access-fjjb8\") pod \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\" (UID: \"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6\") " Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.850673 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-bundle" (OuterVolumeSpecName: "bundle") pod "0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" (UID: "0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.855251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-kube-api-access-fjjb8" (OuterVolumeSpecName: "kube-api-access-fjjb8") pod "0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" (UID: "0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6"). InnerVolumeSpecName "kube-api-access-fjjb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.863962 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-util" (OuterVolumeSpecName: "util") pod "0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" (UID: "0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.952254 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjb8\" (UniqueName: \"kubernetes.io/projected/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-kube-api-access-fjjb8\") on node \"crc\" DevicePath \"\"" Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.952611 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:20:33 crc kubenswrapper[4786]: I0313 15:20:33.952624 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6-util\") on node \"crc\" DevicePath \"\"" Mar 13 15:20:34 crc kubenswrapper[4786]: I0313 15:20:34.443106 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" event={"ID":"0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6","Type":"ContainerDied","Data":"d81e60246337b36435a4fd083e5b2e00b4bedc83333a0fdf67bd47ed3f1c9566"} Mar 13 15:20:34 crc kubenswrapper[4786]: I0313 15:20:34.443154 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81e60246337b36435a4fd083e5b2e00b4bedc83333a0fdf67bd47ed3f1c9566" Mar 13 15:20:34 crc kubenswrapper[4786]: I0313 15:20:34.443181 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.713071 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9"] Mar 13 15:20:37 crc kubenswrapper[4786]: E0313 15:20:37.714878 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="extract" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.715363 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="extract" Mar 13 15:20:37 crc kubenswrapper[4786]: E0313 15:20:37.715444 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="util" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.715535 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="util" Mar 13 15:20:37 crc kubenswrapper[4786]: E0313 15:20:37.715618 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="pull" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.715696 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="pull" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.715963 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6" containerName="extract" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.716568 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.720583 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-kxdfl" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.736445 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9"] Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.869106 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.869172 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:20:37 crc kubenswrapper[4786]: I0313 15:20:37.905846 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8cmw\" (UniqueName: \"kubernetes.io/projected/54a9d39e-7ebd-4924-8cfb-2704bd61e22e-kube-api-access-g8cmw\") pod \"openstack-operator-controller-init-6dc56d8cd6-n4mm9\" (UID: \"54a9d39e-7ebd-4924-8cfb-2704bd61e22e\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:38 crc kubenswrapper[4786]: I0313 15:20:38.006616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8cmw\" (UniqueName: 
\"kubernetes.io/projected/54a9d39e-7ebd-4924-8cfb-2704bd61e22e-kube-api-access-g8cmw\") pod \"openstack-operator-controller-init-6dc56d8cd6-n4mm9\" (UID: \"54a9d39e-7ebd-4924-8cfb-2704bd61e22e\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:38 crc kubenswrapper[4786]: I0313 15:20:38.025454 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8cmw\" (UniqueName: \"kubernetes.io/projected/54a9d39e-7ebd-4924-8cfb-2704bd61e22e-kube-api-access-g8cmw\") pod \"openstack-operator-controller-init-6dc56d8cd6-n4mm9\" (UID: \"54a9d39e-7ebd-4924-8cfb-2704bd61e22e\") " pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:38 crc kubenswrapper[4786]: I0313 15:20:38.032361 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:38 crc kubenswrapper[4786]: I0313 15:20:38.441744 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9"] Mar 13 15:20:38 crc kubenswrapper[4786]: I0313 15:20:38.482501 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" event={"ID":"54a9d39e-7ebd-4924-8cfb-2704bd61e22e","Type":"ContainerStarted","Data":"e2f5d2021e092f54b359d554beb69e5739c244901ad32f65b4aef34b33cec7f9"} Mar 13 15:20:43 crc kubenswrapper[4786]: I0313 15:20:43.538490 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" event={"ID":"54a9d39e-7ebd-4924-8cfb-2704bd61e22e","Type":"ContainerStarted","Data":"4885632b84559dbe7058a030f656e93724ac61a8664647ef447139178f1efeff"} Mar 13 15:20:43 crc kubenswrapper[4786]: I0313 15:20:43.539181 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:43 crc kubenswrapper[4786]: I0313 15:20:43.579882 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" podStartSLOduration=1.846402925 podStartE2EDuration="6.579846559s" podCreationTimestamp="2026-03-13 15:20:37 +0000 UTC" firstStartedPulling="2026-03-13 15:20:38.444913957 +0000 UTC m=+1068.608125768" lastFinishedPulling="2026-03-13 15:20:43.178357591 +0000 UTC m=+1073.341569402" observedRunningTime="2026-03-13 15:20:43.576268148 +0000 UTC m=+1073.739479979" watchObservedRunningTime="2026-03-13 15:20:43.579846559 +0000 UTC m=+1073.743058370" Mar 13 15:20:48 crc kubenswrapper[4786]: I0313 15:20:48.035326 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6dc56d8cd6-n4mm9" Mar 13 15:20:52 crc kubenswrapper[4786]: I0313 15:20:52.244625 4786 scope.go:117] "RemoveContainer" containerID="5b8883b5f73859d99542e8bc8deaecc6b973a654f66a87d6256d4bb174fdcd51" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.824371 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.825731 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.827921 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4rz7z" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.828789 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.829525 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.830814 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2rds8" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.840510 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.850192 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.850917 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.855070 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xj7jq" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.858222 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.862991 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45xnp\" (UniqueName: \"kubernetes.io/projected/a176571a-293b-4ff6-8928-1cc5f3b28c44-kube-api-access-45xnp\") pod \"cinder-operator-controller-manager-984cd4dcf-xpqf7\" (UID: \"a176571a-293b-4ff6-8928-1cc5f3b28c44\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.863348 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rn8n\" (UniqueName: \"kubernetes.io/projected/e60c275e-371c-48d7-8816-56ae26f8e911-kube-api-access-5rn8n\") pod \"barbican-operator-controller-manager-d47688694-wgpvr\" (UID: \"e60c275e-371c-48d7-8816-56ae26f8e911\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.863575 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4gps\" (UniqueName: \"kubernetes.io/projected/bad187db-13d7-4bf9-9b5f-9ce08a17b9c7-kube-api-access-v4gps\") pod \"designate-operator-controller-manager-66d56f6ff4-zq9jr\" (UID: \"bad187db-13d7-4bf9-9b5f-9ce08a17b9c7\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 
15:21:07.863736 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.868826 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.869118 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.900825 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.901587 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.904518 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tczl6" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.909098 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.917699 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.918541 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.921970 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-26lhr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.934733 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.944059 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.945232 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.949291 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gw5dp" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.954239 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.955103 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.960409 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.960604 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk"] Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.960643 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-npc2r" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.964881 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45xnp\" (UniqueName: \"kubernetes.io/projected/a176571a-293b-4ff6-8928-1cc5f3b28c44-kube-api-access-45xnp\") pod \"cinder-operator-controller-manager-984cd4dcf-xpqf7\" (UID: \"a176571a-293b-4ff6-8928-1cc5f3b28c44\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.964977 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsv9f\" (UniqueName: 
\"kubernetes.io/projected/d9425c10-8f83-4b9f-81a8-0502889571a0-kube-api-access-gsv9f\") pod \"glance-operator-controller-manager-5964f64c48-gr9w7\" (UID: \"d9425c10-8f83-4b9f-81a8-0502889571a0\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.965042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rn8n\" (UniqueName: \"kubernetes.io/projected/e60c275e-371c-48d7-8816-56ae26f8e911-kube-api-access-5rn8n\") pod \"barbican-operator-controller-manager-d47688694-wgpvr\" (UID: \"e60c275e-371c-48d7-8816-56ae26f8e911\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.965095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4gps\" (UniqueName: \"kubernetes.io/projected/bad187db-13d7-4bf9-9b5f-9ce08a17b9c7-kube-api-access-v4gps\") pod \"designate-operator-controller-manager-66d56f6ff4-zq9jr\" (UID: \"bad187db-13d7-4bf9-9b5f-9ce08a17b9c7\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" Mar 13 15:21:07 crc kubenswrapper[4786]: I0313 15:21:07.968145 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.001501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rn8n\" (UniqueName: \"kubernetes.io/projected/e60c275e-371c-48d7-8816-56ae26f8e911-kube-api-access-5rn8n\") pod \"barbican-operator-controller-manager-d47688694-wgpvr\" (UID: \"e60c275e-371c-48d7-8816-56ae26f8e911\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.004403 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-45xnp\" (UniqueName: \"kubernetes.io/projected/a176571a-293b-4ff6-8928-1cc5f3b28c44-kube-api-access-45xnp\") pod \"cinder-operator-controller-manager-984cd4dcf-xpqf7\" (UID: \"a176571a-293b-4ff6-8928-1cc5f3b28c44\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.006995 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4gps\" (UniqueName: \"kubernetes.io/projected/bad187db-13d7-4bf9-9b5f-9ce08a17b9c7-kube-api-access-v4gps\") pod \"designate-operator-controller-manager-66d56f6ff4-zq9jr\" (UID: \"bad187db-13d7-4bf9-9b5f-9ce08a17b9c7\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.007413 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.009503 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.015045 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5wd9h" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.051934 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.052747 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.059902 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qpsjr" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.062704 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.069986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2zb\" (UniqueName: \"kubernetes.io/projected/b93a2ad9-e58e-4b33-8d34-8101b1fa2d38-kube-api-access-fj2zb\") pod \"keystone-operator-controller-manager-684f77d66d-wxjdd\" (UID: \"b93a2ad9-e58e-4b33-8d34-8101b1fa2d38\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.070033 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsv9f\" (UniqueName: \"kubernetes.io/projected/d9425c10-8f83-4b9f-81a8-0502889571a0-kube-api-access-gsv9f\") pod \"glance-operator-controller-manager-5964f64c48-gr9w7\" (UID: \"d9425c10-8f83-4b9f-81a8-0502889571a0\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.070055 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bftvc\" (UniqueName: \"kubernetes.io/projected/1a700a4d-b7ed-4ea7-9382-b3994ba3646e-kube-api-access-bftvc\") pod \"ironic-operator-controller-manager-5bc894d9b-8bd8f\" (UID: \"1a700a4d-b7ed-4ea7-9382-b3994ba3646e\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.070079 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.070117 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsxzp\" (UniqueName: \"kubernetes.io/projected/c4687472-a411-424b-bcc2-d39f84de6a17-kube-api-access-bsxzp\") pod \"horizon-operator-controller-manager-6d9d6b584d-mpfq5\" (UID: \"c4687472-a411-424b-bcc2-d39f84de6a17\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.070153 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qm5\" (UniqueName: \"kubernetes.io/projected/7826406e-4038-4851-a54e-bf72ff94287f-kube-api-access-w8qm5\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.070170 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc4z9\" (UniqueName: \"kubernetes.io/projected/721d9249-da86-4bba-93bc-4f037cd3344d-kube-api-access-qc4z9\") pod \"heat-operator-controller-manager-77b6666d85-gtwlk\" (UID: \"721d9249-da86-4bba-93bc-4f037cd3344d\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.072496 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd"] 
Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.077170 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.078069 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.082184 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-pqqn4" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.083048 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.089486 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.090268 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.094792 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-g2jll" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.095016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsv9f\" (UniqueName: \"kubernetes.io/projected/d9425c10-8f83-4b9f-81a8-0502889571a0-kube-api-access-gsv9f\") pod \"glance-operator-controller-manager-5964f64c48-gr9w7\" (UID: \"d9425c10-8f83-4b9f-81a8-0502889571a0\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.117428 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.118246 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.128816 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qqmhd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.150057 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.152658 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.156869 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4bbgb" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.162319 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171186 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsxzp\" (UniqueName: \"kubernetes.io/projected/c4687472-a411-424b-bcc2-d39f84de6a17-kube-api-access-bsxzp\") pod \"horizon-operator-controller-manager-6d9d6b584d-mpfq5\" (UID: \"c4687472-a411-424b-bcc2-d39f84de6a17\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qzc\" (UniqueName: \"kubernetes.io/projected/37c474a3-0434-4911-adf5-02d915b23d57-kube-api-access-m6qzc\") pod \"neutron-operator-controller-manager-776c5696bf-skfmd\" (UID: \"37c474a3-0434-4911-adf5-02d915b23d57\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171250 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw265\" (UniqueName: \"kubernetes.io/projected/e2450a85-9b9c-49c6-8191-df5a87807e4f-kube-api-access-mw265\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-65hkw\" (UID: \"e2450a85-9b9c-49c6-8191-df5a87807e4f\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171276 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmtb2\" (UniqueName: \"kubernetes.io/projected/e32e285a-970c-49ac-8531-d0d87b217b08-kube-api-access-zmtb2\") pod \"nova-operator-controller-manager-7f84474648-dzj6t\" (UID: \"e32e285a-970c-49ac-8531-d0d87b217b08\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171298 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qm5\" (UniqueName: \"kubernetes.io/projected/7826406e-4038-4851-a54e-bf72ff94287f-kube-api-access-w8qm5\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171323 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc4z9\" (UniqueName: \"kubernetes.io/projected/721d9249-da86-4bba-93bc-4f037cd3344d-kube-api-access-qc4z9\") pod \"heat-operator-controller-manager-77b6666d85-gtwlk\" (UID: \"721d9249-da86-4bba-93bc-4f037cd3344d\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj2zb\" (UniqueName: \"kubernetes.io/projected/b93a2ad9-e58e-4b33-8d34-8101b1fa2d38-kube-api-access-fj2zb\") pod \"keystone-operator-controller-manager-684f77d66d-wxjdd\" (UID: \"b93a2ad9-e58e-4b33-8d34-8101b1fa2d38\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171377 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bftvc\" (UniqueName: 
\"kubernetes.io/projected/1a700a4d-b7ed-4ea7-9382-b3994ba3646e-kube-api-access-bftvc\") pod \"ironic-operator-controller-manager-5bc894d9b-8bd8f\" (UID: \"1a700a4d-b7ed-4ea7-9382-b3994ba3646e\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.171421 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zn7h\" (UniqueName: \"kubernetes.io/projected/8e63300a-b2d6-438f-9b17-989c103b9975-kube-api-access-5zn7h\") pod \"manila-operator-controller-manager-57b484b4df-hxcnt\" (UID: \"8e63300a-b2d6-438f-9b17-989c103b9975\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.171841 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.171900 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert podName:7826406e-4038-4851-a54e-bf72ff94287f nodeName:}" failed. No retries permitted until 2026-03-13 15:21:08.671885674 +0000 UTC m=+1098.835097485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert") pod "infra-operator-controller-manager-54dc5b8f8d-98tsp" (UID: "7826406e-4038-4851-a54e-bf72ff94287f") : secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.174477 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.186193 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.186389 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc4z9\" (UniqueName: \"kubernetes.io/projected/721d9249-da86-4bba-93bc-4f037cd3344d-kube-api-access-qc4z9\") pod \"heat-operator-controller-manager-77b6666d85-gtwlk\" (UID: \"721d9249-da86-4bba-93bc-4f037cd3344d\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.186840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsxzp\" (UniqueName: \"kubernetes.io/projected/c4687472-a411-424b-bcc2-d39f84de6a17-kube-api-access-bsxzp\") pod \"horizon-operator-controller-manager-6d9d6b584d-mpfq5\" (UID: \"c4687472-a411-424b-bcc2-d39f84de6a17\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.191523 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qm5\" (UniqueName: \"kubernetes.io/projected/7826406e-4038-4851-a54e-bf72ff94287f-kube-api-access-w8qm5\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " 
pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.198197 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.206399 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bftvc\" (UniqueName: \"kubernetes.io/projected/1a700a4d-b7ed-4ea7-9382-b3994ba3646e-kube-api-access-bftvc\") pod \"ironic-operator-controller-manager-5bc894d9b-8bd8f\" (UID: \"1a700a4d-b7ed-4ea7-9382-b3994ba3646e\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.207023 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2zb\" (UniqueName: \"kubernetes.io/projected/b93a2ad9-e58e-4b33-8d34-8101b1fa2d38-kube-api-access-fj2zb\") pod \"keystone-operator-controller-manager-684f77d66d-wxjdd\" (UID: \"b93a2ad9-e58e-4b33-8d34-8101b1fa2d38\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.217710 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.228919 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.229711 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.233203 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-pqqf5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.234486 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.245000 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.250524 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.266163 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.271174 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.271979 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zn7h\" (UniqueName: \"kubernetes.io/projected/8e63300a-b2d6-438f-9b17-989c103b9975-kube-api-access-5zn7h\") pod \"manila-operator-controller-manager-57b484b4df-hxcnt\" (UID: \"8e63300a-b2d6-438f-9b17-989c103b9975\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.272021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qzc\" (UniqueName: \"kubernetes.io/projected/37c474a3-0434-4911-adf5-02d915b23d57-kube-api-access-m6qzc\") pod \"neutron-operator-controller-manager-776c5696bf-skfmd\" (UID: \"37c474a3-0434-4911-adf5-02d915b23d57\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.272044 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw265\" (UniqueName: \"kubernetes.io/projected/e2450a85-9b9c-49c6-8191-df5a87807e4f-kube-api-access-mw265\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-65hkw\" (UID: \"e2450a85-9b9c-49c6-8191-df5a87807e4f\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.272068 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmtb2\" (UniqueName: \"kubernetes.io/projected/e32e285a-970c-49ac-8531-d0d87b217b08-kube-api-access-zmtb2\") pod \"nova-operator-controller-manager-7f84474648-dzj6t\" (UID: \"e32e285a-970c-49ac-8531-d0d87b217b08\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.284400 
4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.285325 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.288445 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.288971 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bgmcm" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.295486 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qzc\" (UniqueName: \"kubernetes.io/projected/37c474a3-0434-4911-adf5-02d915b23d57-kube-api-access-m6qzc\") pod \"neutron-operator-controller-manager-776c5696bf-skfmd\" (UID: \"37c474a3-0434-4911-adf5-02d915b23d57\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.300719 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.301566 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.308636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zn7h\" (UniqueName: \"kubernetes.io/projected/8e63300a-b2d6-438f-9b17-989c103b9975-kube-api-access-5zn7h\") pod \"manila-operator-controller-manager-57b484b4df-hxcnt\" (UID: \"8e63300a-b2d6-438f-9b17-989c103b9975\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.318337 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmtb2\" (UniqueName: \"kubernetes.io/projected/e32e285a-970c-49ac-8531-d0d87b217b08-kube-api-access-zmtb2\") pod \"nova-operator-controller-manager-7f84474648-dzj6t\" (UID: \"e32e285a-970c-49ac-8531-d0d87b217b08\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.318933 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lw29x" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.329411 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw265\" (UniqueName: \"kubernetes.io/projected/e2450a85-9b9c-49c6-8191-df5a87807e4f-kube-api-access-mw265\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-65hkw\" (UID: \"e2450a85-9b9c-49c6-8191-df5a87807e4f\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.339783 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.371235 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.373176 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfg6d\" (UniqueName: \"kubernetes.io/projected/d5cc42a1-e4ee-4f94-8edf-4c8a46a632db-kube-api-access-nfg6d\") pod \"octavia-operator-controller-manager-5f4f55cb5c-wc9h2\" (UID: \"d5cc42a1-e4ee-4f94-8edf-4c8a46a632db\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.401246 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.401535 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.402316 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.405336 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-r5jqb" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.417552 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.418574 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.423490 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hmg8m" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.430711 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.437241 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.438523 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.448269 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.460275 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.472737 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.473123 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.475090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfg6d\" (UniqueName: \"kubernetes.io/projected/d5cc42a1-e4ee-4f94-8edf-4c8a46a632db-kube-api-access-nfg6d\") pod \"octavia-operator-controller-manager-5f4f55cb5c-wc9h2\" (UID: \"d5cc42a1-e4ee-4f94-8edf-4c8a46a632db\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.475132 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.475515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thpn\" (UniqueName: \"kubernetes.io/projected/365f42f0-aed6-4131-b8d7-c01b9a8418d1-kube-api-access-7thpn\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.475778 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h29tm\" (UniqueName: \"kubernetes.io/projected/9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f-kube-api-access-h29tm\") pod \"ovn-operator-controller-manager-bbc5b68f9-rx48d\" (UID: \"9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f\") " 
pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.495957 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.497061 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.499765 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.505311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfg6d\" (UniqueName: \"kubernetes.io/projected/d5cc42a1-e4ee-4f94-8edf-4c8a46a632db-kube-api-access-nfg6d\") pod \"octavia-operator-controller-manager-5f4f55cb5c-wc9h2\" (UID: \"d5cc42a1-e4ee-4f94-8edf-4c8a46a632db\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.506382 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ljh2p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.519072 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.520143 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.524609 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-dskjb" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.528329 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.578965 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jw6f\" (UniqueName: \"kubernetes.io/projected/82e7030d-fdee-4336-84f0-8a89605e1424-kube-api-access-7jw6f\") pod \"placement-operator-controller-manager-574d45c66c-spsz5\" (UID: \"82e7030d-fdee-4336-84f0-8a89605e1424\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.579045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h29tm\" (UniqueName: \"kubernetes.io/projected/9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f-kube-api-access-h29tm\") pod \"ovn-operator-controller-manager-bbc5b68f9-rx48d\" (UID: \"9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.579090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.579118 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7thpn\" (UniqueName: \"kubernetes.io/projected/365f42f0-aed6-4131-b8d7-c01b9a8418d1-kube-api-access-7thpn\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.579144 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2wlf\" (UniqueName: \"kubernetes.io/projected/a8bee420-59e0-4eb5-a83a-1a518345ca42-kube-api-access-r2wlf\") pod \"swift-operator-controller-manager-7f9cc5dd44-s5wjg\" (UID: \"a8bee420-59e0-4eb5-a83a-1a518345ca42\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.579591 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.580679 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert podName:365f42f0-aed6-4131-b8d7-c01b9a8418d1 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:09.080657566 +0000 UTC m=+1099.243869377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" (UID: "365f42f0-aed6-4131-b8d7-c01b9a8418d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.593355 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.607034 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.607294 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thpn\" (UniqueName: \"kubernetes.io/projected/365f42f0-aed6-4131-b8d7-c01b9a8418d1-kube-api-access-7thpn\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.608387 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.610285 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-d5ng4" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.618588 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h29tm\" (UniqueName: \"kubernetes.io/projected/9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f-kube-api-access-h29tm\") pod \"ovn-operator-controller-manager-bbc5b68f9-rx48d\" (UID: \"9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.623283 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.665933 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.667352 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.675505 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.675637 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.675767 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-c72jd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.684073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jw6f\" (UniqueName: \"kubernetes.io/projected/82e7030d-fdee-4336-84f0-8a89605e1424-kube-api-access-7jw6f\") pod \"placement-operator-controller-manager-574d45c66c-spsz5\" (UID: \"82e7030d-fdee-4336-84f0-8a89605e1424\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.684248 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bml6c\" (UniqueName: \"kubernetes.io/projected/f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53-kube-api-access-bml6c\") pod \"telemetry-operator-controller-manager-6854b8b9d9-pnzg8\" (UID: \"f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.684317 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2wlf\" (UniqueName: \"kubernetes.io/projected/a8bee420-59e0-4eb5-a83a-1a518345ca42-kube-api-access-r2wlf\") pod \"swift-operator-controller-manager-7f9cc5dd44-s5wjg\" (UID: \"a8bee420-59e0-4eb5-a83a-1a518345ca42\") " 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.684344 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmndh\" (UniqueName: \"kubernetes.io/projected/9e012f2b-6b59-473d-8273-cb64d4957ad7-kube-api-access-fmndh\") pod \"test-operator-controller-manager-5c5cb9c4d7-zg7jh\" (UID: \"9e012f2b-6b59-473d-8273-cb64d4957ad7\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.684416 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.684705 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.684793 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert podName:7826406e-4038-4851-a54e-bf72ff94287f nodeName:}" failed. No retries permitted until 2026-03-13 15:21:09.68477122 +0000 UTC m=+1099.847983031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert") pod "infra-operator-controller-manager-54dc5b8f8d-98tsp" (UID: "7826406e-4038-4851-a54e-bf72ff94287f") : secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.692528 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.704194 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.712277 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.713360 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.715303 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-d4tjp" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.718207 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.721465 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jw6f\" (UniqueName: \"kubernetes.io/projected/82e7030d-fdee-4336-84f0-8a89605e1424-kube-api-access-7jw6f\") pod \"placement-operator-controller-manager-574d45c66c-spsz5\" (UID: \"82e7030d-fdee-4336-84f0-8a89605e1424\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.726123 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2wlf\" (UniqueName: \"kubernetes.io/projected/a8bee420-59e0-4eb5-a83a-1a518345ca42-kube-api-access-r2wlf\") pod \"swift-operator-controller-manager-7f9cc5dd44-s5wjg\" (UID: \"a8bee420-59e0-4eb5-a83a-1a518345ca42\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.732927 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.750305 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.786490 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.786571 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bml6c\" (UniqueName: \"kubernetes.io/projected/f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53-kube-api-access-bml6c\") pod \"telemetry-operator-controller-manager-6854b8b9d9-pnzg8\" (UID: \"f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.786591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpr5l\" (UniqueName: \"kubernetes.io/projected/66bf8109-666c-469d-b33c-ba5152cde7d9-kube-api-access-hpr5l\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.786618 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmndh\" (UniqueName: \"kubernetes.io/projected/9e012f2b-6b59-473d-8273-cb64d4957ad7-kube-api-access-fmndh\") pod \"test-operator-controller-manager-5c5cb9c4d7-zg7jh\" (UID: \"9e012f2b-6b59-473d-8273-cb64d4957ad7\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" Mar 13 15:21:08 crc 
kubenswrapper[4786]: I0313 15:21:08.786642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv5jz\" (UniqueName: \"kubernetes.io/projected/b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f-kube-api-access-xv5jz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-nfvdd\" (UID: \"b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.786664 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.806971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bml6c\" (UniqueName: \"kubernetes.io/projected/f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53-kube-api-access-bml6c\") pod \"telemetry-operator-controller-manager-6854b8b9d9-pnzg8\" (UID: \"f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.809180 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmndh\" (UniqueName: \"kubernetes.io/projected/9e012f2b-6b59-473d-8273-cb64d4957ad7-kube-api-access-fmndh\") pod \"test-operator-controller-manager-5c5cb9c4d7-zg7jh\" (UID: \"9e012f2b-6b59-473d-8273-cb64d4957ad7\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.831088 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.844211 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.855940 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.887813 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.887922 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.887985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpr5l\" (UniqueName: \"kubernetes.io/projected/66bf8109-666c-469d-b33c-ba5152cde7d9-kube-api-access-hpr5l\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.888008 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc28q\" (UniqueName: \"kubernetes.io/projected/5dcee199-4a59-4394-a565-2ce8e15e787c-kube-api-access-lc28q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sfxhr\" (UID: \"5dcee199-4a59-4394-a565-2ce8e15e787c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.888034 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.888126 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:09.388105575 +0000 UTC m=+1099.551317386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "webhook-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.888277 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: E0313 15:21:08.888325 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:09.38830898 +0000 UTC m=+1099.551520791 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "metrics-server-cert" not found Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.888057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5jz\" (UniqueName: \"kubernetes.io/projected/b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f-kube-api-access-xv5jz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-nfvdd\" (UID: \"b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.921646 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpr5l\" (UniqueName: \"kubernetes.io/projected/66bf8109-666c-469d-b33c-ba5152cde7d9-kube-api-access-hpr5l\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.924875 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv5jz\" (UniqueName: \"kubernetes.io/projected/b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f-kube-api-access-xv5jz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-nfvdd\" (UID: \"b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.941069 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.989171 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7"] Mar 13 15:21:08 crc kubenswrapper[4786]: I0313 15:21:08.989235 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc28q\" (UniqueName: \"kubernetes.io/projected/5dcee199-4a59-4394-a565-2ce8e15e787c-kube-api-access-lc28q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sfxhr\" (UID: \"5dcee199-4a59-4394-a565-2ce8e15e787c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.064032 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc28q\" (UniqueName: \"kubernetes.io/projected/5dcee199-4a59-4394-a565-2ce8e15e787c-kube-api-access-lc28q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sfxhr\" (UID: \"5dcee199-4a59-4394-a565-2ce8e15e787c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.074460 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.092851 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.093065 4786 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.093115 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert podName:365f42f0-aed6-4131-b8d7-c01b9a8418d1 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:10.093100911 +0000 UTC m=+1100.256312712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" (UID: "365f42f0-aed6-4131-b8d7-c01b9a8418d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.093667 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk"] Mar 13 15:21:09 crc kubenswrapper[4786]: W0313 15:21:09.118969 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9425c10_8f83_4b9f_81a8_0502889571a0.slice/crio-26a79b73d9c342bfbad772a4a3101d9acaa65d07b61b5dbfc9369e0aafc08dc8 WatchSource:0}: Error finding container 26a79b73d9c342bfbad772a4a3101d9acaa65d07b61b5dbfc9369e0aafc08dc8: Status 404 returned error can't find the container with id 26a79b73d9c342bfbad772a4a3101d9acaa65d07b61b5dbfc9369e0aafc08dc8 Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.223563 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.348946 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 
15:21:09.356509 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.358271 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.365549 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.372995 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.377703 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.399111 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.399194 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.399312 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: 
secret "webhook-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.399357 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:10.399341329 +0000 UTC m=+1100.562553140 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "webhook-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.399396 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.399415 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:10.39940846 +0000 UTC m=+1100.562620271 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "metrics-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.519976 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.532093 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.547879 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd"] Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.553663 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:af6065309235d342f15ba68d4bec51117e3a21cc630b5b72ba04aca2ce0d3703,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bftvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5bc894d9b-8bd8f_openstack-operators(1a700a4d-b7ed-4ea7-9382-b3994ba3646e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.554820 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" podUID="1a700a4d-b7ed-4ea7-9382-b3994ba3646e" Mar 13 15:21:09 crc 
kubenswrapper[4786]: I0313 15:21:09.559702 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.663006 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg"] Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.677341 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h29tm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-rx48d_openstack-operators(9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.677701 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xv5jz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-nfvdd_openstack-operators(b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.678935 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" podUID="b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f" Mar 13 15:21:09 crc 
kubenswrapper[4786]: E0313 15:21:09.678977 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" podUID="9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.680658 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d"] Mar 13 15:21:09 crc kubenswrapper[4786]: W0313 15:21:09.681611 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e012f2b_6b59_473d_8273_cb64d4957ad7.slice/crio-2032dfade6751ec6fe53a73227d6e795a95cec7d85123310be90e28c6c02d43a WatchSource:0}: Error finding container 2032dfade6751ec6fe53a73227d6e795a95cec7d85123310be90e28c6c02d43a: Status 404 returned error can't find the container with id 2032dfade6751ec6fe53a73227d6e795a95cec7d85123310be90e28c6c02d43a Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.683746 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmndh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-zg7jh_openstack-operators(9e012f2b-6b59-473d-8273-cb64d4957ad7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.684848 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" podUID="9e012f2b-6b59-473d-8273-cb64d4957ad7" Mar 13 15:21:09 crc 
kubenswrapper[4786]: I0313 15:21:09.688618 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd"] Mar 13 15:21:09 crc kubenswrapper[4786]: W0313 15:21:09.689503 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8bee420_59e0_4eb5_a83a_1a518345ca42.slice/crio-6527e5718d10030274ff69d74ede414f89bc8fa6c556ba86dce2c9dd82bfcbee WatchSource:0}: Error finding container 6527e5718d10030274ff69d74ede414f89bc8fa6c556ba86dce2c9dd82bfcbee: Status 404 returned error can't find the container with id 6527e5718d10030274ff69d74ede414f89bc8fa6c556ba86dce2c9dd82bfcbee Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.704771 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.705052 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.705119 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert podName:7826406e-4038-4851-a54e-bf72ff94287f nodeName:}" failed. No retries permitted until 2026-03-13 15:21:11.705100874 +0000 UTC m=+1101.868312685 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert") pod "infra-operator-controller-manager-54dc5b8f8d-98tsp" (UID: "7826406e-4038-4851-a54e-bf72ff94287f") : secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.705434 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh"] Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.706822 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r2wlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-s5wjg_openstack-operators(a8bee420-59e0-4eb5-a83a-1a518345ca42): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.706849 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nfg6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-wc9h2_openstack-operators(d5cc42a1-e4ee-4f94-8edf-4c8a46a632db): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.709710 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" podUID="d5cc42a1-e4ee-4f94-8edf-4c8a46a632db" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.709782 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" podUID="a8bee420-59e0-4eb5-a83a-1a518345ca42" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.713944 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2"] Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.733941 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" event={"ID":"e32e285a-970c-49ac-8531-d0d87b217b08","Type":"ContainerStarted","Data":"20625dd379ae1296c18ff534ac4a7e6943b6ae83450b0096a418729d130e8bf0"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.735029 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" event={"ID":"c4687472-a411-424b-bcc2-d39f84de6a17","Type":"ContainerStarted","Data":"e8702edca8c5e3d6b2feb51f3f5d38cf5882881cd574bd38039b4f79221568f8"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.737308 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" event={"ID":"37c474a3-0434-4911-adf5-02d915b23d57","Type":"ContainerStarted","Data":"64e182df9a19fc6171de56eb03ef5b9dd5d33ce792e5ffb04ccc74301bb01566"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.738150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" event={"ID":"e2450a85-9b9c-49c6-8191-df5a87807e4f","Type":"ContainerStarted","Data":"5a3f121e339930a4c125103967adb6142f2e9ba1b9c959b82cc663784d9bb99d"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.741701 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" event={"ID":"82e7030d-fdee-4336-84f0-8a89605e1424","Type":"ContainerStarted","Data":"4c00bb4c5277aadb9922f9a221f6f24604fb3565cea314518e76b0582ff18f84"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.742670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" event={"ID":"b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f","Type":"ContainerStarted","Data":"8da27f3774335656d8d4014f04e597bd79e7432b7058a308207f52626ea0d440"} Mar 13 15:21:09 crc 
kubenswrapper[4786]: I0313 15:21:09.744297 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" event={"ID":"a8bee420-59e0-4eb5-a83a-1a518345ca42","Type":"ContainerStarted","Data":"6527e5718d10030274ff69d74ede414f89bc8fa6c556ba86dce2c9dd82bfcbee"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.745368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" event={"ID":"9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f","Type":"ContainerStarted","Data":"45b5bec983b7dfa113fe45b6af5c607b1e5467fc283a0100e4957b9fb894713a"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.747331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" event={"ID":"d5cc42a1-e4ee-4f94-8edf-4c8a46a632db","Type":"ContainerStarted","Data":"3f092e2b4c58781427b5c29d0b5c6da974f2095cbbd67f4b65d5a2642357179a"} Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.748468 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" podUID="b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.748700 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" podUID="a8bee420-59e0-4eb5-a83a-1a518345ca42" Mar 13 15:21:09 crc 
kubenswrapper[4786]: E0313 15:21:09.749023 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" podUID="d5cc42a1-e4ee-4f94-8edf-4c8a46a632db" Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.749080 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" podUID="9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.749255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" event={"ID":"9e012f2b-6b59-473d-8273-cb64d4957ad7","Type":"ContainerStarted","Data":"2032dfade6751ec6fe53a73227d6e795a95cec7d85123310be90e28c6c02d43a"} Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.750974 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" podUID="9e012f2b-6b59-473d-8273-cb64d4957ad7" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.751670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" 
event={"ID":"a176571a-293b-4ff6-8928-1cc5f3b28c44","Type":"ContainerStarted","Data":"475f64fec6f588ffe95b743d9bf4e25d72d33ea21db3b06a9593abc265e0da56"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.752593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" event={"ID":"1a700a4d-b7ed-4ea7-9382-b3994ba3646e","Type":"ContainerStarted","Data":"1102afcc73d520a5406997116c907f9488ef50a185d5a5363d3e63589d824e14"} Mar 13 15:21:09 crc kubenswrapper[4786]: E0313 15:21:09.753659 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:af6065309235d342f15ba68d4bec51117e3a21cc630b5b72ba04aca2ce0d3703\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" podUID="1a700a4d-b7ed-4ea7-9382-b3994ba3646e" Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.754478 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" event={"ID":"8e63300a-b2d6-438f-9b17-989c103b9975","Type":"ContainerStarted","Data":"fb98898bd2086b702f66cb14c73103792ca8ed3a5d74b3a5dd266d4f2ca7a49d"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.755240 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" event={"ID":"d9425c10-8f83-4b9f-81a8-0502889571a0","Type":"ContainerStarted","Data":"26a79b73d9c342bfbad772a4a3101d9acaa65d07b61b5dbfc9369e0aafc08dc8"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.756033 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" 
event={"ID":"721d9249-da86-4bba-93bc-4f037cd3344d","Type":"ContainerStarted","Data":"2f1b927107d29cdeafbb6eb1a679f1684786a90c495ee7abafd9197b1ce17451"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.757070 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" event={"ID":"b93a2ad9-e58e-4b33-8d34-8101b1fa2d38","Type":"ContainerStarted","Data":"4f0f42fee9b551a35e75a09c70a019848e45a16c8ef903540ecc99cc7a2bf35b"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.758396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" event={"ID":"e60c275e-371c-48d7-8816-56ae26f8e911","Type":"ContainerStarted","Data":"aa7827379c7bae6663e8cd46c637a33315aa3755ebae16d35b6dc19c47b7b8ac"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.759744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" event={"ID":"bad187db-13d7-4bf9-9b5f-9ce08a17b9c7","Type":"ContainerStarted","Data":"615de5a00953fcf24e0cf5831292449837567b4bf9bcbfd611b5f1abd0b6d688"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.761427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" event={"ID":"f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53","Type":"ContainerStarted","Data":"7e9fa94f54b78d8425ffd59d22c61a1689d9ee8a4b563a2c832e58a461c8976e"} Mar 13 15:21:09 crc kubenswrapper[4786]: I0313 15:21:09.836644 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr"] Mar 13 15:21:10 crc kubenswrapper[4786]: I0313 15:21:10.120156 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.120495 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.120577 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert podName:365f42f0-aed6-4131-b8d7-c01b9a8418d1 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:12.120555014 +0000 UTC m=+1102.283766885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" (UID: "365f42f0-aed6-4131-b8d7-c01b9a8418d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:10 crc kubenswrapper[4786]: I0313 15:21:10.428920 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:10 crc kubenswrapper[4786]: I0313 15:21:10.429053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " 
pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.429108 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.429180 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:12.429161852 +0000 UTC m=+1102.592373663 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "webhook-server-cert" not found Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.429189 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.429247 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:12.429228784 +0000 UTC m=+1102.592440675 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "metrics-server-cert" not found Mar 13 15:21:10 crc kubenswrapper[4786]: I0313 15:21:10.796591 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" event={"ID":"5dcee199-4a59-4394-a565-2ce8e15e787c","Type":"ContainerStarted","Data":"738faa004b8e88a8e87284cc614d7e7d0b66d160e42dfab54b8ab3ebdbdf3e1b"} Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.798939 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" podUID="a8bee420-59e0-4eb5-a83a-1a518345ca42" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.800799 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" podUID="9e012f2b-6b59-473d-8273-cb64d4957ad7" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.800949 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" podUID="d5cc42a1-e4ee-4f94-8edf-4c8a46a632db" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.800991 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" podUID="9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.801042 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" podUID="b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f" Mar 13 15:21:10 crc kubenswrapper[4786]: E0313 15:21:10.801078 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:af6065309235d342f15ba68d4bec51117e3a21cc630b5b72ba04aca2ce0d3703\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" podUID="1a700a4d-b7ed-4ea7-9382-b3994ba3646e" Mar 13 15:21:11 crc kubenswrapper[4786]: I0313 15:21:11.779999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:11 crc kubenswrapper[4786]: 
E0313 15:21:11.780169 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:11 crc kubenswrapper[4786]: E0313 15:21:11.780240 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert podName:7826406e-4038-4851-a54e-bf72ff94287f nodeName:}" failed. No retries permitted until 2026-03-13 15:21:15.780221732 +0000 UTC m=+1105.943433543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert") pod "infra-operator-controller-manager-54dc5b8f8d-98tsp" (UID: "7826406e-4038-4851-a54e-bf72ff94287f") : secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:12 crc kubenswrapper[4786]: I0313 15:21:12.185406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:12 crc kubenswrapper[4786]: E0313 15:21:12.185674 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:12 crc kubenswrapper[4786]: E0313 15:21:12.185776 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert podName:365f42f0-aed6-4131-b8d7-c01b9a8418d1 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:16.18575174 +0000 UTC m=+1106.348963591 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" (UID: "365f42f0-aed6-4131-b8d7-c01b9a8418d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:12 crc kubenswrapper[4786]: I0313 15:21:12.488381 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:12 crc kubenswrapper[4786]: I0313 15:21:12.488474 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:12 crc kubenswrapper[4786]: E0313 15:21:12.488533 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 15:21:12 crc kubenswrapper[4786]: E0313 15:21:12.488605 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:16.488586964 +0000 UTC m=+1106.651798775 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "webhook-server-cert" not found Mar 13 15:21:12 crc kubenswrapper[4786]: E0313 15:21:12.488686 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 15:21:12 crc kubenswrapper[4786]: E0313 15:21:12.488739 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:16.488724967 +0000 UTC m=+1106.651936768 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "metrics-server-cert" not found Mar 13 15:21:15 crc kubenswrapper[4786]: I0313 15:21:15.835791 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:15 crc kubenswrapper[4786]: E0313 15:21:15.836004 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:15 crc kubenswrapper[4786]: E0313 15:21:15.836418 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert 
podName:7826406e-4038-4851-a54e-bf72ff94287f nodeName:}" failed. No retries permitted until 2026-03-13 15:21:23.836402973 +0000 UTC m=+1113.999614784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert") pod "infra-operator-controller-manager-54dc5b8f8d-98tsp" (UID: "7826406e-4038-4851-a54e-bf72ff94287f") : secret "infra-operator-webhook-server-cert" not found Mar 13 15:21:16 crc kubenswrapper[4786]: I0313 15:21:16.241442 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:16 crc kubenswrapper[4786]: E0313 15:21:16.241628 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:16 crc kubenswrapper[4786]: E0313 15:21:16.241680 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert podName:365f42f0-aed6-4131-b8d7-c01b9a8418d1 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:24.241663075 +0000 UTC m=+1114.404874886 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" (UID: "365f42f0-aed6-4131-b8d7-c01b9a8418d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:16 crc kubenswrapper[4786]: I0313 15:21:16.545408 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:16 crc kubenswrapper[4786]: I0313 15:21:16.545553 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:16 crc kubenswrapper[4786]: E0313 15:21:16.545619 4786 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 15:21:16 crc kubenswrapper[4786]: E0313 15:21:16.545692 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 15:21:16 crc kubenswrapper[4786]: E0313 15:21:16.545741 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:24.545717278 +0000 UTC m=+1114.708929159 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "metrics-server-cert" not found Mar 13 15:21:16 crc kubenswrapper[4786]: E0313 15:21:16.545771 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:24.545753579 +0000 UTC m=+1114.708965490 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "webhook-server-cert" not found Mar 13 15:21:22 crc kubenswrapper[4786]: E0313 15:21:22.967661 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165" Mar 13 15:21:22 crc kubenswrapper[4786]: E0313 15:21:22.968374 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5rn8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-d47688694-wgpvr_openstack-operators(e60c275e-371c-48d7-8816-56ae26f8e911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:22 crc kubenswrapper[4786]: E0313 15:21:22.969529 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" podUID="e60c275e-371c-48d7-8816-56ae26f8e911" Mar 13 15:21:23 crc kubenswrapper[4786]: E0313 15:21:23.467327 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7" Mar 13 15:21:23 crc kubenswrapper[4786]: E0313 15:21:23.467664 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v4gps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66d56f6ff4-zq9jr_openstack-operators(bad187db-13d7-4bf9-9b5f-9ce08a17b9c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:23 crc kubenswrapper[4786]: E0313 15:21:23.468988 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" podUID="bad187db-13d7-4bf9-9b5f-9ce08a17b9c7" Mar 13 15:21:23 crc kubenswrapper[4786]: I0313 15:21:23.868881 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:23 crc kubenswrapper[4786]: I0313 15:21:23.879584 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7826406e-4038-4851-a54e-bf72ff94287f-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-98tsp\" (UID: \"7826406e-4038-4851-a54e-bf72ff94287f\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:23 crc kubenswrapper[4786]: E0313 15:21:23.887042 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" podUID="e60c275e-371c-48d7-8816-56ae26f8e911" Mar 13 15:21:23 crc kubenswrapper[4786]: E0313 15:21:23.887542 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" podUID="bad187db-13d7-4bf9-9b5f-9ce08a17b9c7" Mar 13 15:21:23 crc kubenswrapper[4786]: I0313 15:21:23.888015 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.042175 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.043183 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bml6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6854b8b9d9-pnzg8_openstack-operators(f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.044428 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" podUID="f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53" Mar 13 15:21:24 crc kubenswrapper[4786]: I0313 15:21:24.274116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.274277 4786 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.274411 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert podName:365f42f0-aed6-4131-b8d7-c01b9a8418d1 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:40.274371823 +0000 UTC m=+1130.437583634 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert") pod "openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" (UID: "365f42f0-aed6-4131-b8d7-c01b9a8418d1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 15:21:24 crc kubenswrapper[4786]: I0313 15:21:24.578380 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.578532 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 15:21:24 crc kubenswrapper[4786]: I0313 15:21:24.578866 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.578894 4786 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs podName:66bf8109-666c-469d-b33c-ba5152cde7d9 nodeName:}" failed. No retries permitted until 2026-03-13 15:21:40.578848457 +0000 UTC m=+1130.742060268 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs") pod "openstack-operator-controller-manager-6484b7b757-tsssq" (UID: "66bf8109-666c-469d-b33c-ba5152cde7d9") : secret "webhook-server-cert" not found Mar 13 15:21:24 crc kubenswrapper[4786]: I0313 15:21:24.598310 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-metrics-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" Mar 13 15:21:24 crc kubenswrapper[4786]: E0313 15:21:24.895169 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" podUID="f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53" Mar 13 15:21:25 crc kubenswrapper[4786]: E0313 15:21:25.662762 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40" Mar 13 15:21:25 crc kubenswrapper[4786]: E0313 15:21:25.662948 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5zn7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-57b484b4df-hxcnt_openstack-operators(8e63300a-b2d6-438f-9b17-989c103b9975): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:25 crc kubenswrapper[4786]: E0313 15:21:25.664175 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" podUID="8e63300a-b2d6-438f-9b17-989c103b9975" Mar 13 15:21:25 crc kubenswrapper[4786]: E0313 15:21:25.900502 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40\\\"\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" podUID="8e63300a-b2d6-438f-9b17-989c103b9975" Mar 13 15:21:26 crc kubenswrapper[4786]: E0313 15:21:26.266335 4786 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 15:21:26 crc kubenswrapper[4786]: E0313 15:21:26.266819 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fj2zb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-wxjdd_openstack-operators(b93a2ad9-e58e-4b33-8d34-8101b1fa2d38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:26 crc kubenswrapper[4786]: E0313 15:21:26.268026 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" podUID="b93a2ad9-e58e-4b33-8d34-8101b1fa2d38" Mar 13 15:21:26 crc kubenswrapper[4786]: E0313 15:21:26.912436 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" podUID="b93a2ad9-e58e-4b33-8d34-8101b1fa2d38" Mar 13 15:21:28 crc kubenswrapper[4786]: E0313 15:21:28.323308 4786 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 13 15:21:28 crc kubenswrapper[4786]: E0313 15:21:28.323526 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zmtb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-dzj6t_openstack-operators(e32e285a-970c-49ac-8531-d0d87b217b08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:28 crc kubenswrapper[4786]: E0313 15:21:28.324718 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" podUID="e32e285a-970c-49ac-8531-d0d87b217b08" Mar 13 15:21:28 crc kubenswrapper[4786]: E0313 15:21:28.924653 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" podUID="e32e285a-970c-49ac-8531-d0d87b217b08" Mar 13 15:21:31 crc kubenswrapper[4786]: E0313 15:21:31.058412 4786 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 13 15:21:31 crc kubenswrapper[4786]: E0313 15:21:31.058650 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lc28q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-sfxhr_openstack-operators(5dcee199-4a59-4394-a565-2ce8e15e787c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:31 crc kubenswrapper[4786]: E0313 15:21:31.059893 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" podUID="5dcee199-4a59-4394-a565-2ce8e15e787c" Mar 13 15:21:31 crc kubenswrapper[4786]: E0313 15:21:31.945426 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" podUID="5dcee199-4a59-4394-a565-2ce8e15e787c" Mar 13 15:21:35 crc 
kubenswrapper[4786]: I0313 15:21:35.704970 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp"] Mar 13 15:21:35 crc kubenswrapper[4786]: W0313 15:21:35.722065 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7826406e_4038_4851_a54e_bf72ff94287f.slice/crio-265d022eadbd4a806c851afe2e491908527e0c0c94fb081bff1cd4e1304cc040 WatchSource:0}: Error finding container 265d022eadbd4a806c851afe2e491908527e0c0c94fb081bff1cd4e1304cc040: Status 404 returned error can't find the container with id 265d022eadbd4a806c851afe2e491908527e0c0c94fb081bff1cd4e1304cc040 Mar 13 15:21:35 crc kubenswrapper[4786]: I0313 15:21:35.969374 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" event={"ID":"7826406e-4038-4851-a54e-bf72ff94287f","Type":"ContainerStarted","Data":"265d022eadbd4a806c851afe2e491908527e0c0c94fb081bff1cd4e1304cc040"} Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.976456 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" event={"ID":"9e012f2b-6b59-473d-8273-cb64d4957ad7","Type":"ContainerStarted","Data":"9b462965b130e23a084c25046531476e15407f982fd5332d821003b19e5eea85"} Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.977017 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.977722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" event={"ID":"a176571a-293b-4ff6-8928-1cc5f3b28c44","Type":"ContainerStarted","Data":"dc5bb0f8c4debfd750138be922d553f8db769d4598f0b0cf00ce65cef351844e"} Mar 13 15:21:36 crc 
kubenswrapper[4786]: I0313 15:21:36.978091 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.979776 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" event={"ID":"c4687472-a411-424b-bcc2-d39f84de6a17","Type":"ContainerStarted","Data":"cf184a9d0696f7e63efed08b727971ce35a41c38a89b950fc17668bf929b391e"} Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.979829 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.981086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" event={"ID":"b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f","Type":"ContainerStarted","Data":"0bda875013093fcc048d4b0eed539c361f21e005e1c8c3abf1063e0b8b6f4237"} Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.981654 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.982820 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" event={"ID":"9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f","Type":"ContainerStarted","Data":"d960c4a2031934dc9d0450a9c6476588af34da73f1056d78733c2b85049125f5"} Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.983170 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.984611 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" event={"ID":"d9425c10-8f83-4b9f-81a8-0502889571a0","Type":"ContainerStarted","Data":"68f48c08ac57c40ecc953c4588917c02164d7f4b501db604598f3ff42d94b1ac"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.984714 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.986006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" event={"ID":"37c474a3-0434-4911-adf5-02d915b23d57","Type":"ContainerStarted","Data":"914cc525bf3e91d5636345f7d727e9b4575403895b0f0ddc6cdf2b58549ab65a"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.986090 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.987400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" event={"ID":"e2450a85-9b9c-49c6-8191-df5a87807e4f","Type":"ContainerStarted","Data":"b99f79c034f1c9e61a29097562b16866240d972849c30ebb7c19dbc29b9e0655"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.987504 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.988826 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" event={"ID":"721d9249-da86-4bba-93bc-4f037cd3344d","Type":"ContainerStarted","Data":"a6f388764c4b8c62154af895231cc404215cf87789ca7e879ea6d5442fbd4dc6"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.988951 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.990477 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" event={"ID":"82e7030d-fdee-4336-84f0-8a89605e1424","Type":"ContainerStarted","Data":"96a35166f54114780284d232b54e77f6a98b2408f45fa1225c85805f3627f965"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.990541 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.992136 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" event={"ID":"d5cc42a1-e4ee-4f94-8edf-4c8a46a632db","Type":"ContainerStarted","Data":"0fe3c9a15c950abdb6895ef2dc775ed18ec097deec51e9b34472304689105184"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.992321 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.993701 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" event={"ID":"1a700a4d-b7ed-4ea7-9382-b3994ba3646e","Type":"ContainerStarted","Data":"0e98028099da7131828e6f3518e3eb9b102a9c8d2271c2615e36a1fbd13af11c"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.993869 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.994972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" event={"ID":"a8bee420-59e0-4eb5-a83a-1a518345ca42","Type":"ContainerStarted","Data":"e060f9cc10703412a73a38e317f2189543fe5c7c192027482314e9adff839b33"}
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.995115 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg"
Mar 13 15:21:36 crc kubenswrapper[4786]: I0313 15:21:36.998992 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh" podStartSLOduration=3.05417949 podStartE2EDuration="28.998978345s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.68358548 +0000 UTC m=+1099.846797291" lastFinishedPulling="2026-03-13 15:21:35.628384335 +0000 UTC m=+1125.791596146" observedRunningTime="2026-03-13 15:21:36.998154874 +0000 UTC m=+1127.161366685" watchObservedRunningTime="2026-03-13 15:21:36.998978345 +0000 UTC m=+1127.162190156"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.041757 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk" podStartSLOduration=11.360076431 podStartE2EDuration="30.041738984s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.149808276 +0000 UTC m=+1099.313020087" lastFinishedPulling="2026-03-13 15:21:27.831470829 +0000 UTC m=+1117.994682640" observedRunningTime="2026-03-13 15:21:37.020268472 +0000 UTC m=+1127.183480283" watchObservedRunningTime="2026-03-13 15:21:37.041738984 +0000 UTC m=+1127.204950795"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.045688 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw" podStartSLOduration=10.407981541 podStartE2EDuration="30.045675552s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.39819711 +0000 UTC m=+1099.561408921" lastFinishedPulling="2026-03-13 15:21:29.035891121 +0000 UTC m=+1119.199102932" observedRunningTime="2026-03-13 15:21:37.040616376 +0000 UTC m=+1127.203828197" watchObservedRunningTime="2026-03-13 15:21:37.045675552 +0000 UTC m=+1127.208887363"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.063576 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d" podStartSLOduration=3.159523087 podStartE2EDuration="29.063556885s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.677195108 +0000 UTC m=+1099.840406919" lastFinishedPulling="2026-03-13 15:21:35.581228906 +0000 UTC m=+1125.744440717" observedRunningTime="2026-03-13 15:21:37.059750001 +0000 UTC m=+1127.222961812" watchObservedRunningTime="2026-03-13 15:21:37.063556885 +0000 UTC m=+1127.226768696"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.090557 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5" podStartSLOduration=9.583930178 podStartE2EDuration="29.090536453s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.529090601 +0000 UTC m=+1099.692302412" lastFinishedPulling="2026-03-13 15:21:29.035696856 +0000 UTC m=+1119.198908687" observedRunningTime="2026-03-13 15:21:37.084955495 +0000 UTC m=+1127.248167306" watchObservedRunningTime="2026-03-13 15:21:37.090536453 +0000 UTC m=+1127.253748264"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.108450 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2" podStartSLOduration=3.221668353 podStartE2EDuration="29.108429647s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.706742666 +0000 UTC m=+1099.869954477" lastFinishedPulling="2026-03-13 15:21:35.59350396 +0000 UTC m=+1125.756715771" observedRunningTime="2026-03-13 15:21:37.106327625 +0000 UTC m=+1127.269539436" watchObservedRunningTime="2026-03-13 15:21:37.108429647 +0000 UTC m=+1127.271641458"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.128220 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg" podStartSLOduration=3.24215141 podStartE2EDuration="29.128199687s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.706585602 +0000 UTC m=+1099.869797433" lastFinishedPulling="2026-03-13 15:21:35.592633899 +0000 UTC m=+1125.755845710" observedRunningTime="2026-03-13 15:21:37.126254408 +0000 UTC m=+1127.289466219" watchObservedRunningTime="2026-03-13 15:21:37.128199687 +0000 UTC m=+1127.291411498"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.155810 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7" podStartSLOduration=10.238484906 podStartE2EDuration="30.15579525s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.119015267 +0000 UTC m=+1099.282227078" lastFinishedPulling="2026-03-13 15:21:29.036325591 +0000 UTC m=+1119.199537422" observedRunningTime="2026-03-13 15:21:37.151122895 +0000 UTC m=+1127.314334706" watchObservedRunningTime="2026-03-13 15:21:37.15579525 +0000 UTC m=+1127.319007061"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.175845 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd" podStartSLOduration=3.546440494 podStartE2EDuration="29.175822617s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.677618739 +0000 UTC m=+1099.840830550" lastFinishedPulling="2026-03-13 15:21:35.307000842 +0000 UTC m=+1125.470212673" observedRunningTime="2026-03-13 15:21:37.175394716 +0000 UTC m=+1127.338606527" watchObservedRunningTime="2026-03-13 15:21:37.175822617 +0000 UTC m=+1127.339034428"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.198035 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5" podStartSLOduration=11.600863491 podStartE2EDuration="30.198016766s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.234403396 +0000 UTC m=+1099.397615207" lastFinishedPulling="2026-03-13 15:21:27.831556671 +0000 UTC m=+1117.994768482" observedRunningTime="2026-03-13 15:21:37.193662499 +0000 UTC m=+1127.356874310" watchObservedRunningTime="2026-03-13 15:21:37.198016766 +0000 UTC m=+1127.361228577"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.213995 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd" podStartSLOduration=9.72803373 podStartE2EDuration="29.213978202s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.548952224 +0000 UTC m=+1099.712164045" lastFinishedPulling="2026-03-13 15:21:29.034896686 +0000 UTC m=+1119.198108517" observedRunningTime="2026-03-13 15:21:37.207975523 +0000 UTC m=+1127.371187334" watchObservedRunningTime="2026-03-13 15:21:37.213978202 +0000 UTC m=+1127.377190013"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.264269 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f" podStartSLOduration=4.235449704 podStartE2EDuration="30.264252708s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.553509459 +0000 UTC m=+1099.716721270" lastFinishedPulling="2026-03-13 15:21:35.582312463 +0000 UTC m=+1125.745524274" observedRunningTime="2026-03-13 15:21:37.262835123 +0000 UTC m=+1127.426046934" watchObservedRunningTime="2026-03-13 15:21:37.264252708 +0000 UTC m=+1127.427464519"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.265351 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7" podStartSLOduration=10.360810033 podStartE2EDuration="30.265345785s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.130053376 +0000 UTC m=+1099.293265177" lastFinishedPulling="2026-03-13 15:21:29.034589118 +0000 UTC m=+1119.197800929" observedRunningTime="2026-03-13 15:21:37.240315035 +0000 UTC m=+1127.403526866" watchObservedRunningTime="2026-03-13 15:21:37.265345785 +0000 UTC m=+1127.428557606"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.868576 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.868651 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.868704 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.869415 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a926826d2fa94d740a03a1a08b36f6e48f1ce5a8cc37acd7e0fef98af56e6473"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:21:37 crc kubenswrapper[4786]: I0313 15:21:37.869488 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://a926826d2fa94d740a03a1a08b36f6e48f1ce5a8cc37acd7e0fef98af56e6473" gracePeriod=600
Mar 13 15:21:38 crc kubenswrapper[4786]: I0313 15:21:38.007086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" event={"ID":"e60c275e-371c-48d7-8816-56ae26f8e911","Type":"ContainerStarted","Data":"396cf6e8b6576d1127372080deaf27edb9a4ca66d70f4556797c29a0765efd00"}
Mar 13 15:21:38 crc kubenswrapper[4786]: I0313 15:21:38.007911 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr"
Mar 13 15:21:38 crc kubenswrapper[4786]: I0313 15:21:38.010092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" event={"ID":"bad187db-13d7-4bf9-9b5f-9ce08a17b9c7","Type":"ContainerStarted","Data":"cdcb4ed0400c5b46a702bb882e17b294fb183a47695aaaa518605c39069e95cd"}
Mar 13 15:21:38 crc kubenswrapper[4786]: I0313 15:21:38.054668 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr" podStartSLOduration=3.096272201 podStartE2EDuration="31.054652932s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.36935142 +0000 UTC m=+1099.532563231" lastFinishedPulling="2026-03-13 15:21:37.327732151 +0000 UTC m=+1127.490943962" observedRunningTime="2026-03-13 15:21:38.052232222 +0000 UTC m=+1128.215444053" watchObservedRunningTime="2026-03-13 15:21:38.054652932 +0000 UTC m=+1128.217864743"
Mar 13 15:21:38 crc kubenswrapper[4786]: I0313 15:21:38.056272 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr" podStartSLOduration=2.662566909 podStartE2EDuration="31.056264882s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:08.935035982 +0000 UTC m=+1099.098247793" lastFinishedPulling="2026-03-13 15:21:37.328733945 +0000 UTC m=+1127.491945766" observedRunningTime="2026-03-13 15:21:38.030190556 +0000 UTC m=+1128.193402367" watchObservedRunningTime="2026-03-13 15:21:38.056264882 +0000 UTC m=+1128.219476693"
Mar 13 15:21:38 crc kubenswrapper[4786]: I0313 15:21:38.235549 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr"
Mar 13 15:21:39 crc kubenswrapper[4786]: I0313 15:21:39.020000 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="a926826d2fa94d740a03a1a08b36f6e48f1ce5a8cc37acd7e0fef98af56e6473" exitCode=0
Mar 13 15:21:39 crc kubenswrapper[4786]: I0313 15:21:39.020124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"a926826d2fa94d740a03a1a08b36f6e48f1ce5a8cc37acd7e0fef98af56e6473"}
Mar 13 15:21:39 crc kubenswrapper[4786]: I0313 15:21:39.020157 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"cedf575f572a8d2fa7d4acff7bfb9c6086d44a2d58bd68733b103bae3b833d49"}
Mar 13 15:21:39 crc kubenswrapper[4786]: I0313 15:21:39.020177 4786 scope.go:117] "RemoveContainer" containerID="dad5dab593ccac2d22182d28c8abfa4af5554be94f9545ed96143d1052cb64d4"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.318132 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.326671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365f42f0-aed6-4131-b8d7-c01b9a8418d1-cert\") pod \"openstack-baremetal-operator-controller-manager-6f7958d7742jq6p\" (UID: \"365f42f0-aed6-4131-b8d7-c01b9a8418d1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.464328 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.624219 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.647193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/66bf8109-666c-469d-b33c-ba5152cde7d9-webhook-certs\") pod \"openstack-operator-controller-manager-6484b7b757-tsssq\" (UID: \"66bf8109-666c-469d-b33c-ba5152cde7d9\") " pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.800103 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"
Mar 13 15:21:40 crc kubenswrapper[4786]: I0313 15:21:40.925134 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p"]
Mar 13 15:21:41 crc kubenswrapper[4786]: I0313 15:21:41.037020 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" event={"ID":"f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53","Type":"ContainerStarted","Data":"1619e19cff1aa83b71a9d15e3284c09e54585312652fc6755321106cded22e45"}
Mar 13 15:21:41 crc kubenswrapper[4786]: I0313 15:21:41.037222 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8"
Mar 13 15:21:41 crc kubenswrapper[4786]: I0313 15:21:41.059306 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8" podStartSLOduration=2.247223114 podStartE2EDuration="33.059286718s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.546702237 +0000 UTC m=+1099.709914058" lastFinishedPulling="2026-03-13 15:21:40.358765851 +0000 UTC m=+1130.521977662" observedRunningTime="2026-03-13 15:21:41.057293459 +0000 UTC m=+1131.220505270" watchObservedRunningTime="2026-03-13 15:21:41.059286718 +0000 UTC m=+1131.222498529"
Mar 13 15:21:41 crc kubenswrapper[4786]: W0313 15:21:41.828102 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365f42f0_aed6_4131_b8d7_c01b9a8418d1.slice/crio-6afc0bf7cbf065d9b20083ab29331a944517bddb09e1478e967956bbcb0d0ded WatchSource:0}: Error finding container 6afc0bf7cbf065d9b20083ab29331a944517bddb09e1478e967956bbcb0d0ded: Status 404 returned error can't find the container with id 6afc0bf7cbf065d9b20083ab29331a944517bddb09e1478e967956bbcb0d0ded
Mar 13 15:21:42 crc kubenswrapper[4786]: I0313 15:21:42.049207 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" event={"ID":"365f42f0-aed6-4131-b8d7-c01b9a8418d1","Type":"ContainerStarted","Data":"6afc0bf7cbf065d9b20083ab29331a944517bddb09e1478e967956bbcb0d0ded"}
Mar 13 15:21:42 crc kubenswrapper[4786]: I0313 15:21:42.793186 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"]
Mar 13 15:21:43 crc kubenswrapper[4786]: I0313 15:21:43.058836 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" event={"ID":"7826406e-4038-4851-a54e-bf72ff94287f","Type":"ContainerStarted","Data":"fda88c838b570d8699902854bf56432492a297a1f1a85dddc2fdb53d0437827a"}
Mar 13 15:21:43 crc kubenswrapper[4786]: I0313 15:21:43.059910 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp"
Mar 13 15:21:43 crc kubenswrapper[4786]: I0313 15:21:43.060481 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" event={"ID":"8e63300a-b2d6-438f-9b17-989c103b9975","Type":"ContainerStarted","Data":"da3c0de9cb2320ba296575feb87a5eaf2af5683b73ece46daba76ec89e9788c4"}
Mar 13 15:21:43 crc kubenswrapper[4786]: I0313 15:21:43.060755 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt"
Mar 13 15:21:43 crc kubenswrapper[4786]: I0313 15:21:43.076200 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp" podStartSLOduration=29.241833134 podStartE2EDuration="36.076182141s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:35.72465641 +0000 UTC m=+1125.887868211" lastFinishedPulling="2026-03-13 15:21:42.559005357 +0000 UTC m=+1132.722217218" observedRunningTime="2026-03-13 15:21:43.07328449 +0000 UTC m=+1133.236496301" watchObservedRunningTime="2026-03-13 15:21:43.076182141 +0000 UTC m=+1133.239393952"
Mar 13 15:21:43 crc kubenswrapper[4786]: I0313 15:21:43.096198 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt" podStartSLOduration=2.889745086 podStartE2EDuration="36.096176527s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.353268243 +0000 UTC m=+1099.516480054" lastFinishedPulling="2026-03-13 15:21:42.559699644 +0000 UTC m=+1132.722911495" observedRunningTime="2026-03-13 15:21:43.090065035 +0000 UTC m=+1133.253276846" watchObservedRunningTime="2026-03-13 15:21:43.096176527 +0000 UTC m=+1133.259388338"
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.068573 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" event={"ID":"b93a2ad9-e58e-4b33-8d34-8101b1fa2d38","Type":"ContainerStarted","Data":"026def7ef52a811d0f23f01623cafe2c8101d7a2209a01e2e7fcf906fc65d47e"}
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.070098 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd"
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.073346 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" event={"ID":"66bf8109-666c-469d-b33c-ba5152cde7d9","Type":"ContainerStarted","Data":"0e5056cccf37e84b0fd4c5770c82b9cc190f4d08fa7d2bac8ab5aa9fd0ac6d3b"}
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.073388 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.073402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" event={"ID":"66bf8109-666c-469d-b33c-ba5152cde7d9","Type":"ContainerStarted","Data":"e9efd44482e423833e634d3a52021a36590171b02bd6f2d4efcefa7a24b714c1"}
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.153302 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd" podStartSLOduration=3.403864116 podStartE2EDuration="37.153278809s" podCreationTimestamp="2026-03-13 15:21:07 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.37487791 +0000 UTC m=+1099.538089721" lastFinishedPulling="2026-03-13 15:21:43.124292603 +0000 UTC m=+1133.287504414" observedRunningTime="2026-03-13 15:21:44.101405664 +0000 UTC m=+1134.264617485" watchObservedRunningTime="2026-03-13 15:21:44.153278809 +0000 UTC m=+1134.316490620"
Mar 13 15:21:44 crc kubenswrapper[4786]: I0313 15:21:44.153462 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq" podStartSLOduration=36.153456444 podStartE2EDuration="36.153456444s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:21:44.148635344 +0000 UTC m=+1134.311847165" watchObservedRunningTime="2026-03-13 15:21:44.153456444 +0000 UTC m=+1134.316668255"
Mar 13 15:21:45 crc kubenswrapper[4786]: I0313 15:21:45.083454 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" event={"ID":"e32e285a-970c-49ac-8531-d0d87b217b08","Type":"ContainerStarted","Data":"d37a29c1097d2d6d021f7c8e94240971292c55bc738cb77953650881a144aa7c"}
Mar 13 15:21:45 crc kubenswrapper[4786]: I0313 15:21:45.084602 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t"
Mar 13 15:21:45 crc kubenswrapper[4786]: I0313 15:21:45.105208 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" podStartSLOduration=2.162127019 podStartE2EDuration="37.105191736s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.400108968 +0000 UTC m=+1099.563320779" lastFinishedPulling="2026-03-13 15:21:44.343173685 +0000 UTC m=+1134.506385496" observedRunningTime="2026-03-13 15:21:45.104217531 +0000 UTC m=+1135.267429342" watchObservedRunningTime="2026-03-13 15:21:45.105191736 +0000 UTC m=+1135.268403547"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.203057 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-wgpvr"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.220313 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-xpqf7"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.238829 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-zq9jr"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.249502 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-gr9w7"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.254746 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpfq5"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.274173 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-gtwlk"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.404823 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-8bd8f"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.433450 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-wxjdd"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.440270 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hxcnt"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.458811 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-65hkw"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.476360 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-skfmd"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.609722 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-wc9h2"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.707174 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-rx48d"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.739479 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-spsz5"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.757068 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-s5wjg"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.833575 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-pnzg8"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.858418 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zg7jh"
Mar 13 15:21:48 crc kubenswrapper[4786]: I0313 15:21:48.944307 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-nfvdd"
Mar 13 15:21:50 crc kubenswrapper[4786]: I0313 15:21:50.806022 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6484b7b757-tsssq"
Mar 13 15:21:53 crc kubenswrapper[4786]: I0313 15:21:53.896440 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-98tsp"
Mar 13 15:21:57 crc kubenswrapper[4786]: E0313 15:21:57.823757 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:ba0c22da8f244a1e601ba32831a029b4b5d4fd2df2d39abf4a2ccf73783dba1f"
Mar 13 15:21:57 crc kubenswrapper[4786]: E0313 15:21:57.825543 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:ba0c22da8f244a1e601ba32831a029b4b5d4fd2df2d39abf4a2ccf73783dba1f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:7222940ba1e9d1688a07cf203d00b78bfc3930cbcb72e889bfe58f819ff12379,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:a8da62d734643927e28a259fa0b64b81a85cf108a29154c4ddaf9cf271c05610,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:19bc7d08e9acebb34c6b2cfeb856b58cd3c64f38cfd8d3cc94a166c74f32064e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:57b65e8c13a89e6909690f4a7f052d7eec6260ac653bc76796eb59c60302f691,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:d3e2c6d3ff91bfe40c91728b3b7327f86058f3b1780a714102743a6dfbdef806,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:255f48a3eca87bc58ccd2b391f46dddb030a36ce004295ee007d106d63fe5756,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e0697f36760789183fefc807dafa3bfeb4098725f923eb9a8f034725a01fbf9f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:8265607412410bea0d3cbc3035c6e206597476a4c1c890d351475ef75d69a7fe,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:5f2db5cf2f99da0ac03cac99deab5edec79a7d26f2b2707c9c8d72ca4c420e31,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:718ce7d3397ab3eef55d3be4add819266aff70eb4c4192b3a9ef4ee37b0e9d2f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:c7e1bce75de00992f2d2f5e173dfc2e8975feeb291b48441a1b11bc2c27a659f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:e9007a191212be577a752d8da6dc158647bc5f23f77175129109e1e170b120ad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:123d36916bd093a766186e1b06e3530db7b0ef2a2322e16333556401111a8995,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09be81cac358ce64112079e1e7eb42adede7a01434f107dd2ab2ee173bad7e4b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:2781f3bed351ce4c77a235e2381576637203459384fd93e05584a0013b3fe93e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:9e7c747eeeefb391dc6dedaaac57fa694c4d08b991c54bb99aa6de77451e792f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:7423f71c91f5a1d0aec9dcf0993db6e2495b520b5e5bbcf1615b9ac9759c0a58,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api@sha256:84dfe899f1fc57fb9d0249c9004b40a555a5068c93fdab22d02d8f6c99010e71,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor@sha256:819a009897b9c1bcdca123450f8cfe50e4419f5b8460ab4c57ec16f93cb26d77,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:bbae680a993970f86bc7dc25556faedc1d9a1834047471d4b43c80798dc86576,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:20b3eb0def3bb07b82a26f68205640cca3db62e2db688b8c0ca79460dbb85f75,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:7955ccebc8abf63a4993d5c4e353b9b58550da3cb943bfb7f25624b31ce6db21,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelop
e-centos9/openstack-designate-mdns@sha256:b4bde23597951cffc67c4dd58962ba5168dbeedcd1a79c08bf6c893d2ca5af99,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:ebef703d67382ecbfc574ba84c56b3a4ec690df2dd064b4533e8c818edd272d3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:ced321c9bf0c9748eac1abf4e8ce2359066994c40eec7da7187142b53a3fcdff,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:7d91c4de7301fa6c820e5f2c9a1583506d766852f972071bd5d7fc8d286074ca,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:ee714d6b60dcc4390b164850753805edc7720fb094366399f98ed99a1ebc648b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:7602372bf7284ae1ac2d718ec114876cdbdf91ad13130c15d225785ae9518ee4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:d62849adb4bbea9a201dd0b34bd4a5707cc5f52aa350fa4906428253fe8eab4b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:9e178b5236b52ad057846ae276284eea94f0995bc503d1c7b4fc58e2dd045bec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:4601f2a7e3dcb1be099bf88c3de8b717d7de7912e8d47ddb6da514f2d1b4d607,ValueFrom:nil,},EnvVa
r{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:664155e92db8e6d697f36f6ae03354116f89ad7924cd4658d5732dbd9e0d2241,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:ec9b9c2fd3916641a066d8e611698a476f255cb6aea69a6d16fe7acf6ab6fdf0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:16257b110f038fc034c7946869150e56a4dcd304fbb92fe4ef08e1068ce86548,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:872864539e48f0bc728de1da3242d3c6443525788ad2ec3121e77c984c9d1f8b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:dae5e39780d5a15eed030c7009f8e5317139d447558ac83f038497be594be120,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ae40ad71d6fcb060045291c06195b703d15f91145df69f061c1bde413e08ea24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:62229bf62c1730889da6cb213599356edd098259f10165d0b07ea4904545f1a5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/open
stack-heat-engine@sha256:fb91a978d759c9c4848dd5a71ce09b48398dc4d18f08276cdca1833dd00856d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:0d759b31e4da88b3fa1b823ab634d982fd713e81ed648626de1d8ec40ae7cad4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:66081ddbe5fe62cad3c20592a5b5612beb3b717aaac0f3955c1f0b3a55e00af7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:ce607a55786d436f9ce7ef94f3d17fa07b0f19364c3ec81b45dd7c499aaea800,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:0312c8ff4b98bfc1e0c9bb717adb3247305749e34533eff91099c88ed9a1ed7f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:ccdc96c8db6368182c0f0d0bc4f482ca2f757ecee75656e2715f2eaeb2662030,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:d966b6024aabc0d204669b2b99cd0a9a39114334c8b426830c8c178dec6746ec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:dcfa03e979271a6ec3bc9c3eaa380fbf2d0078aaf9f65ae7f720a86b0b075f6b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:9a84ec2a45b65df722d841a52ab3c330633ca7d80df5f3dd7d8bcb8056da70a1,ValueFrom:nil,},EnvVar{Name:
RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:3a148c87899d4cbbb6bbe1203ad6c237fb295b5f42abda425dc0329305723414,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:f66694c7b4294017c9252ebc9ba3f2f1270c74433de21b71ee8873ac6f4b1645,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:af6fdb3f441bef512c4ef1386648af262a39184c1d76dfdc452b3b637e68d6e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:de190cc322d2d501c1ecfaed6f957463cc089d09f400c0f3ed969623aa116fc2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:92ba1014101bec0e9cc4abedad4e041a2ee59fc8fc96b6e27b73f6ec33ea9e22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:faf711e1e5fa2ad74a73d3dfffd88f6312fb045cb69e9b7e6331558784163d16,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:6b397803b01f768cce049e7cde307c9e8ace7a7cd6ef15eb6
9d482ec76b6f2c2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:d347d48e9a8ae4136dd99c5222480ceccb2819beaf80b11048644d4acf0a4305,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:228b3c59ea6527048a4b3d1e340c15f22dcf9f9ba8f302d6263f2e4ef79463ff,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:d89e44b4641e8bd60abf1b674253975596fafc490022169681555069174a414e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:ae435f19f09bbd2aba6abc350bd2d5a9b88c3da3553a70170f368f32c2a72105,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:0ac9d940efa1227f448ed9f281c724342c5d357a721b848fc1847031f282d758,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:ee509cc8ecd05292bed8fb9c373b60d1d66cb867823da69aa5a2a55177969bdd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:bd3a4163305dd912d7936ff7121d8a266f424b483161408f5cfa7701fcb05842,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:5ef21bc87f6e3db016e72d8015dcd8354466072a55ef69c9ad9b5ea365efdb29,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:d765b589a5f7bc8573b3b192ed265654699012e6342cc4829bd8ea65a7b239a5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATH
ER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:7cdea36ddb29e0814006da6ef0a0dd0d2c44975a23ddabbe32663c6f88dc35f3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:894ce79b38510973ca610423cc34a7383b7761b6ceb47d18637daffaa93336f7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:79213c923a25c7aa65998a66c3c2c2fbd8973f837cfb94f867e567cd71614af0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:943db2d546cbc78f663edcd102c478b71d755a66f99d24fea1b4e628c4125104,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:ac71d8f4475d08f0a40a993cf5f083aead99232c2d5d8cd9514d63a345d0c128,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:3d4aa78bc0932fd39a377beb5a649e47832c0de33a62c413776de2f9de31763e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:bba31d7d170c92451c1d62346da1057e9c0e941a074a32cc54219cb79a4ea24a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:b8a5d052890fb9cefa333baf10b607add227ed5d79aa108b576a97b21e89327a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366b
b46cc93dd5460ac9606b65f430460f4c2ee18d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:f919023e9754d0d94b3fa3e7f571e6d22330ad3cdbb17b20d6143d2581b49ef1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:37da219a7d5254e5fa6cac571f99d8ca7c600d3243b68ffb282a6c70ff8b3ff2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:4e5f8d6d28c08944c0ebb00adfacc6d0e02c382682893488ece1578b0542a583,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:6c3eb966650a7a98feb4ddb31e1bdba1095b0c62e349196aca6a423681d7e5fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:3b0b5a022e65a4c53af09db8a63d0e87b3b8d6151dbc59a0c3f570bf096aea59,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:c180220d5a8708e34b203815facd274d2cca7fe2f30b34729bbd795f38a0af07,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:91993e3cd1d8682861d2a0ef7a0664d229dc5744078d259bc49a9426c3a08bbf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:9b1e99ea6373e1596c4d1d42b5986eb4b9de8e5b9a4fc3278833b27588703c8c,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7thpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6f7958d7742jq6p_openstack-operators(365f42f0-aed6-4131-b8d7-c01b9a8418d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:21:57 crc kubenswrapper[4786]: E0313 15:21:57.827581 4786 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" podUID="365f42f0-aed6-4131-b8d7-c01b9a8418d1" Mar 13 15:21:58 crc kubenswrapper[4786]: E0313 15:21:58.189568 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:ba0c22da8f244a1e601ba32831a029b4b5d4fd2df2d39abf4a2ccf73783dba1f\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" podUID="365f42f0-aed6-4131-b8d7-c01b9a8418d1" Mar 13 15:21:58 crc kubenswrapper[4786]: I0313 15:21:58.464325 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-dzj6t" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.151593 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556922-kt9db"] Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.154714 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.161573 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.161948 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.162764 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.162911 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-kt9db"] Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.220207 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dsf\" (UniqueName: \"kubernetes.io/projected/21189be6-18a4-47ac-a41d-f8f6ab6ff875-kube-api-access-z6dsf\") pod \"auto-csr-approver-29556922-kt9db\" (UID: \"21189be6-18a4-47ac-a41d-f8f6ab6ff875\") " pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.321852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dsf\" (UniqueName: \"kubernetes.io/projected/21189be6-18a4-47ac-a41d-f8f6ab6ff875-kube-api-access-z6dsf\") pod \"auto-csr-approver-29556922-kt9db\" (UID: \"21189be6-18a4-47ac-a41d-f8f6ab6ff875\") " pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.347721 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6dsf\" (UniqueName: \"kubernetes.io/projected/21189be6-18a4-47ac-a41d-f8f6ab6ff875-kube-api-access-z6dsf\") pod \"auto-csr-approver-29556922-kt9db\" (UID: \"21189be6-18a4-47ac-a41d-f8f6ab6ff875\") " 
pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.474844 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:00 crc kubenswrapper[4786]: I0313 15:22:00.692248 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-kt9db"] Mar 13 15:22:01 crc kubenswrapper[4786]: I0313 15:22:01.219668 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-kt9db" event={"ID":"21189be6-18a4-47ac-a41d-f8f6ab6ff875","Type":"ContainerStarted","Data":"d4dd67ce103889e3ef8db7a3e50501f763f57d5129f477eb07843fab7cd851d5"} Mar 13 15:22:02 crc kubenswrapper[4786]: I0313 15:22:02.230720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" event={"ID":"5dcee199-4a59-4394-a565-2ce8e15e787c","Type":"ContainerStarted","Data":"bc09e292b9c2894e618bc7f8fa538bf7d829281251dc55691bbc7ac63d4aee1e"} Mar 13 15:22:02 crc kubenswrapper[4786]: I0313 15:22:02.258120 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sfxhr" podStartSLOduration=3.007586696 podStartE2EDuration="54.258093688s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:09.836795096 +0000 UTC m=+1100.000006907" lastFinishedPulling="2026-03-13 15:22:01.087302088 +0000 UTC m=+1151.250513899" observedRunningTime="2026-03-13 15:22:02.257006301 +0000 UTC m=+1152.420218152" watchObservedRunningTime="2026-03-13 15:22:02.258093688 +0000 UTC m=+1152.421305529" Mar 13 15:22:03 crc kubenswrapper[4786]: I0313 15:22:03.241934 4786 generic.go:334] "Generic (PLEG): container finished" podID="21189be6-18a4-47ac-a41d-f8f6ab6ff875" containerID="58df202a11d1b8b5d5b191c42d457275030fe01c20524509c28f992abe5986cb" 
exitCode=0 Mar 13 15:22:03 crc kubenswrapper[4786]: I0313 15:22:03.242016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-kt9db" event={"ID":"21189be6-18a4-47ac-a41d-f8f6ab6ff875","Type":"ContainerDied","Data":"58df202a11d1b8b5d5b191c42d457275030fe01c20524509c28f992abe5986cb"} Mar 13 15:22:04 crc kubenswrapper[4786]: I0313 15:22:04.538712 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:04 crc kubenswrapper[4786]: I0313 15:22:04.578655 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6dsf\" (UniqueName: \"kubernetes.io/projected/21189be6-18a4-47ac-a41d-f8f6ab6ff875-kube-api-access-z6dsf\") pod \"21189be6-18a4-47ac-a41d-f8f6ab6ff875\" (UID: \"21189be6-18a4-47ac-a41d-f8f6ab6ff875\") " Mar 13 15:22:04 crc kubenswrapper[4786]: I0313 15:22:04.585092 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21189be6-18a4-47ac-a41d-f8f6ab6ff875-kube-api-access-z6dsf" (OuterVolumeSpecName: "kube-api-access-z6dsf") pod "21189be6-18a4-47ac-a41d-f8f6ab6ff875" (UID: "21189be6-18a4-47ac-a41d-f8f6ab6ff875"). InnerVolumeSpecName "kube-api-access-z6dsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:22:04 crc kubenswrapper[4786]: I0313 15:22:04.680667 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6dsf\" (UniqueName: \"kubernetes.io/projected/21189be6-18a4-47ac-a41d-f8f6ab6ff875-kube-api-access-z6dsf\") on node \"crc\" DevicePath \"\"" Mar 13 15:22:05 crc kubenswrapper[4786]: I0313 15:22:05.256403 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-kt9db" event={"ID":"21189be6-18a4-47ac-a41d-f8f6ab6ff875","Type":"ContainerDied","Data":"d4dd67ce103889e3ef8db7a3e50501f763f57d5129f477eb07843fab7cd851d5"} Mar 13 15:22:05 crc kubenswrapper[4786]: I0313 15:22:05.256604 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4dd67ce103889e3ef8db7a3e50501f763f57d5129f477eb07843fab7cd851d5" Mar 13 15:22:05 crc kubenswrapper[4786]: I0313 15:22:05.256650 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-kt9db" Mar 13 15:22:05 crc kubenswrapper[4786]: I0313 15:22:05.619219 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-lz92k"] Mar 13 15:22:05 crc kubenswrapper[4786]: I0313 15:22:05.628990 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-lz92k"] Mar 13 15:22:06 crc kubenswrapper[4786]: I0313 15:22:06.564643 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad" path="/var/lib/kubelet/pods/8d4fc3e9-25a8-4b9f-abcb-20cdbc6338ad/volumes" Mar 13 15:22:13 crc kubenswrapper[4786]: I0313 15:22:13.311839 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" 
event={"ID":"365f42f0-aed6-4131-b8d7-c01b9a8418d1","Type":"ContainerStarted","Data":"20ba850e8807d733c27d5653cd608942b1dcc5d827ad364f553aec7f9863ff1b"} Mar 13 15:22:13 crc kubenswrapper[4786]: I0313 15:22:13.312754 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:22:13 crc kubenswrapper[4786]: I0313 15:22:13.339164 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" podStartSLOduration=35.60802306 podStartE2EDuration="1m5.339146026s" podCreationTimestamp="2026-03-13 15:21:08 +0000 UTC" firstStartedPulling="2026-03-13 15:21:42.550757183 +0000 UTC m=+1132.713969024" lastFinishedPulling="2026-03-13 15:22:12.281880179 +0000 UTC m=+1162.445091990" observedRunningTime="2026-03-13 15:22:13.334920391 +0000 UTC m=+1163.498132192" watchObservedRunningTime="2026-03-13 15:22:13.339146026 +0000 UTC m=+1163.502357847" Mar 13 15:22:20 crc kubenswrapper[4786]: I0313 15:22:20.474031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f7958d7742jq6p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.577165 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-89s2p"] Mar 13 15:22:35 crc kubenswrapper[4786]: E0313 15:22:35.578169 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21189be6-18a4-47ac-a41d-f8f6ab6ff875" containerName="oc" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.578184 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="21189be6-18a4-47ac-a41d-f8f6ab6ff875" containerName="oc" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.578360 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="21189be6-18a4-47ac-a41d-f8f6ab6ff875" containerName="oc" Mar 
13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.579203 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.581469 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.581481 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.581701 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-k9bzv" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.581844 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.636833 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-89s2p"] Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.713905 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5bhmw"] Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.715442 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.721281 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.734761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zklft\" (UniqueName: \"kubernetes.io/projected/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-kube-api-access-zklft\") pod \"dnsmasq-dns-5448ff6dc7-89s2p\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.756020 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-config\") pod \"dnsmasq-dns-5448ff6dc7-89s2p\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.855427 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5bhmw"] Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.857468 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-config\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.857633 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spmd5\" (UniqueName: \"kubernetes.io/projected/3b735e0e-0b59-4c2d-9682-f16b705c5461-kube-api-access-spmd5\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " 
pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.857767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zklft\" (UniqueName: \"kubernetes.io/projected/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-kube-api-access-zklft\") pod \"dnsmasq-dns-5448ff6dc7-89s2p\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.857884 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-config\") pod \"dnsmasq-dns-5448ff6dc7-89s2p\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.858010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-dns-svc\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.859343 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-config\") pod \"dnsmasq-dns-5448ff6dc7-89s2p\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.905527 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zklft\" (UniqueName: \"kubernetes.io/projected/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-kube-api-access-zklft\") pod \"dnsmasq-dns-5448ff6dc7-89s2p\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:35 crc 
kubenswrapper[4786]: I0313 15:22:35.959387 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-config\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.959455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spmd5\" (UniqueName: \"kubernetes.io/projected/3b735e0e-0b59-4c2d-9682-f16b705c5461-kube-api-access-spmd5\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.959513 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-dns-svc\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.960261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-config\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.960346 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-dns-svc\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:35 crc kubenswrapper[4786]: I0313 15:22:35.992443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-spmd5\" (UniqueName: \"kubernetes.io/projected/3b735e0e-0b59-4c2d-9682-f16b705c5461-kube-api-access-spmd5\") pod \"dnsmasq-dns-64696987c5-5bhmw\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:36 crc kubenswrapper[4786]: I0313 15:22:36.095892 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:22:36 crc kubenswrapper[4786]: I0313 15:22:36.199388 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:22:36 crc kubenswrapper[4786]: I0313 15:22:36.570027 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5bhmw"] Mar 13 15:22:36 crc kubenswrapper[4786]: I0313 15:22:36.574832 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:22:36 crc kubenswrapper[4786]: I0313 15:22:36.704564 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-89s2p"] Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.508668 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" event={"ID":"a5a8295a-b149-4454-9ed1-f8444dbe7ffa","Type":"ContainerStarted","Data":"cd86081d8013e7c238451245c990bcd69f0ed12354fc9faac8bb2758c5bb1030"} Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.510631 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" event={"ID":"3b735e0e-0b59-4c2d-9682-f16b705c5461","Type":"ContainerStarted","Data":"eb409695dfce3344412ce03054b4b96659d9b6987c3df804ad800851a643fa31"} Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.580294 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-89s2p"] Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.615702 4786 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-dddfc"] Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.617354 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.633387 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-dddfc"] Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.787956 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.788054 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-config\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.788110 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbxc\" (UniqueName: \"kubernetes.io/projected/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-kube-api-access-bvbxc\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.893983 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " 
pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.894079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-config\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.894127 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvbxc\" (UniqueName: \"kubernetes.io/projected/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-kube-api-access-bvbxc\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.895219 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-config\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.895836 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 15:22:37.914965 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbxc\" (UniqueName: \"kubernetes.io/projected/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-kube-api-access-bvbxc\") pod \"dnsmasq-dns-658f55c9f5-dddfc\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:37 crc kubenswrapper[4786]: I0313 
15:22:37.945807 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.510873 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-dddfc"] Mar 13 15:22:38 crc kubenswrapper[4786]: W0313 15:22:38.545337 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1030ecb0_1b6e_4344_b7c2_4b544e63edc7.slice/crio-824bc3a16d48ffb99accac24b5d6a22200ee1f7752ecbc84d3856f65b76bbaf7 WatchSource:0}: Error finding container 824bc3a16d48ffb99accac24b5d6a22200ee1f7752ecbc84d3856f65b76bbaf7: Status 404 returned error can't find the container with id 824bc3a16d48ffb99accac24b5d6a22200ee1f7752ecbc84d3856f65b76bbaf7 Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.647935 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5bhmw"] Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.696579 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-mftfr"] Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.698014 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.710643 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-mftfr"] Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.714084 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcs4x\" (UniqueName: \"kubernetes.io/projected/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-kube-api-access-hcs4x\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.714391 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-config\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.714479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.791119 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.793234 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.812772 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.813619 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.814121 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.814311 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.814421 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.816333 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v847v" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.816544 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.833238 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-config\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.833317 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " 
pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.833453 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcs4x\" (UniqueName: \"kubernetes.io/projected/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-kube-api-access-hcs4x\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.837240 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-config\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.837899 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.857903 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.890118 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcs4x\" (UniqueName: \"kubernetes.io/projected/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-kube-api-access-hcs4x\") pod \"dnsmasq-dns-54b5dffb47-mftfr\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934543 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934579 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934745 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934791 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934838 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934884 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgfxm\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-kube-api-access-sgfxm\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:38 crc kubenswrapper[4786]: I0313 15:22:38.934899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036107 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036185 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036225 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036245 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036272 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036313 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgfxm\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-kube-api-access-sgfxm\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.036393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.037329 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.037522 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.037612 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.038418 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.038834 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.039140 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.040134 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.042712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.042828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.042851 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.059547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.064434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgfxm\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-kube-api-access-sgfxm\") pod \"rabbitmq-cell1-server-0\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.066531 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.159506 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.540938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" event={"ID":"1030ecb0-1b6e-4344-b7c2-4b544e63edc7","Type":"ContainerStarted","Data":"824bc3a16d48ffb99accac24b5d6a22200ee1f7752ecbc84d3856f65b76bbaf7"} Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.595299 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-mftfr"] Mar 13 15:22:39 crc kubenswrapper[4786]: W0313 15:22:39.599033 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1753bbf8_ecfc_4ad6_b87f_9e38b8372862.slice/crio-cdc463af25ded7ebe95c78f280b93fb864353ec7f9bec06884e2b818ce5dd769 WatchSource:0}: Error finding container cdc463af25ded7ebe95c78f280b93fb864353ec7f9bec06884e2b818ce5dd769: Status 404 returned error can't find the container with id cdc463af25ded7ebe95c78f280b93fb864353ec7f9bec06884e2b818ce5dd769 Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.699393 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 15:22:39 crc kubenswrapper[4786]: W0313 15:22:39.699761 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e5ca7c_1c5e_4f9e_85df_a92feaeddb43.slice/crio-54846a642fd59e535935d429d2a622eb8c50104653f9ccfb1950cc77bd37e558 WatchSource:0}: Error finding container 54846a642fd59e535935d429d2a622eb8c50104653f9ccfb1950cc77bd37e558: Status 404 returned error can't find the container with id 54846a642fd59e535935d429d2a622eb8c50104653f9ccfb1950cc77bd37e558 Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.846477 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.852129 
4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.856940 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8tkb4" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.857275 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.857391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.857508 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.857601 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.857704 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.858273 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.866371 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954513 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954558 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954582 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954648 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954692 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954714 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954730 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhvv\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-kube-api-access-tfhvv\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:39 crc kubenswrapper[4786]: I0313 15:22:39.954755 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-config-data\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" 
(UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056159 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056183 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056199 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhvv\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-kube-api-access-tfhvv\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056223 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-config-data\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056274 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056296 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056316 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.056382 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.057272 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" 
(UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.057591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.057676 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.057920 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.058353 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.059010 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-config-data\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.064006 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.065439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.067881 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.068416 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.080116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhvv\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-kube-api-access-tfhvv\") pod \"rabbitmq-server-0\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.097523 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: 
\"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.190654 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.574720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" event={"ID":"1753bbf8-ecfc-4ad6-b87f-9e38b8372862","Type":"ContainerStarted","Data":"cdc463af25ded7ebe95c78f280b93fb864353ec7f9bec06884e2b818ce5dd769"} Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.574788 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43","Type":"ContainerStarted","Data":"54846a642fd59e535935d429d2a622eb8c50104653f9ccfb1950cc77bd37e558"} Mar 13 15:22:40 crc kubenswrapper[4786]: I0313 15:22:40.669374 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.002630 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.003906 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.009393 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.009584 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7v2p6" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.010039 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.012225 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.017902 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.025244 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075325 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-default\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075379 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kolla-config\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075410 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lfzd\" (UniqueName: \"kubernetes.io/projected/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kube-api-access-5lfzd\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075457 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075525 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.075552 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176533 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-default\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176595 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kolla-config\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176619 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176649 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lfzd\" (UniqueName: \"kubernetes.io/projected/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kube-api-access-5lfzd\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176672 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176702 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.176745 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.178470 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-default\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.178924 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kolla-config\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.179151 
4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.180961 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.185196 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.191271 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.204496 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.209532 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.213170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lfzd\" (UniqueName: \"kubernetes.io/projected/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kube-api-access-5lfzd\") pod \"openstack-galera-0\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.384097 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 15:22:41 crc kubenswrapper[4786]: I0313 15:22:41.585400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f964a2e6-aad3-42c0-8290-c3aa52d99e5b","Type":"ContainerStarted","Data":"ceb6568d7895ac9e87026451c3bde662d7b26342da0acb09c93120f9f42f2c20"} Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.296203 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.297882 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.300435 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.300666 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.301593 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.304850 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lfvcg" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.326527 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.396818 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.396953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.397001 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-combined-ca-bundle\") 
pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.400542 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.400628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2q8\" (UniqueName: \"kubernetes.io/projected/45c96441-7032-49b6-b5fe-129ed26c4e38-kube-api-access-mq2q8\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.400671 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.400731 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.400762 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.499507 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.500557 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.503539 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dljdm" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.508213 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.508387 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509491 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509501 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2q8\" (UniqueName: \"kubernetes.io/projected/45c96441-7032-49b6-b5fe-129ed26c4e38-kube-api-access-mq2q8\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 
15:22:42.509633 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509685 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509710 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509748 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.509845 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.510277 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.510466 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.511319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.512361 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.512810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.514845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.518006 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.535379 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2q8\" (UniqueName: \"kubernetes.io/projected/45c96441-7032-49b6-b5fe-129ed26c4e38-kube-api-access-mq2q8\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.536057 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.611594 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-memcached-tls-certs\") pod \"memcached-0\" 
(UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.611867 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-config-data\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.612024 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kolla-config\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.612124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vwnw\" (UniqueName: \"kubernetes.io/projected/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kube-api-access-4vwnw\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.612260 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.636765 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.713262 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kolla-config\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.713318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vwnw\" (UniqueName: \"kubernetes.io/projected/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kube-api-access-4vwnw\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.713372 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.713444 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.713484 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-config-data\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.714284 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-config-data\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.714417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kolla-config\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.718259 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.723401 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.732046 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vwnw\" (UniqueName: \"kubernetes.io/projected/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kube-api-access-4vwnw\") pod \"memcached-0\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " pod="openstack/memcached-0" Mar 13 15:22:42 crc kubenswrapper[4786]: I0313 15:22:42.890223 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 13 15:22:43 crc kubenswrapper[4786]: E0313 15:22:43.078784 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading blob sha256:80bba89bc841385ac1405e74e590e67ebcd38bd67c1ec1b1073c74a69a710c9b: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/80/80bba89bc841385ac1405e74e590e67ebcd38bd67c1ec1b1073c74a69a710c9b?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T152240Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=67f3a75cb81c10cad052acc47930cc0758317cff2f6688e459f3c9878b64a651&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-rabbitmq&akamai_signature=exp=1773416260~hmac=30276933052379a967748d893926c233cae57ad2b5f496892dbb0f42a67853f9\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d" Mar 13 15:22:43 crc kubenswrapper[4786]: E0313 15:22:43.079225 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m
DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgfxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(65e5ca7c-1c5e-4f9e-85df-a92feaeddb43): 
ErrImagePull: reading blob sha256:80bba89bc841385ac1405e74e590e67ebcd38bd67c1ec1b1073c74a69a710c9b: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/80/80bba89bc841385ac1405e74e590e67ebcd38bd67c1ec1b1073c74a69a710c9b?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T152240Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=67f3a75cb81c10cad052acc47930cc0758317cff2f6688e459f3c9878b64a651&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-rabbitmq&akamai_signature=exp=1773416260~hmac=30276933052379a967748d893926c233cae57ad2b5f496892dbb0f42a67853f9\": remote error: tls: internal error" logger="UnhandledError" Mar 13 15:22:43 crc kubenswrapper[4786]: E0313 15:22:43.080896 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"reading blob sha256:80bba89bc841385ac1405e74e590e67ebcd38bd67c1ec1b1073c74a69a710c9b: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/80/80bba89bc841385ac1405e74e590e67ebcd38bd67c1ec1b1073c74a69a710c9b?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T152240Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=67f3a75cb81c10cad052acc47930cc0758317cff2f6688e459f3c9878b64a651&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-rabbitmq&akamai_signature=exp=1773416260~hmac=30276933052379a967748d893926c233cae57ad2b5f496892dbb0f42a67853f9\\\": remote error: tls: internal error\"" pod="openstack/rabbitmq-cell1-server-0" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" Mar 13 15:22:43 crc kubenswrapper[4786]: E0313 15:22:43.611180 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed
to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:2087a09e7ea9f1dbadd433366bb46cc93dd5460ac9606b65f430460f4c2ee18d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.718223 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.719222 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.722175 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4br7r" Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.737808 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.768392 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68m96\" (UniqueName: \"kubernetes.io/projected/13eda4d3-ef97-4ed1-889d-bd7b60b91179-kube-api-access-68m96\") pod \"kube-state-metrics-0\" (UID: \"13eda4d3-ef97-4ed1-889d-bd7b60b91179\") " pod="openstack/kube-state-metrics-0" Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.870165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68m96\" (UniqueName: \"kubernetes.io/projected/13eda4d3-ef97-4ed1-889d-bd7b60b91179-kube-api-access-68m96\") pod \"kube-state-metrics-0\" (UID: \"13eda4d3-ef97-4ed1-889d-bd7b60b91179\") " pod="openstack/kube-state-metrics-0" Mar 13 15:22:44 crc kubenswrapper[4786]: I0313 15:22:44.894443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68m96\" (UniqueName: 
\"kubernetes.io/projected/13eda4d3-ef97-4ed1-889d-bd7b60b91179-kube-api-access-68m96\") pod \"kube-state-metrics-0\" (UID: \"13eda4d3-ef97-4ed1-889d-bd7b60b91179\") " pod="openstack/kube-state-metrics-0" Mar 13 15:22:45 crc kubenswrapper[4786]: I0313 15:22:45.041584 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.549965 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.553234 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.561527 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.563981 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.564133 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bqj4h" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.564309 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.564316 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.589116 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623283 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-scripts\") pod 
\"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623372 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623406 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-config\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623431 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7rt\" (UniqueName: \"kubernetes.io/projected/73a4439f-50e7-4620-bf95-48d591ec6e3a-kube-api-access-dk7rt\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " 
pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.623504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7rt\" (UniqueName: \"kubernetes.io/projected/73a4439f-50e7-4620-bf95-48d591ec6e3a-kube-api-access-dk7rt\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724548 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724574 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724625 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724664 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724688 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724730 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-config\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.724758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.725103 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"73a4439f-50e7-4620-bf95-48d591ec6e3a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.725801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-config\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.726924 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.727587 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.733421 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.736651 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.742600 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.745796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7rt\" (UniqueName: \"kubernetes.io/projected/73a4439f-50e7-4620-bf95-48d591ec6e3a-kube-api-access-dk7rt\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.751806 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:48 crc kubenswrapper[4786]: I0313 15:22:48.878502 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.374005 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bfm8s"] Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.376123 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.378681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.378927 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8w2jg" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.379139 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.393830 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bfm8s"] Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.435568 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2hb98"] Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436408 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchht\" (UniqueName: \"kubernetes.io/projected/5a56ecb5-18f5-4645-8626-03f231f99f03-kube-api-access-wchht\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a56ecb5-18f5-4645-8626-03f231f99f03-scripts\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run-ovn\") pod \"ovn-controller-bfm8s\" (UID: 
\"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436595 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-log-ovn\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436626 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436652 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-ovn-controller-tls-certs\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.436682 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-combined-ca-bundle\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.437217 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.451710 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2hb98"] Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-ovn-controller-tls-certs\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-combined-ca-bundle\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538356 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-run\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538386 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run-ovn\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-lib\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538436 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-log-ovn\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-log\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l4d7\" (UniqueName: \"kubernetes.io/projected/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-kube-api-access-9l4d7\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-etc-ovs\") pod 
\"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchht\" (UniqueName: \"kubernetes.io/projected/5a56ecb5-18f5-4645-8626-03f231f99f03-kube-api-access-wchht\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538561 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a56ecb5-18f5-4645-8626-03f231f99f03-scripts\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.538592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-scripts\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.539168 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.539180 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run-ovn\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc 
kubenswrapper[4786]: I0313 15:22:49.539827 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-log-ovn\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.542354 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a56ecb5-18f5-4645-8626-03f231f99f03-scripts\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.543817 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-ovn-controller-tls-certs\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.549248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-combined-ca-bundle\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.571484 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchht\" (UniqueName: \"kubernetes.io/projected/5a56ecb5-18f5-4645-8626-03f231f99f03-kube-api-access-wchht\") pod \"ovn-controller-bfm8s\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.640559 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-run\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.640768 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-run\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.640794 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-lib\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641038 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-lib\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641085 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-log\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l4d7\" (UniqueName: \"kubernetes.io/projected/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-kube-api-access-9l4d7\") pod \"ovn-controller-ovs-2hb98\" (UID: 
\"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641199 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-log\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641520 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-etc-ovs\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641719 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-etc-ovs\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.641844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-scripts\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.643885 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-scripts\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.661574 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9l4d7\" (UniqueName: \"kubernetes.io/projected/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-kube-api-access-9l4d7\") pod \"ovn-controller-ovs-2hb98\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.696211 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bfm8s" Mar 13 15:22:49 crc kubenswrapper[4786]: I0313 15:22:49.758640 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.136389 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.138999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.142433 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.142724 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mp26m" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.142972 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.143108 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.160038 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.168844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mnxlr\" (UniqueName: \"kubernetes.io/projected/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-kube-api-access-mnxlr\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.168998 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.169041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.169065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.169082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.169169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.169222 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-config\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.169241 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270697 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxlr\" (UniqueName: \"kubernetes.io/projected/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-kube-api-access-mnxlr\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270748 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270787 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270805 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270838 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270924 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-config\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.270941 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.271566 4786 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.271753 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.272235 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-config\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.272674 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.275455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.275582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.280907 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.294840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxlr\" (UniqueName: \"kubernetes.io/projected/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-kube-api-access-mnxlr\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.297358 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:51 crc kubenswrapper[4786]: I0313 15:22:51.459291 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 15:22:52 crc kubenswrapper[4786]: I0313 15:22:52.330123 4786 scope.go:117] "RemoveContainer" containerID="ee23c4bdb46d75f37c05e047df4a8e16d06f52e8695adb4dd2ff95b6d53ae9e2" Mar 13 15:22:58 crc kubenswrapper[4786]: I0313 15:22:58.687937 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 15:22:59 crc kubenswrapper[4786]: E0313 15:22:59.134488 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 15:22:59 crc kubenswrapper[4786]: E0313 15:22:59.134642 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvbxc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-dddfc_openstack(1030ecb0-1b6e-4344-b7c2-4b544e63edc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:22:59 crc kubenswrapper[4786]: E0313 15:22:59.135969 4786 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" Mar 13 15:22:59 crc kubenswrapper[4786]: E0313 15:22:59.735159 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.272218 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.272740 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zklft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-89s2p_openstack(a5a8295a-b149-4454-9ed1-f8444dbe7ffa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.273956 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" podUID="a5a8295a-b149-4454-9ed1-f8444dbe7ffa" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.294851 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.294999 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcs4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-mftfr_openstack(1753bbf8-ecfc-4ad6-b87f-9e38b8372862): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.297188 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.365073 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.365255 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spmd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-5bhmw_openstack(3b735e0e-0b59-4c2d-9682-f16b705c5461): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.366975 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" podUID="3b735e0e-0b59-4c2d-9682-f16b705c5461" Mar 13 15:23:00 crc kubenswrapper[4786]: I0313 15:23:00.741705 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74e7c2f9-5486-4c21-a0b7-07c81d85a24c","Type":"ContainerStarted","Data":"e4989570687deeb4e766b59c8004ec3ece9c7569eab2a71136649278bd41bb95"} Mar 13 15:23:00 crc kubenswrapper[4786]: E0313 15:23:00.745754 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" Mar 13 15:23:00 crc kubenswrapper[4786]: I0313 15:23:00.830785 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 15:23:00 crc kubenswrapper[4786]: W0313 15:23:00.834992 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c96441_7032_49b6_b5fe_129ed26c4e38.slice/crio-f6da44f243c1bf50ae5e7e7f96501f1b52ebb5ea2544e64b0397bc4f358ff661 WatchSource:0}: Error finding container f6da44f243c1bf50ae5e7e7f96501f1b52ebb5ea2544e64b0397bc4f358ff661: Status 404 returned error can't find the container with id f6da44f243c1bf50ae5e7e7f96501f1b52ebb5ea2544e64b0397bc4f358ff661 Mar 13 15:23:00 crc kubenswrapper[4786]: I0313 15:23:00.917093 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/memcached-0"] Mar 13 15:23:00 crc kubenswrapper[4786]: I0313 15:23:00.935583 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bfm8s"] Mar 13 15:23:00 crc kubenswrapper[4786]: I0313 15:23:00.944051 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.059605 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.150740 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.542429 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.558712 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.653065 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spmd5\" (UniqueName: \"kubernetes.io/projected/3b735e0e-0b59-4c2d-9682-f16b705c5461-kube-api-access-spmd5\") pod \"3b735e0e-0b59-4c2d-9682-f16b705c5461\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.654245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-config\") pod \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.654299 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-config\") pod 
\"3b735e0e-0b59-4c2d-9682-f16b705c5461\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.654319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-dns-svc\") pod \"3b735e0e-0b59-4c2d-9682-f16b705c5461\" (UID: \"3b735e0e-0b59-4c2d-9682-f16b705c5461\") " Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.654368 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zklft\" (UniqueName: \"kubernetes.io/projected/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-kube-api-access-zklft\") pod \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\" (UID: \"a5a8295a-b149-4454-9ed1-f8444dbe7ffa\") " Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.655097 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-config" (OuterVolumeSpecName: "config") pod "3b735e0e-0b59-4c2d-9682-f16b705c5461" (UID: "3b735e0e-0b59-4c2d-9682-f16b705c5461"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.655141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-config" (OuterVolumeSpecName: "config") pod "a5a8295a-b149-4454-9ed1-f8444dbe7ffa" (UID: "a5a8295a-b149-4454-9ed1-f8444dbe7ffa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.655645 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b735e0e-0b59-4c2d-9682-f16b705c5461" (UID: "3b735e0e-0b59-4c2d-9682-f16b705c5461"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.659704 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-kube-api-access-zklft" (OuterVolumeSpecName: "kube-api-access-zklft") pod "a5a8295a-b149-4454-9ed1-f8444dbe7ffa" (UID: "a5a8295a-b149-4454-9ed1-f8444dbe7ffa"). InnerVolumeSpecName "kube-api-access-zklft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.659753 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b735e0e-0b59-4c2d-9682-f16b705c5461-kube-api-access-spmd5" (OuterVolumeSpecName: "kube-api-access-spmd5") pod "3b735e0e-0b59-4c2d-9682-f16b705c5461" (UID: "3b735e0e-0b59-4c2d-9682-f16b705c5461"). InnerVolumeSpecName "kube-api-access-spmd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.753181 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" event={"ID":"a5a8295a-b149-4454-9ed1-f8444dbe7ffa","Type":"ContainerDied","Data":"cd86081d8013e7c238451245c990bcd69f0ed12354fc9faac8bb2758c5bb1030"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.753279 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-89s2p" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.761834 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.761892 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.761902 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b735e0e-0b59-4c2d-9682-f16b705c5461-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.761912 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zklft\" (UniqueName: \"kubernetes.io/projected/a5a8295a-b149-4454-9ed1-f8444dbe7ffa-kube-api-access-zklft\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.761924 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spmd5\" (UniqueName: \"kubernetes.io/projected/3b735e0e-0b59-4c2d-9682-f16b705c5461-kube-api-access-spmd5\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.762117 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13eda4d3-ef97-4ed1-889d-bd7b60b91179","Type":"ContainerStarted","Data":"f6457d5fad4655eff23797e21535c4e7bb7f64f53eb7cc8595d869689366ddf4"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.763128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"393ef3eb-1c5f-4a06-a815-fe394d372ee6","Type":"ContainerStarted","Data":"f5923edfa2272384b6d33906e3b89ee8ea401a15793f63bd65c140f510f4b2a5"} 
Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.767561 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"73a4439f-50e7-4620-bf95-48d591ec6e3a","Type":"ContainerStarted","Data":"2758cd3bd25a52c3b20f2dd60ae0f14a4fd20cef44e5213d2d35be9d72126eb8"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.768660 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" event={"ID":"3b735e0e-0b59-4c2d-9682-f16b705c5461","Type":"ContainerDied","Data":"eb409695dfce3344412ce03054b4b96659d9b6987c3df804ad800851a643fa31"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.768693 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-5bhmw" Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.769479 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea","Type":"ContainerStarted","Data":"da35b4d74f29bea1e8deaa12f4a584643b7d0e274d406b996bfe687261e117f2"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.774783 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f964a2e6-aad3-42c0-8290-c3aa52d99e5b","Type":"ContainerStarted","Data":"8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.783891 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43","Type":"ContainerStarted","Data":"4c81517ba6f8b5efac24e0644f61c61845b3759183639397f2234090a8627707"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.785583 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"45c96441-7032-49b6-b5fe-129ed26c4e38","Type":"ContainerStarted","Data":"f6da44f243c1bf50ae5e7e7f96501f1b52ebb5ea2544e64b0397bc4f358ff661"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.803817 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s" event={"ID":"5a56ecb5-18f5-4645-8626-03f231f99f03","Type":"ContainerStarted","Data":"6890e84989717cb2ec0c7f4516a8183cf76f220acdd3dbfcb76da2d18bb4274f"} Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.814928 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-89s2p"] Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.828379 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-89s2p"] Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.889849 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5bhmw"] Mar 13 15:23:01 crc kubenswrapper[4786]: I0313 15:23:01.896633 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-5bhmw"] Mar 13 15:23:02 crc kubenswrapper[4786]: I0313 15:23:02.246605 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2hb98"] Mar 13 15:23:02 crc kubenswrapper[4786]: I0313 15:23:02.563400 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b735e0e-0b59-4c2d-9682-f16b705c5461" path="/var/lib/kubelet/pods/3b735e0e-0b59-4c2d-9682-f16b705c5461/volumes" Mar 13 15:23:02 crc kubenswrapper[4786]: I0313 15:23:02.563867 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a8295a-b149-4454-9ed1-f8444dbe7ffa" path="/var/lib/kubelet/pods/a5a8295a-b149-4454-9ed1-f8444dbe7ffa/volumes" Mar 13 15:23:02 crc kubenswrapper[4786]: W0313 15:23:02.694124 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee4ee4a6_b86a_454b_8952_6a0f16ce6353.slice/crio-64c90c0674809177b1a8b5034d10a6b47f26b6a5e4e41edfb0306c72cbc6ba84 WatchSource:0}: Error finding container 64c90c0674809177b1a8b5034d10a6b47f26b6a5e4e41edfb0306c72cbc6ba84: Status 404 returned error can't find the container with id 64c90c0674809177b1a8b5034d10a6b47f26b6a5e4e41edfb0306c72cbc6ba84 Mar 13 15:23:02 crc kubenswrapper[4786]: I0313 15:23:02.813176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerStarted","Data":"64c90c0674809177b1a8b5034d10a6b47f26b6a5e4e41edfb0306c72cbc6ba84"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.873349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"73a4439f-50e7-4620-bf95-48d591ec6e3a","Type":"ContainerStarted","Data":"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.876040 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13eda4d3-ef97-4ed1-889d-bd7b60b91179","Type":"ContainerStarted","Data":"d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.876178 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.877746 4786 generic.go:334] "Generic (PLEG): container finished" podID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerID="2ca2b9e0d44338cc2d8dc4cd9960b1316db2789e01d7d069963e6c479bc4ee2d" exitCode=0 Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.878062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" 
event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerDied","Data":"2ca2b9e0d44338cc2d8dc4cd9960b1316db2789e01d7d069963e6c479bc4ee2d"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.879593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"393ef3eb-1c5f-4a06-a815-fe394d372ee6","Type":"ContainerStarted","Data":"02c8962944741a5e21d0094c41f35cebfff6ed06de1fa2ee62e1f8c35e344d05"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.879632 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.881224 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45c96441-7032-49b6-b5fe-129ed26c4e38","Type":"ContainerStarted","Data":"0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.882508 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s" event={"ID":"5a56ecb5-18f5-4645-8626-03f231f99f03","Type":"ContainerStarted","Data":"3d075c999ac56a272c8241240052d1c6ec3450ff98e7174795b8e3a6b89162b8"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.882744 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-bfm8s" Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.884882 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea","Type":"ContainerStarted","Data":"97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.890236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"74e7c2f9-5486-4c21-a0b7-07c81d85a24c","Type":"ContainerStarted","Data":"4f360270643133e3341091a9416bbc2ba46f23a4951802ac45c190664297de77"} Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.898637 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.696327449 podStartE2EDuration="26.898612776s" podCreationTimestamp="2026-03-13 15:22:44 +0000 UTC" firstStartedPulling="2026-03-13 15:23:01.08856311 +0000 UTC m=+1211.251774921" lastFinishedPulling="2026-03-13 15:23:10.290848437 +0000 UTC m=+1220.454060248" observedRunningTime="2026-03-13 15:23:10.893354996 +0000 UTC m=+1221.056566817" watchObservedRunningTime="2026-03-13 15:23:10.898612776 +0000 UTC m=+1221.061824597" Mar 13 15:23:10 crc kubenswrapper[4786]: I0313 15:23:10.976359 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.204926601 podStartE2EDuration="28.976344402s" podCreationTimestamp="2026-03-13 15:22:42 +0000 UTC" firstStartedPulling="2026-03-13 15:23:00.942149342 +0000 UTC m=+1211.105361153" lastFinishedPulling="2026-03-13 15:23:09.713567143 +0000 UTC m=+1219.876778954" observedRunningTime="2026-03-13 15:23:10.970723283 +0000 UTC m=+1221.133935104" watchObservedRunningTime="2026-03-13 15:23:10.976344402 +0000 UTC m=+1221.139556213" Mar 13 15:23:11 crc kubenswrapper[4786]: I0313 15:23:11.015539 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bfm8s" podStartSLOduration=13.25496638 podStartE2EDuration="22.015519993s" podCreationTimestamp="2026-03-13 15:22:49 +0000 UTC" firstStartedPulling="2026-03-13 15:23:01.054642429 +0000 UTC m=+1211.217854240" lastFinishedPulling="2026-03-13 15:23:09.815196042 +0000 UTC m=+1219.978407853" observedRunningTime="2026-03-13 15:23:11.008803166 +0000 UTC m=+1221.172014997" watchObservedRunningTime="2026-03-13 15:23:11.015519993 +0000 UTC 
m=+1221.178731804" Mar 13 15:23:11 crc kubenswrapper[4786]: I0313 15:23:11.902563 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerStarted","Data":"5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5"} Mar 13 15:23:11 crc kubenswrapper[4786]: I0313 15:23:11.902907 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerStarted","Data":"679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716"} Mar 13 15:23:11 crc kubenswrapper[4786]: I0313 15:23:11.906449 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:23:11 crc kubenswrapper[4786]: I0313 15:23:11.906890 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:23:11 crc kubenswrapper[4786]: I0313 15:23:11.953957 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2hb98" podStartSLOduration=15.938263644 podStartE2EDuration="22.953932183s" podCreationTimestamp="2026-03-13 15:22:49 +0000 UTC" firstStartedPulling="2026-03-13 15:23:02.697175507 +0000 UTC m=+1212.860387318" lastFinishedPulling="2026-03-13 15:23:09.712844026 +0000 UTC m=+1219.876055857" observedRunningTime="2026-03-13 15:23:11.941599348 +0000 UTC m=+1222.104811159" watchObservedRunningTime="2026-03-13 15:23:11.953932183 +0000 UTC m=+1222.117144024" Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.930259 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea","Type":"ContainerStarted","Data":"2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0"} Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.933765 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerID="4f360270643133e3341091a9416bbc2ba46f23a4951802ac45c190664297de77" exitCode=0 Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.933824 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74e7c2f9-5486-4c21-a0b7-07c81d85a24c","Type":"ContainerDied","Data":"4f360270643133e3341091a9416bbc2ba46f23a4951802ac45c190664297de77"} Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.937440 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"73a4439f-50e7-4620-bf95-48d591ec6e3a","Type":"ContainerStarted","Data":"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78"} Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.940019 4786 generic.go:334] "Generic (PLEG): container finished" podID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerID="8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899" exitCode=0 Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.940105 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" event={"ID":"1753bbf8-ecfc-4ad6-b87f-9e38b8372862","Type":"ContainerDied","Data":"8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899"} Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.942924 4786 generic.go:334] "Generic (PLEG): container finished" podID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerID="225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629" exitCode=0 Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.942986 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" event={"ID":"1030ecb0-1b6e-4344-b7c2-4b544e63edc7","Type":"ContainerDied","Data":"225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629"} Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.947180 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerID="0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa" exitCode=0 Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.947442 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45c96441-7032-49b6-b5fe-129ed26c4e38","Type":"ContainerDied","Data":"0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa"} Mar 13 15:23:14 crc kubenswrapper[4786]: I0313 15:23:14.972349 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.520998112000001 podStartE2EDuration="24.97227187s" podCreationTimestamp="2026-03-13 15:22:50 +0000 UTC" firstStartedPulling="2026-03-13 15:23:01.33557225 +0000 UTC m=+1211.498784101" lastFinishedPulling="2026-03-13 15:23:13.786846008 +0000 UTC m=+1223.950057859" observedRunningTime="2026-03-13 15:23:14.966779184 +0000 UTC m=+1225.129991015" watchObservedRunningTime="2026-03-13 15:23:14.97227187 +0000 UTC m=+1225.135483681" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.001277 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.242919722 podStartE2EDuration="28.001261278s" podCreationTimestamp="2026-03-13 15:22:47 +0000 UTC" firstStartedPulling="2026-03-13 15:23:00.945603408 +0000 UTC m=+1211.108815219" lastFinishedPulling="2026-03-13 15:23:13.703944964 +0000 UTC m=+1223.867156775" observedRunningTime="2026-03-13 15:23:14.994923881 +0000 UTC m=+1225.158135712" watchObservedRunningTime="2026-03-13 15:23:15.001261278 +0000 UTC m=+1225.164473079" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.048789 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.460326 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.493767 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.879186 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.940162 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.963935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" event={"ID":"1030ecb0-1b6e-4344-b7c2-4b544e63edc7","Type":"ContainerStarted","Data":"07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2"} Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.964478 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.970526 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45c96441-7032-49b6-b5fe-129ed26c4e38","Type":"ContainerStarted","Data":"28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83"} Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.977370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74e7c2f9-5486-4c21-a0b7-07c81d85a24c","Type":"ContainerStarted","Data":"ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86"} Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.988065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" 
event={"ID":"1753bbf8-ecfc-4ad6-b87f-9e38b8372862","Type":"ContainerStarted","Data":"8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9"} Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.988137 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.988686 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:23:15 crc kubenswrapper[4786]: I0313 15:23:15.989526 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.030619 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" podStartSLOduration=3.80018081 podStartE2EDuration="39.030578152s" podCreationTimestamp="2026-03-13 15:22:37 +0000 UTC" firstStartedPulling="2026-03-13 15:22:38.558150219 +0000 UTC m=+1188.721362030" lastFinishedPulling="2026-03-13 15:23:13.788547561 +0000 UTC m=+1223.951759372" observedRunningTime="2026-03-13 15:23:16.004660209 +0000 UTC m=+1226.167872020" watchObservedRunningTime="2026-03-13 15:23:16.030578152 +0000 UTC m=+1226.193790003" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.054087 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.057033 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.065650 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" podStartSLOduration=-9223371998.789154 podStartE2EDuration="38.06562086s" podCreationTimestamp="2026-03-13 15:22:38 +0000 UTC" firstStartedPulling="2026-03-13 15:22:39.600838653 +0000 UTC 
m=+1189.764050464" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:16.031165636 +0000 UTC m=+1226.194377467" watchObservedRunningTime="2026-03-13 15:23:16.06562086 +0000 UTC m=+1226.228832681" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.088442 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.213173342 podStartE2EDuration="35.088422075s" podCreationTimestamp="2026-03-13 15:22:41 +0000 UTC" firstStartedPulling="2026-03-13 15:23:00.837644324 +0000 UTC m=+1211.000856135" lastFinishedPulling="2026-03-13 15:23:09.712893057 +0000 UTC m=+1219.876104868" observedRunningTime="2026-03-13 15:23:16.067157938 +0000 UTC m=+1226.230369779" watchObservedRunningTime="2026-03-13 15:23:16.088422075 +0000 UTC m=+1226.251633886" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.091594 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jc2x9"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.097298 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.100510 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.103413 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jc2x9"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.113696 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.715437169 podStartE2EDuration="37.11365581s" podCreationTimestamp="2026-03-13 15:22:39 +0000 UTC" firstStartedPulling="2026-03-13 15:23:00.315305311 +0000 UTC m=+1210.478517142" lastFinishedPulling="2026-03-13 15:23:09.713523952 +0000 UTC m=+1219.876735783" observedRunningTime="2026-03-13 15:23:16.096629848 +0000 UTC m=+1226.259841659" watchObservedRunningTime="2026-03-13 15:23:16.11365581 +0000 UTC m=+1226.276867931" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.260814 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-dddfc"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.275189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovn-rundir\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.275463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-combined-ca-bundle\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " 
pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.275580 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a45c6c-6521-40c9-af91-ac00f5427ce4-config\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.275663 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdlz\" (UniqueName: \"kubernetes.io/projected/a9a45c6c-6521-40c9-af91-ac00f5427ce4-kube-api-access-fhdlz\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.275820 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.275997 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovs-rundir\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.305196 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-p8tfg"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.306679 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.329849 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.339711 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-p8tfg"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.379609 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdlz\" (UniqueName: \"kubernetes.io/projected/a9a45c6c-6521-40c9-af91-ac00f5427ce4-kube-api-access-fhdlz\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.379740 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.379795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovs-rundir\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.379821 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovn-rundir\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc 
kubenswrapper[4786]: I0313 15:23:16.379869 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-combined-ca-bundle\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.379898 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a45c6c-6521-40c9-af91-ac00f5427ce4-config\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.380424 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovn-rundir\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.380686 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a45c6c-6521-40c9-af91-ac00f5427ce4-config\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.381016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovs-rundir\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.390261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.412112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdlz\" (UniqueName: \"kubernetes.io/projected/a9a45c6c-6521-40c9-af91-ac00f5427ce4-kube-api-access-fhdlz\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.416478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-combined-ca-bundle\") pod \"ovn-controller-metrics-jc2x9\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.423254 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.481694 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.481768 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdjch\" (UniqueName: \"kubernetes.io/projected/eaec89c6-bef1-40a3-a93c-81f79c08abdc-kube-api-access-gdjch\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.481816 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-config\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.481875 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.507782 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-mftfr"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.586592 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.586644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdjch\" (UniqueName: \"kubernetes.io/projected/eaec89c6-bef1-40a3-a93c-81f79c08abdc-kube-api-access-gdjch\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.586690 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-config\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.586732 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.587801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-ovsdbserver-nb\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.590521 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-config\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.599224 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-dns-svc\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.599680 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-54fl5"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.603746 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.615390 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.631451 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdjch\" (UniqueName: \"kubernetes.io/projected/eaec89c6-bef1-40a3-a93c-81f79c08abdc-kube-api-access-gdjch\") pod \"dnsmasq-dns-6f9f59f7c5-p8tfg\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.637277 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-54fl5"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.646989 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.689002 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-config\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.689092 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.689121 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.689260 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhq6m\" (UniqueName: \"kubernetes.io/projected/3a4c1521-eb16-4def-a417-a0c8687f85bb-kube-api-access-qhq6m\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.689293 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" 
(UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.740150 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.741710 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.748291 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.748556 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-p867r" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.748682 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.748787 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.774053 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.790316 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.790357 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 
13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.790421 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhq6m\" (UniqueName: \"kubernetes.io/projected/3a4c1521-eb16-4def-a417-a0c8687f85bb-kube-api-access-qhq6m\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.790439 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.790491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-config\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.791718 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-config\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.792614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-dns-svc\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.792988 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.793544 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.834703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhq6m\" (UniqueName: \"kubernetes.io/projected/3a4c1521-eb16-4def-a417-a0c8687f85bb-kube-api-access-qhq6m\") pod \"dnsmasq-dns-5d944d7b75-54fl5\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892039 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892303 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892348 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-config\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892388 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-scripts\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892435 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.892541 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrd25\" (UniqueName: \"kubernetes.io/projected/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-kube-api-access-xrd25\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-scripts\") pod \"ovn-northd-0\" (UID: 
\"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994804 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994828 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994914 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrd25\" (UniqueName: \"kubernetes.io/projected/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-kube-api-access-xrd25\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994953 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.994991 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-config\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.996365 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-scripts\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.996477 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.996580 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.996956 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-config\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:16 crc kubenswrapper[4786]: I0313 15:23:16.999152 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.000309 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.001023 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.017589 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrd25\" (UniqueName: \"kubernetes.io/projected/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-kube-api-access-xrd25\") pod \"ovn-northd-0\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " pod="openstack/ovn-northd-0" Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.055946 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jc2x9"] Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.088307 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 15:23:17 crc kubenswrapper[4786]: W0313 15:23:17.175073 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaec89c6_bef1_40a3_a93c_81f79c08abdc.slice/crio-b9c7b95507fbaf8915ad66b14b18666782a87e0dd2111ad381301abf52ed36c0 WatchSource:0}: Error finding container b9c7b95507fbaf8915ad66b14b18666782a87e0dd2111ad381301abf52ed36c0: Status 404 returned error can't find the container with id b9c7b95507fbaf8915ad66b14b18666782a87e0dd2111ad381301abf52ed36c0 Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.179053 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-p8tfg"] Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.432073 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-54fl5"] Mar 13 15:23:17 crc kubenswrapper[4786]: W0313 15:23:17.454786 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a4c1521_eb16_4def_a417_a0c8687f85bb.slice/crio-38fb1b0a8bf727470f01c7d6015fe69994125d189aac5797d3ee22b2404c7c9c WatchSource:0}: Error finding container 38fb1b0a8bf727470f01c7d6015fe69994125d189aac5797d3ee22b2404c7c9c: Status 404 returned error can't find the container with id 38fb1b0a8bf727470f01c7d6015fe69994125d189aac5797d3ee22b2404c7c9c Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.559380 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 15:23:17 crc kubenswrapper[4786]: W0313 15:23:17.575997 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02cf3ce6_6cec_451b_82c0_fdf9f1b7e10c.slice/crio-e862aef3a0d6193b8389161bb11d80336a8633537f33da76b2dd7fdb646b3609 WatchSource:0}: Error finding container 
e862aef3a0d6193b8389161bb11d80336a8633537f33da76b2dd7fdb646b3609: Status 404 returned error can't find the container with id e862aef3a0d6193b8389161bb11d80336a8633537f33da76b2dd7fdb646b3609 Mar 13 15:23:17 crc kubenswrapper[4786]: I0313 15:23:17.892065 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.000692 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c","Type":"ContainerStarted","Data":"e862aef3a0d6193b8389161bb11d80336a8633537f33da76b2dd7fdb646b3609"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.002742 4786 generic.go:334] "Generic (PLEG): container finished" podID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerID="df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5" exitCode=0 Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.002787 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" event={"ID":"eaec89c6-bef1-40a3-a93c-81f79c08abdc","Type":"ContainerDied","Data":"df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.002820 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" event={"ID":"eaec89c6-bef1-40a3-a93c-81f79c08abdc","Type":"ContainerStarted","Data":"b9c7b95507fbaf8915ad66b14b18666782a87e0dd2111ad381301abf52ed36c0"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.004196 4786 generic.go:334] "Generic (PLEG): container finished" podID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerID="7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f" exitCode=0 Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.004242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" 
event={"ID":"3a4c1521-eb16-4def-a417-a0c8687f85bb","Type":"ContainerDied","Data":"7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.004257 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" event={"ID":"3a4c1521-eb16-4def-a417-a0c8687f85bb","Type":"ContainerStarted","Data":"38fb1b0a8bf727470f01c7d6015fe69994125d189aac5797d3ee22b2404c7c9c"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.007483 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jc2x9" event={"ID":"a9a45c6c-6521-40c9-af91-ac00f5427ce4","Type":"ContainerStarted","Data":"8a0e90c6c16997dc571cfff737c2e9e9f5438e0dcf64bd4368c4878cc5a75790"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.007512 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jc2x9" event={"ID":"a9a45c6c-6521-40c9-af91-ac00f5427ce4","Type":"ContainerStarted","Data":"0c476858a0770da223391ca2917b88793837a68ba8245bc851494ca2ed801653"} Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.007875 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerName="dnsmasq-dns" containerID="cri-o://07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2" gracePeriod=10 Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.008157 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerName="dnsmasq-dns" containerID="cri-o://8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9" gracePeriod=10 Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.051131 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jc2x9" 
podStartSLOduration=2.051113345 podStartE2EDuration="2.051113345s" podCreationTimestamp="2026-03-13 15:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:18.050933491 +0000 UTC m=+1228.214145322" watchObservedRunningTime="2026-03-13 15:23:18.051113345 +0000 UTC m=+1228.214325156" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.518918 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.527026 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.626045 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-dns-svc\") pod \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.626151 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-dns-svc\") pod \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.626184 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcs4x\" (UniqueName: \"kubernetes.io/projected/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-kube-api-access-hcs4x\") pod \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.626279 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-config\") pod \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.626304 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvbxc\" (UniqueName: \"kubernetes.io/projected/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-kube-api-access-bvbxc\") pod \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\" (UID: \"1030ecb0-1b6e-4344-b7c2-4b544e63edc7\") " Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.626326 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-config\") pod \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\" (UID: \"1753bbf8-ecfc-4ad6-b87f-9e38b8372862\") " Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.630678 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-kube-api-access-bvbxc" (OuterVolumeSpecName: "kube-api-access-bvbxc") pod "1030ecb0-1b6e-4344-b7c2-4b544e63edc7" (UID: "1030ecb0-1b6e-4344-b7c2-4b544e63edc7"). InnerVolumeSpecName "kube-api-access-bvbxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.631629 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-kube-api-access-hcs4x" (OuterVolumeSpecName: "kube-api-access-hcs4x") pod "1753bbf8-ecfc-4ad6-b87f-9e38b8372862" (UID: "1753bbf8-ecfc-4ad6-b87f-9e38b8372862"). InnerVolumeSpecName "kube-api-access-hcs4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.664416 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-config" (OuterVolumeSpecName: "config") pod "1753bbf8-ecfc-4ad6-b87f-9e38b8372862" (UID: "1753bbf8-ecfc-4ad6-b87f-9e38b8372862"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.665970 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1753bbf8-ecfc-4ad6-b87f-9e38b8372862" (UID: "1753bbf8-ecfc-4ad6-b87f-9e38b8372862"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.666930 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-config" (OuterVolumeSpecName: "config") pod "1030ecb0-1b6e-4344-b7c2-4b544e63edc7" (UID: "1030ecb0-1b6e-4344-b7c2-4b544e63edc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.670463 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1030ecb0-1b6e-4344-b7c2-4b544e63edc7" (UID: "1030ecb0-1b6e-4344-b7c2-4b544e63edc7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.727845 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.727900 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvbxc\" (UniqueName: \"kubernetes.io/projected/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-kube-api-access-bvbxc\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.727911 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.727919 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1030ecb0-1b6e-4344-b7c2-4b544e63edc7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.727928 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:18 crc kubenswrapper[4786]: I0313 15:23:18.727939 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcs4x\" (UniqueName: \"kubernetes.io/projected/1753bbf8-ecfc-4ad6-b87f-9e38b8372862-kube-api-access-hcs4x\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.016808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" event={"ID":"3a4c1521-eb16-4def-a417-a0c8687f85bb","Type":"ContainerStarted","Data":"92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922"} Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 
15:23:19.017069 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.019283 4786 generic.go:334] "Generic (PLEG): container finished" podID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerID="8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9" exitCode=0 Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.019331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" event={"ID":"1753bbf8-ecfc-4ad6-b87f-9e38b8372862","Type":"ContainerDied","Data":"8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9"} Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.019352 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" event={"ID":"1753bbf8-ecfc-4ad6-b87f-9e38b8372862","Type":"ContainerDied","Data":"cdc463af25ded7ebe95c78f280b93fb864353ec7f9bec06884e2b818ce5dd769"} Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.019367 4786 scope.go:117] "RemoveContainer" containerID="8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.019470 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-mftfr" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.040847 4786 generic.go:334] "Generic (PLEG): container finished" podID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerID="07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2" exitCode=0 Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.040921 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" event={"ID":"1030ecb0-1b6e-4344-b7c2-4b544e63edc7","Type":"ContainerDied","Data":"07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2"} Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.040943 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.040946 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-dddfc" event={"ID":"1030ecb0-1b6e-4344-b7c2-4b544e63edc7","Type":"ContainerDied","Data":"824bc3a16d48ffb99accac24b5d6a22200ee1f7752ecbc84d3856f65b76bbaf7"} Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.055149 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" podStartSLOduration=3.055133862 podStartE2EDuration="3.055133862s" podCreationTimestamp="2026-03-13 15:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:19.054070855 +0000 UTC m=+1229.217282666" watchObservedRunningTime="2026-03-13 15:23:19.055133862 +0000 UTC m=+1229.218345673" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.060182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" 
event={"ID":"eaec89c6-bef1-40a3-a93c-81f79c08abdc","Type":"ContainerStarted","Data":"3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef"} Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.060376 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.080659 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" podStartSLOduration=3.080639974 podStartE2EDuration="3.080639974s" podCreationTimestamp="2026-03-13 15:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:19.078216034 +0000 UTC m=+1229.241427845" watchObservedRunningTime="2026-03-13 15:23:19.080639974 +0000 UTC m=+1229.243851785" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.112718 4786 scope.go:117] "RemoveContainer" containerID="8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.154715 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-dddfc"] Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.155714 4786 scope.go:117] "RemoveContainer" containerID="8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9" Mar 13 15:23:19 crc kubenswrapper[4786]: E0313 15:23:19.156474 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9\": container with ID starting with 8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9 not found: ID does not exist" containerID="8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.156618 4786 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9"} err="failed to get container status \"8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9\": rpc error: code = NotFound desc = could not find container \"8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9\": container with ID starting with 8e7ef30eb6658a94d76a34385af093860495072105eb28104398fa9adc6b57f9 not found: ID does not exist" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.156733 4786 scope.go:117] "RemoveContainer" containerID="8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899" Mar 13 15:23:19 crc kubenswrapper[4786]: E0313 15:23:19.157190 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899\": container with ID starting with 8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899 not found: ID does not exist" containerID="8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.157270 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899"} err="failed to get container status \"8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899\": rpc error: code = NotFound desc = could not find container \"8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899\": container with ID starting with 8870dc6f2e4c0262edd10a3a95982cd7b2d13d3568eeb0021a80d591ac9f7899 not found: ID does not exist" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.157350 4786 scope.go:117] "RemoveContainer" containerID="07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.161372 4786 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-dddfc"] Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.166699 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-mftfr"] Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.171608 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-mftfr"] Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.187683 4786 scope.go:117] "RemoveContainer" containerID="225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.203177 4786 scope.go:117] "RemoveContainer" containerID="07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2" Mar 13 15:23:19 crc kubenswrapper[4786]: E0313 15:23:19.203532 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2\": container with ID starting with 07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2 not found: ID does not exist" containerID="07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.203568 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2"} err="failed to get container status \"07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2\": rpc error: code = NotFound desc = could not find container \"07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2\": container with ID starting with 07c5c95d5eee5b65265df0a5f93f37a2eb2f2174d7f3925f6bce48e411c32eb2 not found: ID does not exist" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.203608 4786 scope.go:117] "RemoveContainer" containerID="225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629" Mar 13 15:23:19 
crc kubenswrapper[4786]: E0313 15:23:19.204154 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629\": container with ID starting with 225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629 not found: ID does not exist" containerID="225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629" Mar 13 15:23:19 crc kubenswrapper[4786]: I0313 15:23:19.204186 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629"} err="failed to get container status \"225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629\": rpc error: code = NotFound desc = could not find container \"225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629\": container with ID starting with 225ccbcc0ed134f6939f3dda78cfb8d1b7866d446fbcf5cd18d1d6aa56c6a629 not found: ID does not exist" Mar 13 15:23:20 crc kubenswrapper[4786]: I0313 15:23:20.069849 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c","Type":"ContainerStarted","Data":"44fca03dce57cb826c43f7929ebd5d7925bdeb707e7610d77a9fed038f82c78f"} Mar 13 15:23:20 crc kubenswrapper[4786]: I0313 15:23:20.070248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c","Type":"ContainerStarted","Data":"c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1"} Mar 13 15:23:20 crc kubenswrapper[4786]: I0313 15:23:20.070275 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 15:23:20 crc kubenswrapper[4786]: I0313 15:23:20.093257 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.695421599 
podStartE2EDuration="4.093240033s" podCreationTimestamp="2026-03-13 15:23:16 +0000 UTC" firstStartedPulling="2026-03-13 15:23:17.579947161 +0000 UTC m=+1227.743158972" lastFinishedPulling="2026-03-13 15:23:18.977765595 +0000 UTC m=+1229.140977406" observedRunningTime="2026-03-13 15:23:20.092549616 +0000 UTC m=+1230.255761427" watchObservedRunningTime="2026-03-13 15:23:20.093240033 +0000 UTC m=+1230.256451844" Mar 13 15:23:20 crc kubenswrapper[4786]: I0313 15:23:20.575100 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" path="/var/lib/kubelet/pods/1030ecb0-1b6e-4344-b7c2-4b544e63edc7/volumes" Mar 13 15:23:20 crc kubenswrapper[4786]: I0313 15:23:20.575817 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" path="/var/lib/kubelet/pods/1753bbf8-ecfc-4ad6-b87f-9e38b8372862/volumes" Mar 13 15:23:21 crc kubenswrapper[4786]: I0313 15:23:21.385209 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 15:23:21 crc kubenswrapper[4786]: I0313 15:23:21.385276 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 15:23:21 crc kubenswrapper[4786]: I0313 15:23:21.455885 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 15:23:22 crc kubenswrapper[4786]: I0313 15:23:22.193795 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 15:23:22 crc kubenswrapper[4786]: I0313 15:23:22.637793 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 15:23:22 crc kubenswrapper[4786]: I0313 15:23:22.637843 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 15:23:22 crc kubenswrapper[4786]: I0313 
15:23:22.722879 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.217796 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.922570 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4eba-account-create-update-d9lrg"] Mar 13 15:23:23 crc kubenswrapper[4786]: E0313 15:23:23.922993 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerName="dnsmasq-dns" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923008 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerName="dnsmasq-dns" Mar 13 15:23:23 crc kubenswrapper[4786]: E0313 15:23:23.923053 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerName="init" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923062 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerName="init" Mar 13 15:23:23 crc kubenswrapper[4786]: E0313 15:23:23.923081 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerName="dnsmasq-dns" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923089 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerName="dnsmasq-dns" Mar 13 15:23:23 crc kubenswrapper[4786]: E0313 15:23:23.923107 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerName="init" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923114 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" 
containerName="init" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923264 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1753bbf8-ecfc-4ad6-b87f-9e38b8372862" containerName="dnsmasq-dns" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923288 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1030ecb0-1b6e-4344-b7c2-4b544e63edc7" containerName="dnsmasq-dns" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.923914 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.929461 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.930843 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4eba-account-create-update-d9lrg"] Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.961055 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-x4fwm"] Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.962217 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:23 crc kubenswrapper[4786]: I0313 15:23:23.995211 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-x4fwm"] Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.030534 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6430e36a-b911-434c-8d5b-41bf29fa781f-operator-scripts\") pod \"keystone-4eba-account-create-update-d9lrg\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.031008 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzvp\" (UniqueName: \"kubernetes.io/projected/6430e36a-b911-434c-8d5b-41bf29fa781f-kube-api-access-ctzvp\") pod \"keystone-4eba-account-create-update-d9lrg\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.141363 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzvp\" (UniqueName: \"kubernetes.io/projected/6430e36a-b911-434c-8d5b-41bf29fa781f-kube-api-access-ctzvp\") pod \"keystone-4eba-account-create-update-d9lrg\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.141436 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-operator-scripts\") pod \"keystone-db-create-x4fwm\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 
15:23:24.141471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6430e36a-b911-434c-8d5b-41bf29fa781f-operator-scripts\") pod \"keystone-4eba-account-create-update-d9lrg\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.141561 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trw92\" (UniqueName: \"kubernetes.io/projected/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-kube-api-access-trw92\") pod \"keystone-db-create-x4fwm\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.142882 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6430e36a-b911-434c-8d5b-41bf29fa781f-operator-scripts\") pod \"keystone-4eba-account-create-update-d9lrg\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.146498 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xlfs4"] Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.147558 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.152812 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xlfs4"] Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.176488 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzvp\" (UniqueName: \"kubernetes.io/projected/6430e36a-b911-434c-8d5b-41bf29fa781f-kube-api-access-ctzvp\") pod \"keystone-4eba-account-create-update-d9lrg\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.243448 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-operator-scripts\") pod \"keystone-db-create-x4fwm\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.243548 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkwpf\" (UniqueName: \"kubernetes.io/projected/bda4c1d6-a015-4455-8462-cd93e9fed73e-kube-api-access-nkwpf\") pod \"placement-db-create-xlfs4\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.243570 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda4c1d6-a015-4455-8462-cd93e9fed73e-operator-scripts\") pod \"placement-db-create-xlfs4\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.243589 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-trw92\" (UniqueName: \"kubernetes.io/projected/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-kube-api-access-trw92\") pod \"keystone-db-create-x4fwm\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.246267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-operator-scripts\") pod \"keystone-db-create-x4fwm\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.265407 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.277606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trw92\" (UniqueName: \"kubernetes.io/projected/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-kube-api-access-trw92\") pod \"keystone-db-create-x4fwm\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.287417 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e617-account-create-update-jrn5z"] Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.288429 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.290449 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.290714 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.296010 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e617-account-create-update-jrn5z"] Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.344743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkwpf\" (UniqueName: \"kubernetes.io/projected/bda4c1d6-a015-4455-8462-cd93e9fed73e-kube-api-access-nkwpf\") pod \"placement-db-create-xlfs4\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.344789 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda4c1d6-a015-4455-8462-cd93e9fed73e-operator-scripts\") pod \"placement-db-create-xlfs4\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.345491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda4c1d6-a015-4455-8462-cd93e9fed73e-operator-scripts\") pod \"placement-db-create-xlfs4\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.366364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkwpf\" (UniqueName: \"kubernetes.io/projected/bda4c1d6-a015-4455-8462-cd93e9fed73e-kube-api-access-nkwpf\") pod \"placement-db-create-xlfs4\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.446021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dcd170a9-ab19-4fbc-8948-6810a0ba8615-operator-scripts\") pod \"placement-e617-account-create-update-jrn5z\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.446063 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgfb\" (UniqueName: \"kubernetes.io/projected/dcd170a9-ab19-4fbc-8948-6810a0ba8615-kube-api-access-ddgfb\") pod \"placement-e617-account-create-update-jrn5z\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.477173 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.548286 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd170a9-ab19-4fbc-8948-6810a0ba8615-operator-scripts\") pod \"placement-e617-account-create-update-jrn5z\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.548327 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgfb\" (UniqueName: \"kubernetes.io/projected/dcd170a9-ab19-4fbc-8948-6810a0ba8615-kube-api-access-ddgfb\") pod \"placement-e617-account-create-update-jrn5z\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.549252 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd170a9-ab19-4fbc-8948-6810a0ba8615-operator-scripts\") pod 
\"placement-e617-account-create-update-jrn5z\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.570536 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgfb\" (UniqueName: \"kubernetes.io/projected/dcd170a9-ab19-4fbc-8948-6810a0ba8615-kube-api-access-ddgfb\") pod \"placement-e617-account-create-update-jrn5z\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.654316 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.750975 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-x4fwm"] Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.893254 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4eba-account-create-update-d9lrg"] Mar 13 15:23:24 crc kubenswrapper[4786]: W0313 15:23:24.949867 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6430e36a_b911_434c_8d5b_41bf29fa781f.slice/crio-ba5780845e1b7331a3d6674f51cd019ace79f6babee047d55524fe52e9080d97 WatchSource:0}: Error finding container ba5780845e1b7331a3d6674f51cd019ace79f6babee047d55524fe52e9080d97: Status 404 returned error can't find the container with id ba5780845e1b7331a3d6674f51cd019ace79f6babee047d55524fe52e9080d97 Mar 13 15:23:24 crc kubenswrapper[4786]: I0313 15:23:24.960030 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xlfs4"] Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.100538 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-54fl5"] Mar 13 15:23:25 crc 
kubenswrapper[4786]: I0313 15:23:25.100837 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerName="dnsmasq-dns" containerID="cri-o://92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922" gracePeriod=10 Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.103405 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.170338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlfs4" event={"ID":"bda4c1d6-a015-4455-8462-cd93e9fed73e","Type":"ContainerStarted","Data":"dc292fe2a2e545fa31b47f663e8e573013586844c65b73078652d55db147193c"} Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.176887 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-hszcv"] Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.178449 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4eba-account-create-update-d9lrg" event={"ID":"6430e36a-b911-434c-8d5b-41bf29fa781f","Type":"ContainerStarted","Data":"ba5780845e1b7331a3d6674f51cd019ace79f6babee047d55524fe52e9080d97"} Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.178570 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.185520 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x4fwm" event={"ID":"fbf5dc57-49cf-4a6a-97ff-4a946db6823d","Type":"ContainerStarted","Data":"4e8b97f387f4fc2e4f6ac5319be8d84c34c6b50c92e4e873bda672174a618a08"} Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.185572 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x4fwm" event={"ID":"fbf5dc57-49cf-4a6a-97ff-4a946db6823d","Type":"ContainerStarted","Data":"1afac10b0190c90b7cb4571d6b09e0968d94c467f028c40b428956c151b1af2c"} Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.201303 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-hszcv"] Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.233518 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e617-account-create-update-jrn5z"] Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.245093 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-x4fwm" podStartSLOduration=2.245071816 podStartE2EDuration="2.245071816s" podCreationTimestamp="2026-03-13 15:23:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:25.22020057 +0000 UTC m=+1235.383412371" watchObservedRunningTime="2026-03-13 15:23:25.245071816 +0000 UTC m=+1235.408283627" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.261993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-config\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 
15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.262312 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6w87\" (UniqueName: \"kubernetes.io/projected/39015b4e-70c0-48e9-aad2-14cc102da742-kube-api-access-n6w87\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.262353 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.262389 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.262409 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.364979 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " 
pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.365028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.365085 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-config\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.365165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6w87\" (UniqueName: \"kubernetes.io/projected/39015b4e-70c0-48e9-aad2-14cc102da742-kube-api-access-n6w87\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.365210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.366161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc 
kubenswrapper[4786]: I0313 15:23:25.366532 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-config\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.367244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-dns-svc\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.369780 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.411147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6w87\" (UniqueName: \"kubernetes.io/projected/39015b4e-70c0-48e9-aad2-14cc102da742-kube-api-access-n6w87\") pod \"dnsmasq-dns-7b9fd7d84c-hszcv\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.613495 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.684610 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.771374 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhq6m\" (UniqueName: \"kubernetes.io/projected/3a4c1521-eb16-4def-a417-a0c8687f85bb-kube-api-access-qhq6m\") pod \"3a4c1521-eb16-4def-a417-a0c8687f85bb\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.771437 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-sb\") pod \"3a4c1521-eb16-4def-a417-a0c8687f85bb\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.771507 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-dns-svc\") pod \"3a4c1521-eb16-4def-a417-a0c8687f85bb\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.771550 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-config\") pod \"3a4c1521-eb16-4def-a417-a0c8687f85bb\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.771573 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-nb\") pod \"3a4c1521-eb16-4def-a417-a0c8687f85bb\" (UID: \"3a4c1521-eb16-4def-a417-a0c8687f85bb\") " Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.782222 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3a4c1521-eb16-4def-a417-a0c8687f85bb-kube-api-access-qhq6m" (OuterVolumeSpecName: "kube-api-access-qhq6m") pod "3a4c1521-eb16-4def-a417-a0c8687f85bb" (UID: "3a4c1521-eb16-4def-a417-a0c8687f85bb"). InnerVolumeSpecName "kube-api-access-qhq6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.817205 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-config" (OuterVolumeSpecName: "config") pod "3a4c1521-eb16-4def-a417-a0c8687f85bb" (UID: "3a4c1521-eb16-4def-a417-a0c8687f85bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.817420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a4c1521-eb16-4def-a417-a0c8687f85bb" (UID: "3a4c1521-eb16-4def-a417-a0c8687f85bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.820783 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a4c1521-eb16-4def-a417-a0c8687f85bb" (UID: "3a4c1521-eb16-4def-a417-a0c8687f85bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.821415 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a4c1521-eb16-4def-a417-a0c8687f85bb" (UID: "3a4c1521-eb16-4def-a417-a0c8687f85bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.873090 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhq6m\" (UniqueName: \"kubernetes.io/projected/3a4c1521-eb16-4def-a417-a0c8687f85bb-kube-api-access-qhq6m\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.873126 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.873139 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.873150 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:25 crc kubenswrapper[4786]: I0313 15:23:25.873162 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a4c1521-eb16-4def-a417-a0c8687f85bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.078826 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-hszcv"] Mar 13 15:23:26 crc kubenswrapper[4786]: W0313 15:23:26.081398 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39015b4e_70c0_48e9_aad2_14cc102da742.slice/crio-aa1eebac2e81fd4b92eb0019d24ba92b1f1f5983093cecb9a98deee4250d3a08 WatchSource:0}: Error finding container aa1eebac2e81fd4b92eb0019d24ba92b1f1f5983093cecb9a98deee4250d3a08: Status 404 returned error can't find 
the container with id aa1eebac2e81fd4b92eb0019d24ba92b1f1f5983093cecb9a98deee4250d3a08 Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.212230 4786 generic.go:334] "Generic (PLEG): container finished" podID="bda4c1d6-a015-4455-8462-cd93e9fed73e" containerID="c808c6542f268f29e39e9d5e67ec3597a53f8eb001ef4a8af43f8bcd4c925919" exitCode=0 Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.212373 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlfs4" event={"ID":"bda4c1d6-a015-4455-8462-cd93e9fed73e","Type":"ContainerDied","Data":"c808c6542f268f29e39e9d5e67ec3597a53f8eb001ef4a8af43f8bcd4c925919"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.214616 4786 generic.go:334] "Generic (PLEG): container finished" podID="6430e36a-b911-434c-8d5b-41bf29fa781f" containerID="0cf605181558cf4d31fe583a542de7db9acfb6209a581890ce5a50ea1ba5c372" exitCode=0 Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.214694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4eba-account-create-update-d9lrg" event={"ID":"6430e36a-b911-434c-8d5b-41bf29fa781f","Type":"ContainerDied","Data":"0cf605181558cf4d31fe583a542de7db9acfb6209a581890ce5a50ea1ba5c372"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.219577 4786 generic.go:334] "Generic (PLEG): container finished" podID="dcd170a9-ab19-4fbc-8948-6810a0ba8615" containerID="cfade56165e64c5564e292137f43c0b63c9976a2de22095e5f66321f4a9d1220" exitCode=0 Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.219702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e617-account-create-update-jrn5z" event={"ID":"dcd170a9-ab19-4fbc-8948-6810a0ba8615","Type":"ContainerDied","Data":"cfade56165e64c5564e292137f43c0b63c9976a2de22095e5f66321f4a9d1220"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.219735 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e617-account-create-update-jrn5z" 
event={"ID":"dcd170a9-ab19-4fbc-8948-6810a0ba8615","Type":"ContainerStarted","Data":"e1f96b385a4f61ec277e43741470fc9d2157e2906b208ee86e86a80a07b47024"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.232484 4786 generic.go:334] "Generic (PLEG): container finished" podID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerID="92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922" exitCode=0 Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.232575 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.233669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" event={"ID":"3a4c1521-eb16-4def-a417-a0c8687f85bb","Type":"ContainerDied","Data":"92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.234092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d944d7b75-54fl5" event={"ID":"3a4c1521-eb16-4def-a417-a0c8687f85bb","Type":"ContainerDied","Data":"38fb1b0a8bf727470f01c7d6015fe69994125d189aac5797d3ee22b2404c7c9c"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.234137 4786 scope.go:117] "RemoveContainer" containerID="92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.239256 4786 generic.go:334] "Generic (PLEG): container finished" podID="fbf5dc57-49cf-4a6a-97ff-4a946db6823d" containerID="4e8b97f387f4fc2e4f6ac5319be8d84c34c6b50c92e4e873bda672174a618a08" exitCode=0 Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.239340 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x4fwm" event={"ID":"fbf5dc57-49cf-4a6a-97ff-4a946db6823d","Type":"ContainerDied","Data":"4e8b97f387f4fc2e4f6ac5319be8d84c34c6b50c92e4e873bda672174a618a08"} Mar 13 15:23:26 crc 
kubenswrapper[4786]: I0313 15:23:26.260001 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" event={"ID":"39015b4e-70c0-48e9-aad2-14cc102da742","Type":"ContainerStarted","Data":"aa1eebac2e81fd4b92eb0019d24ba92b1f1f5983093cecb9a98deee4250d3a08"} Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.276634 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.277487 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerName="init" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.277506 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerName="init" Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.277521 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerName="dnsmasq-dns" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.277527 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerName="dnsmasq-dns" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.277781 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" containerName="dnsmasq-dns" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.280845 4786 scope.go:117] "RemoveContainer" containerID="7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.285841 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.292758 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.292807 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.292780 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6kbzr" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.302148 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.316459 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.318602 4786 scope.go:117] "RemoveContainer" containerID="92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922" Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.319120 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922\": container with ID starting with 92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922 not found: ID does not exist" containerID="92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.319166 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922"} err="failed to get container status \"92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922\": rpc error: code = NotFound desc = could not find container \"92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922\": container with ID 
starting with 92431921fb7f9efc0ccc42b492e3b3c35363be67e041e969aeb2c74818904922 not found: ID does not exist" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.319193 4786 scope.go:117] "RemoveContainer" containerID="7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f" Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.319478 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f\": container with ID starting with 7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f not found: ID does not exist" containerID="7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.319813 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f"} err="failed to get container status \"7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f\": rpc error: code = NotFound desc = could not find container \"7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f\": container with ID starting with 7191a11234d409263d9b50ae0ae8211431956b3e4ef316f294cd93eb325b895f not found: ID does not exist" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.330104 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-54fl5"] Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.338130 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d944d7b75-54fl5"] Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.383197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " 
pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.383455 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e503bc45-db60-4bc8-bb97-3472d2456fdb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.383615 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvbr8\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-kube-api-access-jvbr8\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.383697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.383787 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-lock\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.383870 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-cache\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.485414 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvbr8\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-kube-api-access-jvbr8\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.485476 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.485523 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-lock\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.485545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-cache\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.485601 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.485634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e503bc45-db60-4bc8-bb97-3472d2456fdb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.485708 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.485738 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.485793 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift podName:e503bc45-db60-4bc8-bb97-3472d2456fdb nodeName:}" failed. No retries permitted until 2026-03-13 15:23:26.985776418 +0000 UTC m=+1237.148988229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift") pod "swift-storage-0" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb") : configmap "swift-ring-files" not found Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.486190 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.486196 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-cache\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.488041 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-lock\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.495557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e503bc45-db60-4bc8-bb97-3472d2456fdb-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.501143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvbr8\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-kube-api-access-jvbr8\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.507675 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.563225 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a4c1521-eb16-4def-a417-a0c8687f85bb" path="/var/lib/kubelet/pods/3a4c1521-eb16-4def-a417-a0c8687f85bb/volumes" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.648737 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.741641 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-w8c7w"] Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.745329 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.749047 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.749426 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.749584 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.761077 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-w8c7w"] Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.894872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-combined-ca-bundle\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.894923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a2107c-c333-4776-ad2a-ce59edf18d04-etc-swift\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.894958 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-ring-data-devices\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 
15:23:26.895000 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-swiftconf\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.895102 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-scripts\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.895143 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrlzh\" (UniqueName: \"kubernetes.io/projected/17a2107c-c333-4776-ad2a-ce59edf18d04-kube-api-access-nrlzh\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.895166 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-dispersionconf\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996546 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-dispersionconf\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996677 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrlzh\" (UniqueName: \"kubernetes.io/projected/17a2107c-c333-4776-ad2a-ce59edf18d04-kube-api-access-nrlzh\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996783 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-combined-ca-bundle\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996816 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a2107c-c333-4776-ad2a-ce59edf18d04-etc-swift\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-ring-data-devices\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996850 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-swiftconf\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996906 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.996963 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-scripts\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.997751 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-scripts\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.998389 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.998409 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 15:23:26 crc kubenswrapper[4786]: E0313 15:23:26.998626 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift podName:e503bc45-db60-4bc8-bb97-3472d2456fdb nodeName:}" failed. No retries permitted until 2026-03-13 15:23:27.998609748 +0000 UTC m=+1238.161821559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift") pod "swift-storage-0" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb") : configmap "swift-ring-files" not found Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.998656 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-ring-data-devices\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:26 crc kubenswrapper[4786]: I0313 15:23:26.999306 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a2107c-c333-4776-ad2a-ce59edf18d04-etc-swift\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.000478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-swiftconf\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.003240 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-dispersionconf\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.003439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-combined-ca-bundle\") pod 
\"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.015397 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrlzh\" (UniqueName: \"kubernetes.io/projected/17a2107c-c333-4776-ad2a-ce59edf18d04-kube-api-access-nrlzh\") pod \"swift-ring-rebalance-w8c7w\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.066491 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.269803 4786 generic.go:334] "Generic (PLEG): container finished" podID="39015b4e-70c0-48e9-aad2-14cc102da742" containerID="dab01b53dc911359bf134047f019339fcdd078355c6fa62d83d79ebf64888823" exitCode=0 Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.269870 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" event={"ID":"39015b4e-70c0-48e9-aad2-14cc102da742","Type":"ContainerDied","Data":"dab01b53dc911359bf134047f019339fcdd078355c6fa62d83d79ebf64888823"} Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.511295 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-w8c7w"] Mar 13 15:23:27 crc kubenswrapper[4786]: W0313 15:23:27.532870 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17a2107c_c333_4776_ad2a_ce59edf18d04.slice/crio-dc5808266e2b9572939323da58864fee90962ef7de2a8f8b711dc51647299215 WatchSource:0}: Error finding container dc5808266e2b9572939323da58864fee90962ef7de2a8f8b711dc51647299215: Status 404 returned error can't find the container with id dc5808266e2b9572939323da58864fee90962ef7de2a8f8b711dc51647299215 Mar 13 15:23:27 crc 
kubenswrapper[4786]: I0313 15:23:27.625937 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.718618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddgfb\" (UniqueName: \"kubernetes.io/projected/dcd170a9-ab19-4fbc-8948-6810a0ba8615-kube-api-access-ddgfb\") pod \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.718780 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd170a9-ab19-4fbc-8948-6810a0ba8615-operator-scripts\") pod \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\" (UID: \"dcd170a9-ab19-4fbc-8948-6810a0ba8615\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.719627 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd170a9-ab19-4fbc-8948-6810a0ba8615-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcd170a9-ab19-4fbc-8948-6810a0ba8615" (UID: "dcd170a9-ab19-4fbc-8948-6810a0ba8615"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.724124 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd170a9-ab19-4fbc-8948-6810a0ba8615-kube-api-access-ddgfb" (OuterVolumeSpecName: "kube-api-access-ddgfb") pod "dcd170a9-ab19-4fbc-8948-6810a0ba8615" (UID: "dcd170a9-ab19-4fbc-8948-6810a0ba8615"). InnerVolumeSpecName "kube-api-access-ddgfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.784966 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.790794 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.814452 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.821236 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddgfb\" (UniqueName: \"kubernetes.io/projected/dcd170a9-ab19-4fbc-8948-6810a0ba8615-kube-api-access-ddgfb\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.821275 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd170a9-ab19-4fbc-8948-6810a0ba8615-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.922555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6430e36a-b911-434c-8d5b-41bf29fa781f-operator-scripts\") pod \"6430e36a-b911-434c-8d5b-41bf29fa781f\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.922596 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda4c1d6-a015-4455-8462-cd93e9fed73e-operator-scripts\") pod \"bda4c1d6-a015-4455-8462-cd93e9fed73e\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.922617 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkwpf\" (UniqueName: \"kubernetes.io/projected/bda4c1d6-a015-4455-8462-cd93e9fed73e-kube-api-access-nkwpf\") 
pod \"bda4c1d6-a015-4455-8462-cd93e9fed73e\" (UID: \"bda4c1d6-a015-4455-8462-cd93e9fed73e\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.922701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trw92\" (UniqueName: \"kubernetes.io/projected/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-kube-api-access-trw92\") pod \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.922798 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-operator-scripts\") pod \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\" (UID: \"fbf5dc57-49cf-4a6a-97ff-4a946db6823d\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.922820 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctzvp\" (UniqueName: \"kubernetes.io/projected/6430e36a-b911-434c-8d5b-41bf29fa781f-kube-api-access-ctzvp\") pod \"6430e36a-b911-434c-8d5b-41bf29fa781f\" (UID: \"6430e36a-b911-434c-8d5b-41bf29fa781f\") " Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.923150 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda4c1d6-a015-4455-8462-cd93e9fed73e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bda4c1d6-a015-4455-8462-cd93e9fed73e" (UID: "bda4c1d6-a015-4455-8462-cd93e9fed73e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.923221 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbf5dc57-49cf-4a6a-97ff-4a946db6823d" (UID: "fbf5dc57-49cf-4a6a-97ff-4a946db6823d"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.923456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6430e36a-b911-434c-8d5b-41bf29fa781f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6430e36a-b911-434c-8d5b-41bf29fa781f" (UID: "6430e36a-b911-434c-8d5b-41bf29fa781f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.927278 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6430e36a-b911-434c-8d5b-41bf29fa781f-kube-api-access-ctzvp" (OuterVolumeSpecName: "kube-api-access-ctzvp") pod "6430e36a-b911-434c-8d5b-41bf29fa781f" (UID: "6430e36a-b911-434c-8d5b-41bf29fa781f"). InnerVolumeSpecName "kube-api-access-ctzvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.927311 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-kube-api-access-trw92" (OuterVolumeSpecName: "kube-api-access-trw92") pod "fbf5dc57-49cf-4a6a-97ff-4a946db6823d" (UID: "fbf5dc57-49cf-4a6a-97ff-4a946db6823d"). InnerVolumeSpecName "kube-api-access-trw92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:27 crc kubenswrapper[4786]: I0313 15:23:27.928475 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda4c1d6-a015-4455-8462-cd93e9fed73e-kube-api-access-nkwpf" (OuterVolumeSpecName: "kube-api-access-nkwpf") pod "bda4c1d6-a015-4455-8462-cd93e9fed73e" (UID: "bda4c1d6-a015-4455-8462-cd93e9fed73e"). InnerVolumeSpecName "kube-api-access-nkwpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024638 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bda4c1d6-a015-4455-8462-cd93e9fed73e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024654 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkwpf\" (UniqueName: \"kubernetes.io/projected/bda4c1d6-a015-4455-8462-cd93e9fed73e-kube-api-access-nkwpf\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024667 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trw92\" (UniqueName: \"kubernetes.io/projected/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-kube-api-access-trw92\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024680 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbf5dc57-49cf-4a6a-97ff-4a946db6823d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024690 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctzvp\" (UniqueName: \"kubernetes.io/projected/6430e36a-b911-434c-8d5b-41bf29fa781f-kube-api-access-ctzvp\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.024702 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6430e36a-b911-434c-8d5b-41bf29fa781f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.024821 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.024836 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.024907 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift podName:e503bc45-db60-4bc8-bb97-3472d2456fdb nodeName:}" failed. No retries permitted until 2026-03-13 15:23:30.024889042 +0000 UTC m=+1240.188100853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift") pod "swift-storage-0" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb") : configmap "swift-ring-files" not found Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.026724 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kk4lt"] Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.027152 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6430e36a-b911-434c-8d5b-41bf29fa781f" containerName="mariadb-account-create-update" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027185 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6430e36a-b911-434c-8d5b-41bf29fa781f" containerName="mariadb-account-create-update" Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.027212 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda4c1d6-a015-4455-8462-cd93e9fed73e" containerName="mariadb-database-create" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027225 4786 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bda4c1d6-a015-4455-8462-cd93e9fed73e" containerName="mariadb-database-create" Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.027242 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf5dc57-49cf-4a6a-97ff-4a946db6823d" containerName="mariadb-database-create" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027252 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf5dc57-49cf-4a6a-97ff-4a946db6823d" containerName="mariadb-database-create" Mar 13 15:23:28 crc kubenswrapper[4786]: E0313 15:23:28.027274 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd170a9-ab19-4fbc-8948-6810a0ba8615" containerName="mariadb-account-create-update" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027284 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd170a9-ab19-4fbc-8948-6810a0ba8615" containerName="mariadb-account-create-update" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027526 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf5dc57-49cf-4a6a-97ff-4a946db6823d" containerName="mariadb-database-create" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027552 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd170a9-ab19-4fbc-8948-6810a0ba8615" containerName="mariadb-account-create-update" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027568 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda4c1d6-a015-4455-8462-cd93e9fed73e" containerName="mariadb-database-create" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.027590 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6430e36a-b911-434c-8d5b-41bf29fa781f" containerName="mariadb-account-create-update" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.028238 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.036722 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kk4lt"] Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.126741 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8576af97-74b4-4318-b12b-02c8b106a6eb-operator-scripts\") pod \"glance-db-create-kk4lt\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.126821 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4b6q\" (UniqueName: \"kubernetes.io/projected/8576af97-74b4-4318-b12b-02c8b106a6eb-kube-api-access-z4b6q\") pod \"glance-db-create-kk4lt\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.168004 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-92a2-account-create-update-x5fmd"] Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.168989 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.177951 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.180136 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-92a2-account-create-update-x5fmd"] Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.228764 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8576af97-74b4-4318-b12b-02c8b106a6eb-operator-scripts\") pod \"glance-db-create-kk4lt\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.228849 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4b6q\" (UniqueName: \"kubernetes.io/projected/8576af97-74b4-4318-b12b-02c8b106a6eb-kube-api-access-z4b6q\") pod \"glance-db-create-kk4lt\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.230207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8576af97-74b4-4318-b12b-02c8b106a6eb-operator-scripts\") pod \"glance-db-create-kk4lt\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.252743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4b6q\" (UniqueName: \"kubernetes.io/projected/8576af97-74b4-4318-b12b-02c8b106a6eb-kube-api-access-z4b6q\") pod \"glance-db-create-kk4lt\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 
15:23:28.287525 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" event={"ID":"39015b4e-70c0-48e9-aad2-14cc102da742","Type":"ContainerStarted","Data":"a5ee6500159e95cd3b50a5f36867482dcb62b6e4af81c6c2bc25a4b78e2ac686"} Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.289190 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8c7w" event={"ID":"17a2107c-c333-4776-ad2a-ce59edf18d04","Type":"ContainerStarted","Data":"dc5808266e2b9572939323da58864fee90962ef7de2a8f8b711dc51647299215"} Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.291091 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlfs4" event={"ID":"bda4c1d6-a015-4455-8462-cd93e9fed73e","Type":"ContainerDied","Data":"dc292fe2a2e545fa31b47f663e8e573013586844c65b73078652d55db147193c"} Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.291128 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc292fe2a2e545fa31b47f663e8e573013586844c65b73078652d55db147193c" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.291199 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlfs4" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.294624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4eba-account-create-update-d9lrg" event={"ID":"6430e36a-b911-434c-8d5b-41bf29fa781f","Type":"ContainerDied","Data":"ba5780845e1b7331a3d6674f51cd019ace79f6babee047d55524fe52e9080d97"} Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.294653 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba5780845e1b7331a3d6674f51cd019ace79f6babee047d55524fe52e9080d97" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.294700 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4eba-account-create-update-d9lrg" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.298708 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e617-account-create-update-jrn5z" event={"ID":"dcd170a9-ab19-4fbc-8948-6810a0ba8615","Type":"ContainerDied","Data":"e1f96b385a4f61ec277e43741470fc9d2157e2906b208ee86e86a80a07b47024"} Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.298771 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e617-account-create-update-jrn5z" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.298777 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1f96b385a4f61ec277e43741470fc9d2157e2906b208ee86e86a80a07b47024" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.300828 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-x4fwm" event={"ID":"fbf5dc57-49cf-4a6a-97ff-4a946db6823d","Type":"ContainerDied","Data":"1afac10b0190c90b7cb4571d6b09e0968d94c467f028c40b428956c151b1af2c"} Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.300879 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-x4fwm" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.300880 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1afac10b0190c90b7cb4571d6b09e0968d94c467f028c40b428956c151b1af2c" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.329904 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21fe0b9a-4302-4672-945d-e0195f3f86b5-operator-scripts\") pod \"glance-92a2-account-create-update-x5fmd\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.329952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrhtt\" (UniqueName: \"kubernetes.io/projected/21fe0b9a-4302-4672-945d-e0195f3f86b5-kube-api-access-wrhtt\") pod \"glance-92a2-account-create-update-x5fmd\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.349308 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.433705 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21fe0b9a-4302-4672-945d-e0195f3f86b5-operator-scripts\") pod \"glance-92a2-account-create-update-x5fmd\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.433753 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrhtt\" (UniqueName: \"kubernetes.io/projected/21fe0b9a-4302-4672-945d-e0195f3f86b5-kube-api-access-wrhtt\") pod \"glance-92a2-account-create-update-x5fmd\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.434842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21fe0b9a-4302-4672-945d-e0195f3f86b5-operator-scripts\") pod \"glance-92a2-account-create-update-x5fmd\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.454262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrhtt\" (UniqueName: \"kubernetes.io/projected/21fe0b9a-4302-4672-945d-e0195f3f86b5-kube-api-access-wrhtt\") pod \"glance-92a2-account-create-update-x5fmd\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.595093 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:28 crc kubenswrapper[4786]: I0313 15:23:28.800024 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kk4lt"] Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.057021 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-92a2-account-create-update-x5fmd"] Mar 13 15:23:29 crc kubenswrapper[4786]: W0313 15:23:29.074907 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21fe0b9a_4302_4672_945d_e0195f3f86b5.slice/crio-46708d84559cabeb3b075b4377cafa543bf2de2fe73dea31434c0fa2abddcf29 WatchSource:0}: Error finding container 46708d84559cabeb3b075b4377cafa543bf2de2fe73dea31434c0fa2abddcf29: Status 404 returned error can't find the container with id 46708d84559cabeb3b075b4377cafa543bf2de2fe73dea31434c0fa2abddcf29 Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.310953 4786 generic.go:334] "Generic (PLEG): container finished" podID="8576af97-74b4-4318-b12b-02c8b106a6eb" containerID="dc737315c5feeb30b31f19b951dc0b3012f4726fea5f38b613fa8c0ef34159c8" exitCode=0 Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.311013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kk4lt" event={"ID":"8576af97-74b4-4318-b12b-02c8b106a6eb","Type":"ContainerDied","Data":"dc737315c5feeb30b31f19b951dc0b3012f4726fea5f38b613fa8c0ef34159c8"} Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.311346 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kk4lt" event={"ID":"8576af97-74b4-4318-b12b-02c8b106a6eb","Type":"ContainerStarted","Data":"339652dd77c44634d3aeb4331f69afc72d8cc218c50f2f4fa34c2b0c5117cce8"} Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.314253 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-92a2-account-create-update-x5fmd" 
event={"ID":"21fe0b9a-4302-4672-945d-e0195f3f86b5","Type":"ContainerStarted","Data":"80be103a5b34c19bf82c7bb43c94d36cfe4c07dd80d0861ee4b562ee8a5e495d"} Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.314290 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-92a2-account-create-update-x5fmd" event={"ID":"21fe0b9a-4302-4672-945d-e0195f3f86b5","Type":"ContainerStarted","Data":"46708d84559cabeb3b075b4377cafa543bf2de2fe73dea31434c0fa2abddcf29"} Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.349102 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" podStartSLOduration=4.349082784 podStartE2EDuration="4.349082784s" podCreationTimestamp="2026-03-13 15:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:29.34257407 +0000 UTC m=+1239.505785881" watchObservedRunningTime="2026-03-13 15:23:29.349082784 +0000 UTC m=+1239.512294595" Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.366200 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-92a2-account-create-update-x5fmd" podStartSLOduration=1.366176833 podStartE2EDuration="1.366176833s" podCreationTimestamp="2026-03-13 15:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:29.359894835 +0000 UTC m=+1239.523106646" watchObservedRunningTime="2026-03-13 15:23:29.366176833 +0000 UTC m=+1239.529388644" Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.961558 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ndjhq"] Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.962527 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.964585 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 15:23:29 crc kubenswrapper[4786]: I0313 15:23:29.998851 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ndjhq"] Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.081419 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.081599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-operator-scripts\") pod \"root-account-create-update-ndjhq\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.081643 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkt7\" (UniqueName: \"kubernetes.io/projected/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-kube-api-access-vjkt7\") pod \"root-account-create-update-ndjhq\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: E0313 15:23:30.081744 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 15:23:30 crc kubenswrapper[4786]: E0313 15:23:30.081777 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" 
not found Mar 13 15:23:30 crc kubenswrapper[4786]: E0313 15:23:30.081849 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift podName:e503bc45-db60-4bc8-bb97-3472d2456fdb nodeName:}" failed. No retries permitted until 2026-03-13 15:23:34.081825229 +0000 UTC m=+1244.245037100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift") pod "swift-storage-0" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb") : configmap "swift-ring-files" not found Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.183719 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-operator-scripts\") pod \"root-account-create-update-ndjhq\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.183790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkt7\" (UniqueName: \"kubernetes.io/projected/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-kube-api-access-vjkt7\") pod \"root-account-create-update-ndjhq\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.184637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-operator-scripts\") pod \"root-account-create-update-ndjhq\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.206363 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vjkt7\" (UniqueName: \"kubernetes.io/projected/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-kube-api-access-vjkt7\") pod \"root-account-create-update-ndjhq\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.291372 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:30 crc kubenswrapper[4786]: I0313 15:23:30.613914 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.331968 4786 generic.go:334] "Generic (PLEG): container finished" podID="21fe0b9a-4302-4672-945d-e0195f3f86b5" containerID="80be103a5b34c19bf82c7bb43c94d36cfe4c07dd80d0861ee4b562ee8a5e495d" exitCode=0 Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.332013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-92a2-account-create-update-x5fmd" event={"ID":"21fe0b9a-4302-4672-945d-e0195f3f86b5","Type":"ContainerDied","Data":"80be103a5b34c19bf82c7bb43c94d36cfe4c07dd80d0861ee4b562ee8a5e495d"} Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.632481 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.818832 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8576af97-74b4-4318-b12b-02c8b106a6eb-operator-scripts\") pod \"8576af97-74b4-4318-b12b-02c8b106a6eb\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.819258 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4b6q\" (UniqueName: \"kubernetes.io/projected/8576af97-74b4-4318-b12b-02c8b106a6eb-kube-api-access-z4b6q\") pod \"8576af97-74b4-4318-b12b-02c8b106a6eb\" (UID: \"8576af97-74b4-4318-b12b-02c8b106a6eb\") " Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.819717 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8576af97-74b4-4318-b12b-02c8b106a6eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8576af97-74b4-4318-b12b-02c8b106a6eb" (UID: "8576af97-74b4-4318-b12b-02c8b106a6eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.825100 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8576af97-74b4-4318-b12b-02c8b106a6eb-kube-api-access-z4b6q" (OuterVolumeSpecName: "kube-api-access-z4b6q") pod "8576af97-74b4-4318-b12b-02c8b106a6eb" (UID: "8576af97-74b4-4318-b12b-02c8b106a6eb"). InnerVolumeSpecName "kube-api-access-z4b6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.921336 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4b6q\" (UniqueName: \"kubernetes.io/projected/8576af97-74b4-4318-b12b-02c8b106a6eb-kube-api-access-z4b6q\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.921370 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8576af97-74b4-4318-b12b-02c8b106a6eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:31 crc kubenswrapper[4786]: I0313 15:23:31.965261 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ndjhq"] Mar 13 15:23:31 crc kubenswrapper[4786]: W0313 15:23:31.969405 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d5bfd3e_21e4_4c35_bbb0_12a06604b64d.slice/crio-053e2bc1b0dfbd132649aff249d392e2719f8b51ce4b5081f37d7c6abad389e2 WatchSource:0}: Error finding container 053e2bc1b0dfbd132649aff249d392e2719f8b51ce4b5081f37d7c6abad389e2: Status 404 returned error can't find the container with id 053e2bc1b0dfbd132649aff249d392e2719f8b51ce4b5081f37d7c6abad389e2 Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.343420 4786 generic.go:334] "Generic (PLEG): container finished" podID="6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" containerID="1205a8ab420b0ae6beac6025bfc70200476d490c756137e4c5d99d2fb21dcf74" exitCode=0 Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.343700 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ndjhq" event={"ID":"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d","Type":"ContainerDied","Data":"1205a8ab420b0ae6beac6025bfc70200476d490c756137e4c5d99d2fb21dcf74"} Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.343807 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/root-account-create-update-ndjhq" event={"ID":"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d","Type":"ContainerStarted","Data":"053e2bc1b0dfbd132649aff249d392e2719f8b51ce4b5081f37d7c6abad389e2"} Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.346013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kk4lt" event={"ID":"8576af97-74b4-4318-b12b-02c8b106a6eb","Type":"ContainerDied","Data":"339652dd77c44634d3aeb4331f69afc72d8cc218c50f2f4fa34c2b0c5117cce8"} Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.346137 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="339652dd77c44634d3aeb4331f69afc72d8cc218c50f2f4fa34c2b0c5117cce8" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.346302 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kk4lt" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.349329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8c7w" event={"ID":"17a2107c-c333-4776-ad2a-ce59edf18d04","Type":"ContainerStarted","Data":"e8453d45bc476d4ec12b3b5bae95abb36c77cd13025a4bf085679749a3c0f337"} Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.396197 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-w8c7w" podStartSLOduration=2.416538065 podStartE2EDuration="6.396176288s" podCreationTimestamp="2026-03-13 15:23:26 +0000 UTC" firstStartedPulling="2026-03-13 15:23:27.535730447 +0000 UTC m=+1237.698942258" lastFinishedPulling="2026-03-13 15:23:31.51536867 +0000 UTC m=+1241.678580481" observedRunningTime="2026-03-13 15:23:32.384791251 +0000 UTC m=+1242.548003092" watchObservedRunningTime="2026-03-13 15:23:32.396176288 +0000 UTC m=+1242.559388109" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.626636 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.733075 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrhtt\" (UniqueName: \"kubernetes.io/projected/21fe0b9a-4302-4672-945d-e0195f3f86b5-kube-api-access-wrhtt\") pod \"21fe0b9a-4302-4672-945d-e0195f3f86b5\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.733178 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21fe0b9a-4302-4672-945d-e0195f3f86b5-operator-scripts\") pod \"21fe0b9a-4302-4672-945d-e0195f3f86b5\" (UID: \"21fe0b9a-4302-4672-945d-e0195f3f86b5\") " Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.733692 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21fe0b9a-4302-4672-945d-e0195f3f86b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21fe0b9a-4302-4672-945d-e0195f3f86b5" (UID: "21fe0b9a-4302-4672-945d-e0195f3f86b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.740192 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21fe0b9a-4302-4672-945d-e0195f3f86b5-kube-api-access-wrhtt" (OuterVolumeSpecName: "kube-api-access-wrhtt") pod "21fe0b9a-4302-4672-945d-e0195f3f86b5" (UID: "21fe0b9a-4302-4672-945d-e0195f3f86b5"). InnerVolumeSpecName "kube-api-access-wrhtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.835150 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrhtt\" (UniqueName: \"kubernetes.io/projected/21fe0b9a-4302-4672-945d-e0195f3f86b5-kube-api-access-wrhtt\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:32 crc kubenswrapper[4786]: I0313 15:23:32.835187 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21fe0b9a-4302-4672-945d-e0195f3f86b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.361635 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-92a2-account-create-update-x5fmd" event={"ID":"21fe0b9a-4302-4672-945d-e0195f3f86b5","Type":"ContainerDied","Data":"46708d84559cabeb3b075b4377cafa543bf2de2fe73dea31434c0fa2abddcf29"} Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.362019 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46708d84559cabeb3b075b4377cafa543bf2de2fe73dea31434c0fa2abddcf29" Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.361968 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-92a2-account-create-update-x5fmd" Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.832129 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.953427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-operator-scripts\") pod \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.953545 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjkt7\" (UniqueName: \"kubernetes.io/projected/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-kube-api-access-vjkt7\") pod \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\" (UID: \"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d\") " Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.954110 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" (UID: "6d5bfd3e-21e4-4c35-bbb0-12a06604b64d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:33 crc kubenswrapper[4786]: I0313 15:23:33.958160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-kube-api-access-vjkt7" (OuterVolumeSpecName: "kube-api-access-vjkt7") pod "6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" (UID: "6d5bfd3e-21e4-4c35-bbb0-12a06604b64d"). InnerVolumeSpecName "kube-api-access-vjkt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.055360 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjkt7\" (UniqueName: \"kubernetes.io/projected/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-kube-api-access-vjkt7\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.055405 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.156794 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0" Mar 13 15:23:34 crc kubenswrapper[4786]: E0313 15:23:34.157041 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 15:23:34 crc kubenswrapper[4786]: E0313 15:23:34.157063 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 15:23:34 crc kubenswrapper[4786]: E0313 15:23:34.157124 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift podName:e503bc45-db60-4bc8-bb97-3472d2456fdb nodeName:}" failed. No retries permitted until 2026-03-13 15:23:42.157107926 +0000 UTC m=+1252.320319747 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift") pod "swift-storage-0" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb") : configmap "swift-ring-files" not found Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.374352 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ndjhq" event={"ID":"6d5bfd3e-21e4-4c35-bbb0-12a06604b64d","Type":"ContainerDied","Data":"053e2bc1b0dfbd132649aff249d392e2719f8b51ce4b5081f37d7c6abad389e2"} Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.374399 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053e2bc1b0dfbd132649aff249d392e2719f8b51ce4b5081f37d7c6abad389e2" Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.374445 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ndjhq" Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.378043 4786 generic.go:334] "Generic (PLEG): container finished" podID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerID="8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d" exitCode=0 Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.378164 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f964a2e6-aad3-42c0-8290-c3aa52d99e5b","Type":"ContainerDied","Data":"8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d"} Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.385539 4786 generic.go:334] "Generic (PLEG): container finished" podID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerID="4c81517ba6f8b5efac24e0644f61c61845b3759183639397f2234090a8627707" exitCode=0 Mar 13 15:23:34 crc kubenswrapper[4786]: I0313 15:23:34.385604 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43","Type":"ContainerDied","Data":"4c81517ba6f8b5efac24e0644f61c61845b3759183639397f2234090a8627707"} Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.394510 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f964a2e6-aad3-42c0-8290-c3aa52d99e5b","Type":"ContainerStarted","Data":"542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e"} Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.395022 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.396827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43","Type":"ContainerStarted","Data":"789307556dd54b21497583b90b06c0b5ce70e7eed63ba1acf314c8edc36e15af"} Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.397088 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.423629 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.750754998 podStartE2EDuration="57.423615408s" podCreationTimestamp="2026-03-13 15:22:38 +0000 UTC" firstStartedPulling="2026-03-13 15:22:40.688949134 +0000 UTC m=+1190.852160945" lastFinishedPulling="2026-03-13 15:23:00.361809544 +0000 UTC m=+1210.525021355" observedRunningTime="2026-03-13 15:23:35.420508139 +0000 UTC m=+1245.583719970" watchObservedRunningTime="2026-03-13 15:23:35.423615408 +0000 UTC m=+1245.586827219" Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.454999 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.716114783 podStartE2EDuration="58.454979086s" podCreationTimestamp="2026-03-13 
15:22:37 +0000 UTC" firstStartedPulling="2026-03-13 15:22:39.709147347 +0000 UTC m=+1189.872359158" lastFinishedPulling="2026-03-13 15:23:00.44801165 +0000 UTC m=+1210.611223461" observedRunningTime="2026-03-13 15:23:35.448641627 +0000 UTC m=+1245.611853458" watchObservedRunningTime="2026-03-13 15:23:35.454979086 +0000 UTC m=+1245.618190917" Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.615345 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.680743 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-p8tfg"] Mar 13 15:23:35 crc kubenswrapper[4786]: I0313 15:23:35.681033 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerName="dnsmasq-dns" containerID="cri-o://3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef" gracePeriod=10 Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.184746 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.283690 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ndjhq"] Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.289389 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ndjhq"] Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.300669 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-config\") pod \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.300805 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdjch\" (UniqueName: \"kubernetes.io/projected/eaec89c6-bef1-40a3-a93c-81f79c08abdc-kube-api-access-gdjch\") pod \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.300978 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-dns-svc\") pod \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.301012 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-ovsdbserver-nb\") pod \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\" (UID: \"eaec89c6-bef1-40a3-a93c-81f79c08abdc\") " Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.307102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/eaec89c6-bef1-40a3-a93c-81f79c08abdc-kube-api-access-gdjch" (OuterVolumeSpecName: "kube-api-access-gdjch") pod "eaec89c6-bef1-40a3-a93c-81f79c08abdc" (UID: "eaec89c6-bef1-40a3-a93c-81f79c08abdc"). InnerVolumeSpecName "kube-api-access-gdjch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.344625 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eaec89c6-bef1-40a3-a93c-81f79c08abdc" (UID: "eaec89c6-bef1-40a3-a93c-81f79c08abdc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.354660 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eaec89c6-bef1-40a3-a93c-81f79c08abdc" (UID: "eaec89c6-bef1-40a3-a93c-81f79c08abdc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.355565 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-config" (OuterVolumeSpecName: "config") pod "eaec89c6-bef1-40a3-a93c-81f79c08abdc" (UID: "eaec89c6-bef1-40a3-a93c-81f79c08abdc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.402573 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.403846 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.403959 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaec89c6-bef1-40a3-a93c-81f79c08abdc-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.404020 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdjch\" (UniqueName: \"kubernetes.io/projected/eaec89c6-bef1-40a3-a93c-81f79c08abdc-kube-api-access-gdjch\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.406799 4786 generic.go:334] "Generic (PLEG): container finished" podID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerID="3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef" exitCode=0 Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.406885 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" event={"ID":"eaec89c6-bef1-40a3-a93c-81f79c08abdc","Type":"ContainerDied","Data":"3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef"} Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.406923 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" event={"ID":"eaec89c6-bef1-40a3-a93c-81f79c08abdc","Type":"ContainerDied","Data":"b9c7b95507fbaf8915ad66b14b18666782a87e0dd2111ad381301abf52ed36c0"} Mar 13 
15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.406941 4786 scope.go:117] "RemoveContainer" containerID="3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.407408 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f9f59f7c5-p8tfg" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.445939 4786 scope.go:117] "RemoveContainer" containerID="df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.454438 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-p8tfg"] Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.461027 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f9f59f7c5-p8tfg"] Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.475947 4786 scope.go:117] "RemoveContainer" containerID="3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef" Mar 13 15:23:36 crc kubenswrapper[4786]: E0313 15:23:36.476283 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef\": container with ID starting with 3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef not found: ID does not exist" containerID="3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.476413 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef"} err="failed to get container status \"3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef\": rpc error: code = NotFound desc = could not find container \"3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef\": container with ID 
starting with 3bc7ab6d917f999a14ea8489559d959c2931f1d17a8e3ae940410e0705d0bfef not found: ID does not exist" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.476515 4786 scope.go:117] "RemoveContainer" containerID="df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5" Mar 13 15:23:36 crc kubenswrapper[4786]: E0313 15:23:36.476801 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5\": container with ID starting with df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5 not found: ID does not exist" containerID="df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.476823 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5"} err="failed to get container status \"df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5\": rpc error: code = NotFound desc = could not find container \"df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5\": container with ID starting with df7a52d0a849ae778061d4956c703c4a212f75319621ec1fd92ff273596a1aa5 not found: ID does not exist" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.561709 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" path="/var/lib/kubelet/pods/6d5bfd3e-21e4-4c35-bbb0-12a06604b64d/volumes" Mar 13 15:23:36 crc kubenswrapper[4786]: I0313 15:23:36.562391 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" path="/var/lib/kubelet/pods/eaec89c6-bef1-40a3-a93c-81f79c08abdc/volumes" Mar 13 15:23:37 crc kubenswrapper[4786]: I0313 15:23:37.144931 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-northd-0" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.313335 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zg25m"] Mar 13 15:23:38 crc kubenswrapper[4786]: E0313 15:23:38.313907 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21fe0b9a-4302-4672-945d-e0195f3f86b5" containerName="mariadb-account-create-update" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.313921 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="21fe0b9a-4302-4672-945d-e0195f3f86b5" containerName="mariadb-account-create-update" Mar 13 15:23:38 crc kubenswrapper[4786]: E0313 15:23:38.313931 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8576af97-74b4-4318-b12b-02c8b106a6eb" containerName="mariadb-database-create" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.313939 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8576af97-74b4-4318-b12b-02c8b106a6eb" containerName="mariadb-database-create" Mar 13 15:23:38 crc kubenswrapper[4786]: E0313 15:23:38.313949 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerName="dnsmasq-dns" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.313955 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerName="dnsmasq-dns" Mar 13 15:23:38 crc kubenswrapper[4786]: E0313 15:23:38.313975 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerName="init" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.313981 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerName="init" Mar 13 15:23:38 crc kubenswrapper[4786]: E0313 15:23:38.313994 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" 
containerName="mariadb-account-create-update" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.314000 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" containerName="mariadb-account-create-update" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.314135 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaec89c6-bef1-40a3-a93c-81f79c08abdc" containerName="dnsmasq-dns" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.314145 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5bfd3e-21e4-4c35-bbb0-12a06604b64d" containerName="mariadb-account-create-update" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.314156 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8576af97-74b4-4318-b12b-02c8b106a6eb" containerName="mariadb-database-create" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.314166 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="21fe0b9a-4302-4672-945d-e0195f3f86b5" containerName="mariadb-account-create-update" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.314652 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.317250 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cd7b7" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.320907 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.323594 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zg25m"] Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.439041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzg4p\" (UniqueName: \"kubernetes.io/projected/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-kube-api-access-vzg4p\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.439213 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-config-data\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.439348 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-db-sync-config-data\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.439660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-combined-ca-bundle\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.541208 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzg4p\" (UniqueName: \"kubernetes.io/projected/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-kube-api-access-vzg4p\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.541472 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-config-data\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.541562 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-db-sync-config-data\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.541662 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-combined-ca-bundle\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.547405 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-combined-ca-bundle\") pod \"glance-db-sync-zg25m\" (UID: 
\"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.548181 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-config-data\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.551498 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-db-sync-config-data\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.582616 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzg4p\" (UniqueName: \"kubernetes.io/projected/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-kube-api-access-vzg4p\") pod \"glance-db-sync-zg25m\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") " pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:38 crc kubenswrapper[4786]: I0313 15:23:38.630218 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zg25m" Mar 13 15:23:39 crc kubenswrapper[4786]: I0313 15:23:39.170352 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zg25m"] Mar 13 15:23:39 crc kubenswrapper[4786]: W0313 15:23:39.175394 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6adeaf7_f94a_4c32_9069_822bcb8d31b8.slice/crio-d7409927243a6a4975527cda43da1c4577f92f4435984931a5019e1f7df83318 WatchSource:0}: Error finding container d7409927243a6a4975527cda43da1c4577f92f4435984931a5019e1f7df83318: Status 404 returned error can't find the container with id d7409927243a6a4975527cda43da1c4577f92f4435984931a5019e1f7df83318 Mar 13 15:23:39 crc kubenswrapper[4786]: I0313 15:23:39.431358 4786 generic.go:334] "Generic (PLEG): container finished" podID="17a2107c-c333-4776-ad2a-ce59edf18d04" containerID="e8453d45bc476d4ec12b3b5bae95abb36c77cd13025a4bf085679749a3c0f337" exitCode=0 Mar 13 15:23:39 crc kubenswrapper[4786]: I0313 15:23:39.431443 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8c7w" event={"ID":"17a2107c-c333-4776-ad2a-ce59edf18d04","Type":"ContainerDied","Data":"e8453d45bc476d4ec12b3b5bae95abb36c77cd13025a4bf085679749a3c0f337"} Mar 13 15:23:39 crc kubenswrapper[4786]: I0313 15:23:39.434146 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zg25m" event={"ID":"d6adeaf7-f94a-4c32-9069-822bcb8d31b8","Type":"ContainerStarted","Data":"d7409927243a6a4975527cda43da1c4577f92f4435984931a5019e1f7df83318"} Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.844398 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w8c7w" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.878713 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a2107c-c333-4776-ad2a-ce59edf18d04-etc-swift\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.878801 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-swiftconf\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.878837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrlzh\" (UniqueName: \"kubernetes.io/projected/17a2107c-c333-4776-ad2a-ce59edf18d04-kube-api-access-nrlzh\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.878899 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-combined-ca-bundle\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.878942 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-ring-data-devices\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.878964 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-scripts\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.879006 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-dispersionconf\") pod \"17a2107c-c333-4776-ad2a-ce59edf18d04\" (UID: \"17a2107c-c333-4776-ad2a-ce59edf18d04\") " Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.879664 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.879802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17a2107c-c333-4776-ad2a-ce59edf18d04-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.885160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a2107c-c333-4776-ad2a-ce59edf18d04-kube-api-access-nrlzh" (OuterVolumeSpecName: "kube-api-access-nrlzh") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "kube-api-access-nrlzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.891456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.902994 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-scripts" (OuterVolumeSpecName: "scripts") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.905464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.923809 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "17a2107c-c333-4776-ad2a-ce59edf18d04" (UID: "17a2107c-c333-4776-ad2a-ce59edf18d04"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981290 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17a2107c-c333-4776-ad2a-ce59edf18d04-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981324 4786 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981336 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrlzh\" (UniqueName: \"kubernetes.io/projected/17a2107c-c333-4776-ad2a-ce59edf18d04-kube-api-access-nrlzh\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981346 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981355 4786 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981365 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17a2107c-c333-4776-ad2a-ce59edf18d04-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:40 crc kubenswrapper[4786]: I0313 15:23:40.981373 4786 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17a2107c-c333-4776-ad2a-ce59edf18d04-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.269175 4786 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ds5d4"] Mar 13 15:23:41 crc kubenswrapper[4786]: E0313 15:23:41.269559 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a2107c-c333-4776-ad2a-ce59edf18d04" containerName="swift-ring-rebalance" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.269580 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a2107c-c333-4776-ad2a-ce59edf18d04" containerName="swift-ring-rebalance" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.269812 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a2107c-c333-4776-ad2a-ce59edf18d04" containerName="swift-ring-rebalance" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.270444 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ds5d4" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.281330 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ds5d4"] Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.284337 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.387662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a95f56a-1b06-43de-87e7-06dca92043be-operator-scripts\") pod \"root-account-create-update-ds5d4\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") " pod="openstack/root-account-create-update-ds5d4" Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.387705 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvqz\" (UniqueName: \"kubernetes.io/projected/0a95f56a-1b06-43de-87e7-06dca92043be-kube-api-access-gwvqz\") pod \"root-account-create-update-ds5d4\" (UID: 
\"0a95f56a-1b06-43de-87e7-06dca92043be\") " pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.447690 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w8c7w" event={"ID":"17a2107c-c333-4776-ad2a-ce59edf18d04","Type":"ContainerDied","Data":"dc5808266e2b9572939323da58864fee90962ef7de2a8f8b711dc51647299215"}
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.447731 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5808266e2b9572939323da58864fee90962ef7de2a8f8b711dc51647299215"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.447786 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w8c7w"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.489040 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a95f56a-1b06-43de-87e7-06dca92043be-operator-scripts\") pod \"root-account-create-update-ds5d4\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") " pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.489304 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvqz\" (UniqueName: \"kubernetes.io/projected/0a95f56a-1b06-43de-87e7-06dca92043be-kube-api-access-gwvqz\") pod \"root-account-create-update-ds5d4\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") " pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.490038 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a95f56a-1b06-43de-87e7-06dca92043be-operator-scripts\") pod \"root-account-create-update-ds5d4\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") " pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.506284 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvqz\" (UniqueName: \"kubernetes.io/projected/0a95f56a-1b06-43de-87e7-06dca92043be-kube-api-access-gwvqz\") pod \"root-account-create-update-ds5d4\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") " pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:41 crc kubenswrapper[4786]: I0313 15:23:41.588611 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.032684 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ds5d4"]
Mar 13 15:23:42 crc kubenswrapper[4786]: W0313 15:23:42.036025 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a95f56a_1b06_43de_87e7_06dca92043be.slice/crio-0b1bfea74b64ee67038b2bc1912f8b526783b9b3d23a463405b3ba5bebfa2474 WatchSource:0}: Error finding container 0b1bfea74b64ee67038b2bc1912f8b526783b9b3d23a463405b3ba5bebfa2474: Status 404 returned error can't find the container with id 0b1bfea74b64ee67038b2bc1912f8b526783b9b3d23a463405b3ba5bebfa2474
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.198749 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0"
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.204217 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"swift-storage-0\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " pod="openstack/swift-storage-0"
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.238670 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.460052 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a95f56a-1b06-43de-87e7-06dca92043be" containerID="ecc2577d6b2f2afdad59b20d9b01f674dd6de6f89593f69c27e47f63b8c4aa05" exitCode=0
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.460101 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ds5d4" event={"ID":"0a95f56a-1b06-43de-87e7-06dca92043be","Type":"ContainerDied","Data":"ecc2577d6b2f2afdad59b20d9b01f674dd6de6f89593f69c27e47f63b8c4aa05"}
Mar 13 15:23:42 crc kubenswrapper[4786]: I0313 15:23:42.460129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ds5d4" event={"ID":"0a95f56a-1b06-43de-87e7-06dca92043be","Type":"ContainerStarted","Data":"0b1bfea74b64ee67038b2bc1912f8b526783b9b3d23a463405b3ba5bebfa2474"}
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.356604 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.470441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"69b0fc969286be40ea887307f460354e7cbd0ad76bfe68859e2fcdfa3007227f"}
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.783258 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.825725 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a95f56a-1b06-43de-87e7-06dca92043be-operator-scripts\") pod \"0a95f56a-1b06-43de-87e7-06dca92043be\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") "
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.825934 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvqz\" (UniqueName: \"kubernetes.io/projected/0a95f56a-1b06-43de-87e7-06dca92043be-kube-api-access-gwvqz\") pod \"0a95f56a-1b06-43de-87e7-06dca92043be\" (UID: \"0a95f56a-1b06-43de-87e7-06dca92043be\") "
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.826832 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a95f56a-1b06-43de-87e7-06dca92043be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a95f56a-1b06-43de-87e7-06dca92043be" (UID: "0a95f56a-1b06-43de-87e7-06dca92043be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.835051 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95f56a-1b06-43de-87e7-06dca92043be-kube-api-access-gwvqz" (OuterVolumeSpecName: "kube-api-access-gwvqz") pod "0a95f56a-1b06-43de-87e7-06dca92043be" (UID: "0a95f56a-1b06-43de-87e7-06dca92043be"). InnerVolumeSpecName "kube-api-access-gwvqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.927822 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvqz\" (UniqueName: \"kubernetes.io/projected/0a95f56a-1b06-43de-87e7-06dca92043be-kube-api-access-gwvqz\") on node \"crc\" DevicePath \"\""
Mar 13 15:23:43 crc kubenswrapper[4786]: I0313 15:23:43.928232 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a95f56a-1b06-43de-87e7-06dca92043be-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:23:44 crc kubenswrapper[4786]: I0313 15:23:44.479814 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ds5d4" event={"ID":"0a95f56a-1b06-43de-87e7-06dca92043be","Type":"ContainerDied","Data":"0b1bfea74b64ee67038b2bc1912f8b526783b9b3d23a463405b3ba5bebfa2474"}
Mar 13 15:23:44 crc kubenswrapper[4786]: I0313 15:23:44.479847 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b1bfea74b64ee67038b2bc1912f8b526783b9b3d23a463405b3ba5bebfa2474"
Mar 13 15:23:44 crc kubenswrapper[4786]: I0313 15:23:44.479851 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ds5d4"
Mar 13 15:23:44 crc kubenswrapper[4786]: I0313 15:23:44.739893 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bfm8s" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" probeResult="failure" output=<
Mar 13 15:23:44 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 13 15:23:44 crc kubenswrapper[4786]: >
Mar 13 15:23:44 crc kubenswrapper[4786]: I0313 15:23:44.807204 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2hb98"
Mar 13 15:23:44 crc kubenswrapper[4786]: I0313 15:23:44.818285 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2hb98"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.062685 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bfm8s-config-cw6z8"]
Mar 13 15:23:45 crc kubenswrapper[4786]: E0313 15:23:45.063551 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a95f56a-1b06-43de-87e7-06dca92043be" containerName="mariadb-account-create-update"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.063569 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a95f56a-1b06-43de-87e7-06dca92043be" containerName="mariadb-account-create-update"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.063784 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a95f56a-1b06-43de-87e7-06dca92043be" containerName="mariadb-account-create-update"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.064516 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.067513 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.071411 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bfm8s-config-cw6z8"]
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.146176 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-scripts\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.146234 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dm7f\" (UniqueName: \"kubernetes.io/projected/49a45e1d-a545-4786-9235-f7b0e1ad82ee-kube-api-access-9dm7f\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.146306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run-ovn\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.146490 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.146580 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-additional-scripts\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.146613 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-log-ovn\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248051 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-additional-scripts\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-log-ovn\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248181 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-scripts\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248215 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dm7f\" (UniqueName: \"kubernetes.io/projected/49a45e1d-a545-4786-9235-f7b0e1ad82ee-kube-api-access-9dm7f\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248256 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run-ovn\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248708 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run-ovn\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.248905 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-log-ovn\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.249752 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-additional-scripts\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.250984 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-scripts\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.284005 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dm7f\" (UniqueName: \"kubernetes.io/projected/49a45e1d-a545-4786-9235-f7b0e1ad82ee-kube-api-access-9dm7f\") pod \"ovn-controller-bfm8s-config-cw6z8\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:45 crc kubenswrapper[4786]: I0313 15:23:45.385060 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bfm8s-config-cw6z8"
Mar 13 15:23:47 crc kubenswrapper[4786]: I0313 15:23:47.581596 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bfm8s-config-cw6z8"]
Mar 13 15:23:49 crc kubenswrapper[4786]: I0313 15:23:49.164135 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 15:23:49 crc kubenswrapper[4786]: I0313 15:23:49.750159 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bfm8s" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" probeResult="failure" output=<
Mar 13 15:23:49 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 13 15:23:49 crc kubenswrapper[4786]: >
Mar 13 15:23:50 crc kubenswrapper[4786]: I0313 15:23:50.194127 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.024185 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-q97rk"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.025499 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.033852 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-q97rk"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.141345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ls59\" (UniqueName: \"kubernetes.io/projected/75a874a7-d303-4a3a-b765-5d3316ad5c2b-kube-api-access-5ls59\") pod \"cinder-db-create-q97rk\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.141459 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a874a7-d303-4a3a-b765-5d3316ad5c2b-operator-scripts\") pod \"cinder-db-create-q97rk\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.143445 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ef50-account-create-update-lt26g"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.144674 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.149553 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.159101 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ef50-account-create-update-lt26g"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.219253 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-tjlws"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.222290 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.238100 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tjlws"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.242752 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb4e25a-01bd-46b0-8029-dde07c5bcfca-operator-scripts\") pod \"cinder-ef50-account-create-update-lt26g\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.242851 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ls59\" (UniqueName: \"kubernetes.io/projected/75a874a7-d303-4a3a-b765-5d3316ad5c2b-kube-api-access-5ls59\") pod \"cinder-db-create-q97rk\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.242930 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a874a7-d303-4a3a-b765-5d3316ad5c2b-operator-scripts\") pod \"cinder-db-create-q97rk\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.242964 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx7rh\" (UniqueName: \"kubernetes.io/projected/edb4e25a-01bd-46b0-8029-dde07c5bcfca-kube-api-access-fx7rh\") pod \"cinder-ef50-account-create-update-lt26g\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.244227 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a874a7-d303-4a3a-b765-5d3316ad5c2b-operator-scripts\") pod \"cinder-db-create-q97rk\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.279078 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ls59\" (UniqueName: \"kubernetes.io/projected/75a874a7-d303-4a3a-b765-5d3316ad5c2b-kube-api-access-5ls59\") pod \"cinder-db-create-q97rk\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.308349 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-cv6dz"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.309528 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.322370 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cv6dz"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.344089 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx7rh\" (UniqueName: \"kubernetes.io/projected/edb4e25a-01bd-46b0-8029-dde07c5bcfca-kube-api-access-fx7rh\") pod \"cinder-ef50-account-create-update-lt26g\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.344207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb4e25a-01bd-46b0-8029-dde07c5bcfca-operator-scripts\") pod \"cinder-ef50-account-create-update-lt26g\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.344243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564tz\" (UniqueName: \"kubernetes.io/projected/50342ecd-90d8-411c-aef8-53c0337267a9-kube-api-access-564tz\") pod \"barbican-db-create-tjlws\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.344321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50342ecd-90d8-411c-aef8-53c0337267a9-operator-scripts\") pod \"barbican-db-create-tjlws\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.344507 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q97rk"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.345889 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb4e25a-01bd-46b0-8029-dde07c5bcfca-operator-scripts\") pod \"cinder-ef50-account-create-update-lt26g\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.354615 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cfc0-account-create-update-s4gmg"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.356219 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.358941 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.368190 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx7rh\" (UniqueName: \"kubernetes.io/projected/edb4e25a-01bd-46b0-8029-dde07c5bcfca-kube-api-access-fx7rh\") pod \"cinder-ef50-account-create-update-lt26g\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.380984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cfc0-account-create-update-s4gmg"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.416611 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-n6qr2"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.417927 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.421792 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.421826 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.421884 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dsq8r"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.423531 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.427098 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n6qr2"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.446489 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564tz\" (UniqueName: \"kubernetes.io/projected/50342ecd-90d8-411c-aef8-53c0337267a9-kube-api-access-564tz\") pod \"barbican-db-create-tjlws\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.446553 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-operator-scripts\") pod \"barbican-cfc0-account-create-update-s4gmg\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.446656 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50342ecd-90d8-411c-aef8-53c0337267a9-operator-scripts\") pod \"barbican-db-create-tjlws\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.446703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8803f262-0a85-48aa-86f4-b01a2fc3692b-operator-scripts\") pod \"neutron-db-create-cv6dz\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.446735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzch8\" (UniqueName: \"kubernetes.io/projected/8803f262-0a85-48aa-86f4-b01a2fc3692b-kube-api-access-gzch8\") pod \"neutron-db-create-cv6dz\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.446761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srsvd\" (UniqueName: \"kubernetes.io/projected/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-kube-api-access-srsvd\") pod \"barbican-cfc0-account-create-update-s4gmg\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.447548 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50342ecd-90d8-411c-aef8-53c0337267a9-operator-scripts\") pod \"barbican-db-create-tjlws\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.467298 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef50-account-create-update-lt26g"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.472877 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564tz\" (UniqueName: \"kubernetes.io/projected/50342ecd-90d8-411c-aef8-53c0337267a9-kube-api-access-564tz\") pod \"barbican-db-create-tjlws\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.539946 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tjlws"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.544615 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ad0f-account-create-update-zfgm9"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.547229 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ad0f-account-create-update-zfgm9"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8803f262-0a85-48aa-86f4-b01a2fc3692b-operator-scripts\") pod \"neutron-db-create-cv6dz\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551090 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-config-data\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551133 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzch8\" (UniqueName: \"kubernetes.io/projected/8803f262-0a85-48aa-86f4-b01a2fc3692b-kube-api-access-gzch8\") pod \"neutron-db-create-cv6dz\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srsvd\" (UniqueName: \"kubernetes.io/projected/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-kube-api-access-srsvd\") pod \"barbican-cfc0-account-create-update-s4gmg\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67n7c\" (UniqueName: \"kubernetes.io/projected/2b29131e-a3b3-45c1-945c-b28cdfff2773-kube-api-access-67n7c\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551381 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-operator-scripts\") pod \"barbican-cfc0-account-create-update-s4gmg\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.551412 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-combined-ca-bundle\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.552364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8803f262-0a85-48aa-86f4-b01a2fc3692b-operator-scripts\") pod \"neutron-db-create-cv6dz\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.553214 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.553505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-operator-scripts\") pod \"barbican-cfc0-account-create-update-s4gmg\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.571342 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ad0f-account-create-update-zfgm9"]
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.574823 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzch8\" (UniqueName: \"kubernetes.io/projected/8803f262-0a85-48aa-86f4-b01a2fc3692b-kube-api-access-gzch8\") pod \"neutron-db-create-cv6dz\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.577932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srsvd\" (UniqueName: \"kubernetes.io/projected/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-kube-api-access-srsvd\") pod \"barbican-cfc0-account-create-update-s4gmg\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " pod="openstack/barbican-cfc0-account-create-update-s4gmg"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.636831 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cv6dz"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.653299 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllxm\" (UniqueName: \"kubernetes.io/projected/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-kube-api-access-dllxm\") pod \"neutron-ad0f-account-create-update-zfgm9\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " pod="openstack/neutron-ad0f-account-create-update-zfgm9"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.653379 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-config-data\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.653554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-operator-scripts\") pod \"neutron-ad0f-account-create-update-zfgm9\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " pod="openstack/neutron-ad0f-account-create-update-zfgm9"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.653679 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67n7c\" (UniqueName: \"kubernetes.io/projected/2b29131e-a3b3-45c1-945c-b28cdfff2773-kube-api-access-67n7c\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.653839 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-combined-ca-bundle\") pod \"keystone-db-sync-n6qr2\"
(UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.657098 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-combined-ca-bundle\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.657235 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-config-data\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.670170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67n7c\" (UniqueName: \"kubernetes.io/projected/2b29131e-a3b3-45c1-945c-b28cdfff2773-kube-api-access-67n7c\") pod \"keystone-db-sync-n6qr2\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") " pod="openstack/keystone-db-sync-n6qr2" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.711739 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-s4gmg" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.737760 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n6qr2" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.755755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dllxm\" (UniqueName: \"kubernetes.io/projected/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-kube-api-access-dllxm\") pod \"neutron-ad0f-account-create-update-zfgm9\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.755844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-operator-scripts\") pod \"neutron-ad0f-account-create-update-zfgm9\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.756565 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-operator-scripts\") pod \"neutron-ad0f-account-create-update-zfgm9\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.772631 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllxm\" (UniqueName: \"kubernetes.io/projected/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-kube-api-access-dllxm\") pod \"neutron-ad0f-account-create-update-zfgm9\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:23:51 crc kubenswrapper[4786]: I0313 15:23:51.878516 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:23:54 crc kubenswrapper[4786]: I0313 15:23:54.588999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s-config-cw6z8" event={"ID":"49a45e1d-a545-4786-9235-f7b0e1ad82ee","Type":"ContainerStarted","Data":"e6801476c9414a02056b3dc52af3c84e2362b3e9b65ced40ca381f009f0a23c8"} Mar 13 15:23:54 crc kubenswrapper[4786]: I0313 15:23:54.814395 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bfm8s" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" probeResult="failure" output=< Mar 13 15:23:54 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 15:23:54 crc kubenswrapper[4786]: > Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.196343 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cfc0-account-create-update-s4gmg"] Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.273713 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ef50-account-create-update-lt26g"] Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.279323 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n6qr2"] Mar 13 15:23:55 crc kubenswrapper[4786]: W0313 15:23:55.283160 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedb4e25a_01bd_46b0_8029_dde07c5bcfca.slice/crio-6bbc3b3685bf33683237c1563ee37c21d3a17350b75c46a1cdd8b51cce8fac33 WatchSource:0}: Error finding container 6bbc3b3685bf33683237c1563ee37c21d3a17350b75c46a1cdd8b51cce8fac33: Status 404 returned error can't find the container with id 6bbc3b3685bf33683237c1563ee37c21d3a17350b75c46a1cdd8b51cce8fac33 Mar 13 15:23:55 crc kubenswrapper[4786]: W0313 15:23:55.296177 4786 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b29131e_a3b3_45c1_945c_b28cdfff2773.slice/crio-c021e341a62768733b4024c3b74fc68a180ded77e89adc8396493aca3bca6b65 WatchSource:0}: Error finding container c021e341a62768733b4024c3b74fc68a180ded77e89adc8396493aca3bca6b65: Status 404 returned error can't find the container with id c021e341a62768733b4024c3b74fc68a180ded77e89adc8396493aca3bca6b65 Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.404845 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cv6dz"] Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.416571 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ad0f-account-create-update-zfgm9"] Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.431804 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-q97rk"] Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.439277 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tjlws"] Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.601876 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cfc0-account-create-update-s4gmg" event={"ID":"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf","Type":"ContainerStarted","Data":"c5a558242d9db6738be911c2f2ead53cbee4619ab0b123e3cc7d2deb9efe88b0"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.602800 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cfc0-account-create-update-s4gmg" event={"ID":"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf","Type":"ContainerStarted","Data":"d69132dcb6eb2977547e898ade994926e12fa28c910cd3897f2aa28d77c12181"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.605587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q97rk" 
event={"ID":"75a874a7-d303-4a3a-b765-5d3316ad5c2b","Type":"ContainerStarted","Data":"3420848d2e50f376d65e91606639cd9807743882e3a756dc44d06a665e4f7397"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.608526 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef50-account-create-update-lt26g" event={"ID":"edb4e25a-01bd-46b0-8029-dde07c5bcfca","Type":"ContainerStarted","Data":"5c01e07921c1f0119e885c5682397eb301c0509f4611ce41740be7679e8068a6"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.608719 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef50-account-create-update-lt26g" event={"ID":"edb4e25a-01bd-46b0-8029-dde07c5bcfca","Type":"ContainerStarted","Data":"6bbc3b3685bf33683237c1563ee37c21d3a17350b75c46a1cdd8b51cce8fac33"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.611669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.611706 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.611716 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.611725 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.613112 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cv6dz" event={"ID":"8803f262-0a85-48aa-86f4-b01a2fc3692b","Type":"ContainerStarted","Data":"a29d8ac96f2b25f0069bfcd678d49fa5cc19ba8bbccd12fcfe92b6cb9db3a249"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.614309 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ad0f-account-create-update-zfgm9" event={"ID":"9aecf238-05ea-4d9c-8cac-58a9aaa8490d","Type":"ContainerStarted","Data":"b4be547890554782cbf731697520a9e68447d6b746643420dcc932b8514e34c5"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.616099 4786 generic.go:334] "Generic (PLEG): container finished" podID="49a45e1d-a545-4786-9235-f7b0e1ad82ee" containerID="917b6a1970b845be8ffb05e14e44dd41a32d8494185628afdab72b15710941e5" exitCode=0 Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.616164 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s-config-cw6z8" event={"ID":"49a45e1d-a545-4786-9235-f7b0e1ad82ee","Type":"ContainerDied","Data":"917b6a1970b845be8ffb05e14e44dd41a32d8494185628afdab72b15710941e5"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.617244 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n6qr2" event={"ID":"2b29131e-a3b3-45c1-945c-b28cdfff2773","Type":"ContainerStarted","Data":"c021e341a62768733b4024c3b74fc68a180ded77e89adc8396493aca3bca6b65"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.619714 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-cfc0-account-create-update-s4gmg" podStartSLOduration=4.619693714 podStartE2EDuration="4.619693714s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:55.615538259 +0000 UTC m=+1265.778750070" watchObservedRunningTime="2026-03-13 15:23:55.619693714 +0000 UTC m=+1265.782905535" Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.620308 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zg25m" event={"ID":"d6adeaf7-f94a-4c32-9069-822bcb8d31b8","Type":"ContainerStarted","Data":"c3378ad848becf43c536e9376944ca88cfa4805b8b8bfd9f3012155f1ebc4f1c"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.623669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tjlws" event={"ID":"50342ecd-90d8-411c-aef8-53c0337267a9","Type":"ContainerStarted","Data":"1d596c6682afb95ce8dbdd63efa0591990e80462d48a22139cf9a4659dcce0b4"} Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.640587 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ef50-account-create-update-lt26g" podStartSLOduration=4.640569539 podStartE2EDuration="4.640569539s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:55.633359937 +0000 UTC m=+1265.796571748" watchObservedRunningTime="2026-03-13 15:23:55.640569539 +0000 UTC m=+1265.803781340" Mar 13 15:23:55 crc kubenswrapper[4786]: I0313 15:23:55.654901 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zg25m" podStartSLOduration=2.166061082 podStartE2EDuration="17.654883048s" podCreationTimestamp="2026-03-13 15:23:38 +0000 UTC" firstStartedPulling="2026-03-13 15:23:39.176797588 +0000 UTC m=+1249.340009399" lastFinishedPulling="2026-03-13 15:23:54.665619544 +0000 UTC m=+1264.828831365" observedRunningTime="2026-03-13 15:23:55.653157395 +0000 UTC 
m=+1265.816369206" watchObservedRunningTime="2026-03-13 15:23:55.654883048 +0000 UTC m=+1265.818094859" Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.636771 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tjlws" event={"ID":"50342ecd-90d8-411c-aef8-53c0337267a9","Type":"ContainerStarted","Data":"218185ba1cec541cc991701b18e7933dcc28c3b9c8ba80b603fd652e3058fda8"} Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.643616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ad0f-account-create-update-zfgm9" event={"ID":"9aecf238-05ea-4d9c-8cac-58a9aaa8490d","Type":"ContainerStarted","Data":"0067f13081ec184761dd763d31be896d35e176fc5080d9886ecc604d62beab50"} Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.645564 4786 generic.go:334] "Generic (PLEG): container finished" podID="1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" containerID="c5a558242d9db6738be911c2f2ead53cbee4619ab0b123e3cc7d2deb9efe88b0" exitCode=0 Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.645714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cfc0-account-create-update-s4gmg" event={"ID":"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf","Type":"ContainerDied","Data":"c5a558242d9db6738be911c2f2ead53cbee4619ab0b123e3cc7d2deb9efe88b0"} Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.647989 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q97rk" event={"ID":"75a874a7-d303-4a3a-b765-5d3316ad5c2b","Type":"ContainerStarted","Data":"267d7ab68da6aba0afd06d3f284ba02e6f436062867044b1a8025ccca64f775f"} Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.652399 4786 generic.go:334] "Generic (PLEG): container finished" podID="edb4e25a-01bd-46b0-8029-dde07c5bcfca" containerID="5c01e07921c1f0119e885c5682397eb301c0509f4611ce41740be7679e8068a6" exitCode=0 Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.652566 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-ef50-account-create-update-lt26g" event={"ID":"edb4e25a-01bd-46b0-8029-dde07c5bcfca","Type":"ContainerDied","Data":"5c01e07921c1f0119e885c5682397eb301c0509f4611ce41740be7679e8068a6"} Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.655440 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-tjlws" podStartSLOduration=5.655424676 podStartE2EDuration="5.655424676s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:56.653258721 +0000 UTC m=+1266.816470532" watchObservedRunningTime="2026-03-13 15:23:56.655424676 +0000 UTC m=+1266.818636487" Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.660586 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cv6dz" event={"ID":"8803f262-0a85-48aa-86f4-b01a2fc3692b","Type":"ContainerStarted","Data":"4cb869cd4456dd89c676022d5997f15243c54fe9debbee32cf20df121c9b0bf8"} Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.682166 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ad0f-account-create-update-zfgm9" podStartSLOduration=5.680803364 podStartE2EDuration="5.680803364s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:56.673988812 +0000 UTC m=+1266.837200623" watchObservedRunningTime="2026-03-13 15:23:56.680803364 +0000 UTC m=+1266.844015175" Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.696348 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-q97rk" podStartSLOduration=5.6963239340000005 podStartE2EDuration="5.696323934s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:56.686872666 +0000 UTC m=+1266.850084477" watchObservedRunningTime="2026-03-13 15:23:56.696323934 +0000 UTC m=+1266.859535745" Mar 13 15:23:56 crc kubenswrapper[4786]: I0313 15:23:56.736982 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-cv6dz" podStartSLOduration=5.736964955 podStartE2EDuration="5.736964955s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:23:56.728730438 +0000 UTC m=+1266.891942249" watchObservedRunningTime="2026-03-13 15:23:56.736964955 +0000 UTC m=+1266.900176766" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.028390 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bfm8s-config-cw6z8" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224171 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-scripts\") pod \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224630 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-additional-scripts\") pod \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224750 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run\") pod 
\"49a45e1d-a545-4786-9235-f7b0e1ad82ee\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224829 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run" (OuterVolumeSpecName: "var-run") pod "49a45e1d-a545-4786-9235-f7b0e1ad82ee" (UID: "49a45e1d-a545-4786-9235-f7b0e1ad82ee"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224897 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dm7f\" (UniqueName: \"kubernetes.io/projected/49a45e1d-a545-4786-9235-f7b0e1ad82ee-kube-api-access-9dm7f\") pod \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224933 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-log-ovn\") pod \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.224979 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run-ovn\") pod \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\" (UID: \"49a45e1d-a545-4786-9235-f7b0e1ad82ee\") " Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225015 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "49a45e1d-a545-4786-9235-f7b0e1ad82ee" (UID: "49a45e1d-a545-4786-9235-f7b0e1ad82ee"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225029 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "49a45e1d-a545-4786-9235-f7b0e1ad82ee" (UID: "49a45e1d-a545-4786-9235-f7b0e1ad82ee"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225281 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "49a45e1d-a545-4786-9235-f7b0e1ad82ee" (UID: "49a45e1d-a545-4786-9235-f7b0e1ad82ee"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225452 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225466 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225475 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.225483 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a45e1d-a545-4786-9235-f7b0e1ad82ee-var-run-ovn\") on node \"crc\" DevicePath \"\"" 
Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.226829 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-scripts" (OuterVolumeSpecName: "scripts") pod "49a45e1d-a545-4786-9235-f7b0e1ad82ee" (UID: "49a45e1d-a545-4786-9235-f7b0e1ad82ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.230470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a45e1d-a545-4786-9235-f7b0e1ad82ee-kube-api-access-9dm7f" (OuterVolumeSpecName: "kube-api-access-9dm7f") pod "49a45e1d-a545-4786-9235-f7b0e1ad82ee" (UID: "49a45e1d-a545-4786-9235-f7b0e1ad82ee"). InnerVolumeSpecName "kube-api-access-9dm7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.327091 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dm7f\" (UniqueName: \"kubernetes.io/projected/49a45e1d-a545-4786-9235-f7b0e1ad82ee-kube-api-access-9dm7f\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.327142 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49a45e1d-a545-4786-9235-f7b0e1ad82ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.672808 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bfm8s-config-cw6z8" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.672838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s-config-cw6z8" event={"ID":"49a45e1d-a545-4786-9235-f7b0e1ad82ee","Type":"ContainerDied","Data":"e6801476c9414a02056b3dc52af3c84e2362b3e9b65ced40ca381f009f0a23c8"} Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.672900 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6801476c9414a02056b3dc52af3c84e2362b3e9b65ced40ca381f009f0a23c8" Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.675139 4786 generic.go:334] "Generic (PLEG): container finished" podID="50342ecd-90d8-411c-aef8-53c0337267a9" containerID="218185ba1cec541cc991701b18e7933dcc28c3b9c8ba80b603fd652e3058fda8" exitCode=0 Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.675218 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tjlws" event={"ID":"50342ecd-90d8-411c-aef8-53c0337267a9","Type":"ContainerDied","Data":"218185ba1cec541cc991701b18e7933dcc28c3b9c8ba80b603fd652e3058fda8"} Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.684022 4786 generic.go:334] "Generic (PLEG): container finished" podID="9aecf238-05ea-4d9c-8cac-58a9aaa8490d" containerID="0067f13081ec184761dd763d31be896d35e176fc5080d9886ecc604d62beab50" exitCode=0 Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.684818 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ad0f-account-create-update-zfgm9" event={"ID":"9aecf238-05ea-4d9c-8cac-58a9aaa8490d","Type":"ContainerDied","Data":"0067f13081ec184761dd763d31be896d35e176fc5080d9886ecc604d62beab50"} Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.702443 4786 generic.go:334] "Generic (PLEG): container finished" podID="75a874a7-d303-4a3a-b765-5d3316ad5c2b" containerID="267d7ab68da6aba0afd06d3f284ba02e6f436062867044b1a8025ccca64f775f" 
exitCode=0 Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.702681 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q97rk" event={"ID":"75a874a7-d303-4a3a-b765-5d3316ad5c2b","Type":"ContainerDied","Data":"267d7ab68da6aba0afd06d3f284ba02e6f436062867044b1a8025ccca64f775f"} Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.708096 4786 generic.go:334] "Generic (PLEG): container finished" podID="8803f262-0a85-48aa-86f4-b01a2fc3692b" containerID="4cb869cd4456dd89c676022d5997f15243c54fe9debbee32cf20df121c9b0bf8" exitCode=0 Mar 13 15:23:57 crc kubenswrapper[4786]: I0313 15:23:57.708300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cv6dz" event={"ID":"8803f262-0a85-48aa-86f4-b01a2fc3692b","Type":"ContainerDied","Data":"4cb869cd4456dd89c676022d5997f15243c54fe9debbee32cf20df121c9b0bf8"} Mar 13 15:23:58 crc kubenswrapper[4786]: I0313 15:23:58.132017 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bfm8s-config-cw6z8"] Mar 13 15:23:58 crc kubenswrapper[4786]: I0313 15:23:58.143679 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bfm8s-config-cw6z8"] Mar 13 15:23:58 crc kubenswrapper[4786]: I0313 15:23:58.563818 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a45e1d-a545-4786-9235-f7b0e1ad82ee" path="/var/lib/kubelet/pods/49a45e1d-a545-4786-9235-f7b0e1ad82ee/volumes" Mar 13 15:23:59 crc kubenswrapper[4786]: I0313 15:23:59.739251 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bfm8s" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.139625 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556924-czchf"] Mar 13 15:24:00 crc kubenswrapper[4786]: E0313 15:24:00.141595 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a45e1d-a545-4786-9235-f7b0e1ad82ee" 
containerName="ovn-config" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.141625 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a45e1d-a545-4786-9235-f7b0e1ad82ee" containerName="ovn-config" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.141956 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a45e1d-a545-4786-9235-f7b0e1ad82ee" containerName="ovn-config" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.142758 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-czchf" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.145254 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.145254 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.147139 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.159813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsgtj\" (UniqueName: \"kubernetes.io/projected/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3-kube-api-access-qsgtj\") pod \"auto-csr-approver-29556924-czchf\" (UID: \"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3\") " pod="openshift-infra/auto-csr-approver-29556924-czchf" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.162339 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-czchf"] Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.260568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsgtj\" (UniqueName: 
\"kubernetes.io/projected/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3-kube-api-access-qsgtj\") pod \"auto-csr-approver-29556924-czchf\" (UID: \"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3\") " pod="openshift-infra/auto-csr-approver-29556924-czchf" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.282985 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsgtj\" (UniqueName: \"kubernetes.io/projected/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3-kube-api-access-qsgtj\") pod \"auto-csr-approver-29556924-czchf\" (UID: \"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3\") " pod="openshift-infra/auto-csr-approver-29556924-czchf" Mar 13 15:24:00 crc kubenswrapper[4786]: I0313 15:24:00.464179 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-czchf" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.181924 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tjlws" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.191958 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-s4gmg" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.198269 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ef50-account-create-update-lt26g" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.374951 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srsvd\" (UniqueName: \"kubernetes.io/projected/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-kube-api-access-srsvd\") pod \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.375041 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-operator-scripts\") pod \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\" (UID: \"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.375069 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb4e25a-01bd-46b0-8029-dde07c5bcfca-operator-scripts\") pod \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.375199 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-564tz\" (UniqueName: \"kubernetes.io/projected/50342ecd-90d8-411c-aef8-53c0337267a9-kube-api-access-564tz\") pod \"50342ecd-90d8-411c-aef8-53c0337267a9\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.375223 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50342ecd-90d8-411c-aef8-53c0337267a9-operator-scripts\") pod \"50342ecd-90d8-411c-aef8-53c0337267a9\" (UID: \"50342ecd-90d8-411c-aef8-53c0337267a9\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.375283 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fx7rh\" (UniqueName: \"kubernetes.io/projected/edb4e25a-01bd-46b0-8029-dde07c5bcfca-kube-api-access-fx7rh\") pod \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\" (UID: \"edb4e25a-01bd-46b0-8029-dde07c5bcfca\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.376494 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb4e25a-01bd-46b0-8029-dde07c5bcfca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edb4e25a-01bd-46b0-8029-dde07c5bcfca" (UID: "edb4e25a-01bd-46b0-8029-dde07c5bcfca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.376930 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" (UID: "1c1e5dac-cc9d-4362-a3c7-2ea79797cacf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.377339 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50342ecd-90d8-411c-aef8-53c0337267a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50342ecd-90d8-411c-aef8-53c0337267a9" (UID: "50342ecd-90d8-411c-aef8-53c0337267a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.380653 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb4e25a-01bd-46b0-8029-dde07c5bcfca-kube-api-access-fx7rh" (OuterVolumeSpecName: "kube-api-access-fx7rh") pod "edb4e25a-01bd-46b0-8029-dde07c5bcfca" (UID: "edb4e25a-01bd-46b0-8029-dde07c5bcfca"). 
InnerVolumeSpecName "kube-api-access-fx7rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.380767 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50342ecd-90d8-411c-aef8-53c0337267a9-kube-api-access-564tz" (OuterVolumeSpecName: "kube-api-access-564tz") pod "50342ecd-90d8-411c-aef8-53c0337267a9" (UID: "50342ecd-90d8-411c-aef8-53c0337267a9"). InnerVolumeSpecName "kube-api-access-564tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.384197 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-kube-api-access-srsvd" (OuterVolumeSpecName: "kube-api-access-srsvd") pod "1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" (UID: "1c1e5dac-cc9d-4362-a3c7-2ea79797cacf"). InnerVolumeSpecName "kube-api-access-srsvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.477131 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-564tz\" (UniqueName: \"kubernetes.io/projected/50342ecd-90d8-411c-aef8-53c0337267a9-kube-api-access-564tz\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.477191 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50342ecd-90d8-411c-aef8-53c0337267a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.477204 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx7rh\" (UniqueName: \"kubernetes.io/projected/edb4e25a-01bd-46b0-8029-dde07c5bcfca-kube-api-access-fx7rh\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.477214 4786 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-srsvd\" (UniqueName: \"kubernetes.io/projected/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-kube-api-access-srsvd\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.477225 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.477236 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb4e25a-01bd-46b0-8029-dde07c5bcfca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.632569 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cv6dz" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.638791 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.654965 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-q97rk" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.766213 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cfc0-account-create-update-s4gmg" event={"ID":"1c1e5dac-cc9d-4362-a3c7-2ea79797cacf","Type":"ContainerDied","Data":"d69132dcb6eb2977547e898ade994926e12fa28c910cd3897f2aa28d77c12181"} Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.766249 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d69132dcb6eb2977547e898ade994926e12fa28c910cd3897f2aa28d77c12181" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.766302 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-s4gmg" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.771720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cv6dz" event={"ID":"8803f262-0a85-48aa-86f4-b01a2fc3692b","Type":"ContainerDied","Data":"a29d8ac96f2b25f0069bfcd678d49fa5cc19ba8bbccd12fcfe92b6cb9db3a249"} Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.771764 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a29d8ac96f2b25f0069bfcd678d49fa5cc19ba8bbccd12fcfe92b6cb9db3a249" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.771836 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cv6dz" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.778355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tjlws" event={"ID":"50342ecd-90d8-411c-aef8-53c0337267a9","Type":"ContainerDied","Data":"1d596c6682afb95ce8dbdd63efa0591990e80462d48a22139cf9a4659dcce0b4"} Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.778390 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d596c6682afb95ce8dbdd63efa0591990e80462d48a22139cf9a4659dcce0b4" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.778448 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-tjlws" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.782377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ad0f-account-create-update-zfgm9" event={"ID":"9aecf238-05ea-4d9c-8cac-58a9aaa8490d","Type":"ContainerDied","Data":"b4be547890554782cbf731697520a9e68447d6b746643420dcc932b8514e34c5"} Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.782419 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4be547890554782cbf731697520a9e68447d6b746643420dcc932b8514e34c5" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.782487 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ad0f-account-create-update-zfgm9" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.782952 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8803f262-0a85-48aa-86f4-b01a2fc3692b-operator-scripts\") pod \"8803f262-0a85-48aa-86f4-b01a2fc3692b\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.783096 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ls59\" (UniqueName: \"kubernetes.io/projected/75a874a7-d303-4a3a-b765-5d3316ad5c2b-kube-api-access-5ls59\") pod \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.783235 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dllxm\" (UniqueName: \"kubernetes.io/projected/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-kube-api-access-dllxm\") pod \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.783281 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gzch8\" (UniqueName: \"kubernetes.io/projected/8803f262-0a85-48aa-86f4-b01a2fc3692b-kube-api-access-gzch8\") pod \"8803f262-0a85-48aa-86f4-b01a2fc3692b\" (UID: \"8803f262-0a85-48aa-86f4-b01a2fc3692b\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.783305 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a874a7-d303-4a3a-b765-5d3316ad5c2b-operator-scripts\") pod \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\" (UID: \"75a874a7-d303-4a3a-b765-5d3316ad5c2b\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.784464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-operator-scripts\") pod \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\" (UID: \"9aecf238-05ea-4d9c-8cac-58a9aaa8490d\") " Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.784810 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8803f262-0a85-48aa-86f4-b01a2fc3692b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8803f262-0a85-48aa-86f4-b01a2fc3692b" (UID: "8803f262-0a85-48aa-86f4-b01a2fc3692b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.784955 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8803f262-0a85-48aa-86f4-b01a2fc3692b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.785555 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9aecf238-05ea-4d9c-8cac-58a9aaa8490d" (UID: "9aecf238-05ea-4d9c-8cac-58a9aaa8490d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.785748 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a874a7-d303-4a3a-b765-5d3316ad5c2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75a874a7-d303-4a3a-b765-5d3316ad5c2b" (UID: "75a874a7-d303-4a3a-b765-5d3316ad5c2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.789365 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8803f262-0a85-48aa-86f4-b01a2fc3692b-kube-api-access-gzch8" (OuterVolumeSpecName: "kube-api-access-gzch8") pod "8803f262-0a85-48aa-86f4-b01a2fc3692b" (UID: "8803f262-0a85-48aa-86f4-b01a2fc3692b"). InnerVolumeSpecName "kube-api-access-gzch8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.791528 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a874a7-d303-4a3a-b765-5d3316ad5c2b-kube-api-access-5ls59" (OuterVolumeSpecName: "kube-api-access-5ls59") pod "75a874a7-d303-4a3a-b765-5d3316ad5c2b" (UID: "75a874a7-d303-4a3a-b765-5d3316ad5c2b"). InnerVolumeSpecName "kube-api-access-5ls59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.794423 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-kube-api-access-dllxm" (OuterVolumeSpecName: "kube-api-access-dllxm") pod "9aecf238-05ea-4d9c-8cac-58a9aaa8490d" (UID: "9aecf238-05ea-4d9c-8cac-58a9aaa8490d"). InnerVolumeSpecName "kube-api-access-dllxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.796909 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-q97rk" event={"ID":"75a874a7-d303-4a3a-b765-5d3316ad5c2b","Type":"ContainerDied","Data":"3420848d2e50f376d65e91606639cd9807743882e3a756dc44d06a665e4f7397"} Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.796939 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3420848d2e50f376d65e91606639cd9807743882e3a756dc44d06a665e4f7397" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.797010 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-q97rk" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.801440 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef50-account-create-update-lt26g" event={"ID":"edb4e25a-01bd-46b0-8029-dde07c5bcfca","Type":"ContainerDied","Data":"6bbc3b3685bf33683237c1563ee37c21d3a17350b75c46a1cdd8b51cce8fac33"} Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.801564 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bbc3b3685bf33683237c1563ee37c21d3a17350b75c46a1cdd8b51cce8fac33" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.801485 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef50-account-create-update-lt26g" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.888347 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ls59\" (UniqueName: \"kubernetes.io/projected/75a874a7-d303-4a3a-b765-5d3316ad5c2b-kube-api-access-5ls59\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.888804 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dllxm\" (UniqueName: \"kubernetes.io/projected/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-kube-api-access-dllxm\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.888837 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzch8\" (UniqueName: \"kubernetes.io/projected/8803f262-0a85-48aa-86f4-b01a2fc3692b-kube-api-access-gzch8\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.888873 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a874a7-d303-4a3a-b765-5d3316ad5c2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:01 crc kubenswrapper[4786]: I0313 15:24:01.888887 4786 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9aecf238-05ea-4d9c-8cac-58a9aaa8490d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.046624 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-czchf"] Mar 13 15:24:02 crc kubenswrapper[4786]: W0313 15:24:02.058298 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48dd6149_3efc_4fa6_91b9_b6b32ed8a2e3.slice/crio-7a5eab161b1c5d0ecf52350bbdb68dd45a66fee6f2ca4332c1c93b511e968ba5 WatchSource:0}: Error finding container 7a5eab161b1c5d0ecf52350bbdb68dd45a66fee6f2ca4332c1c93b511e968ba5: Status 404 returned error can't find the container with id 7a5eab161b1c5d0ecf52350bbdb68dd45a66fee6f2ca4332c1c93b511e968ba5 Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.810572 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n6qr2" event={"ID":"2b29131e-a3b3-45c1-945c-b28cdfff2773","Type":"ContainerStarted","Data":"321bb7aa80b43c241267fd5f31403d05a9a52408255cd6c724b195c5bb6ef716"} Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.811372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-czchf" event={"ID":"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3","Type":"ContainerStarted","Data":"7a5eab161b1c5d0ecf52350bbdb68dd45a66fee6f2ca4332c1c93b511e968ba5"} Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.815100 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072"} Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.815141 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79"} Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.815153 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4"} Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.815163 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508"} Mar 13 15:24:02 crc kubenswrapper[4786]: I0313 15:24:02.827542 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-n6qr2" podStartSLOduration=5.528857584 podStartE2EDuration="11.827526062s" podCreationTimestamp="2026-03-13 15:23:51 +0000 UTC" firstStartedPulling="2026-03-13 15:23:55.302364398 +0000 UTC m=+1265.465576209" lastFinishedPulling="2026-03-13 15:24:01.601032876 +0000 UTC m=+1271.764244687" observedRunningTime="2026-03-13 15:24:02.825715936 +0000 UTC m=+1272.988927747" watchObservedRunningTime="2026-03-13 15:24:02.827526062 +0000 UTC m=+1272.990737873" Mar 13 15:24:03 crc kubenswrapper[4786]: I0313 15:24:03.825267 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-czchf" event={"ID":"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3","Type":"ContainerStarted","Data":"b6d3b5c82813854a4d29327c7298f2fd417cb26077a9b2211e9e9e9f3281a1be"} Mar 13 15:24:03 crc kubenswrapper[4786]: I0313 15:24:03.856421 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556924-czchf" podStartSLOduration=2.449058369 podStartE2EDuration="3.856400201s" 
podCreationTimestamp="2026-03-13 15:24:00 +0000 UTC" firstStartedPulling="2026-03-13 15:24:02.060845522 +0000 UTC m=+1272.224057333" lastFinishedPulling="2026-03-13 15:24:03.468187364 +0000 UTC m=+1273.631399165" observedRunningTime="2026-03-13 15:24:03.850015701 +0000 UTC m=+1274.013227522" watchObservedRunningTime="2026-03-13 15:24:03.856400201 +0000 UTC m=+1274.019612012" Mar 13 15:24:04 crc kubenswrapper[4786]: I0313 15:24:04.833684 4786 generic.go:334] "Generic (PLEG): container finished" podID="48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3" containerID="b6d3b5c82813854a4d29327c7298f2fd417cb26077a9b2211e9e9e9f3281a1be" exitCode=0 Mar 13 15:24:04 crc kubenswrapper[4786]: I0313 15:24:04.833740 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-czchf" event={"ID":"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3","Type":"ContainerDied","Data":"b6d3b5c82813854a4d29327c7298f2fd417cb26077a9b2211e9e9e9f3281a1be"} Mar 13 15:24:04 crc kubenswrapper[4786]: I0313 15:24:04.838953 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d"} Mar 13 15:24:04 crc kubenswrapper[4786]: I0313 15:24:04.838991 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c"} Mar 13 15:24:04 crc kubenswrapper[4786]: I0313 15:24:04.839000 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57"} Mar 13 15:24:05 crc kubenswrapper[4786]: I0313 15:24:05.857032 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db"} Mar 13 15:24:05 crc kubenswrapper[4786]: I0313 15:24:05.857400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8"} Mar 13 15:24:05 crc kubenswrapper[4786]: I0313 15:24:05.857417 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d"} Mar 13 15:24:05 crc kubenswrapper[4786]: I0313 15:24:05.857427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerStarted","Data":"6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972"} Mar 13 15:24:05 crc kubenswrapper[4786]: I0313 15:24:05.895221 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.397657349 podStartE2EDuration="40.895201693s" podCreationTimestamp="2026-03-13 15:23:25 +0000 UTC" firstStartedPulling="2026-03-13 15:23:43.375442474 +0000 UTC m=+1253.538654295" lastFinishedPulling="2026-03-13 15:24:03.872986828 +0000 UTC m=+1274.036198639" observedRunningTime="2026-03-13 15:24:05.893761227 +0000 UTC m=+1276.056973058" watchObservedRunningTime="2026-03-13 15:24:05.895201693 +0000 UTC m=+1276.058413504" Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.162595 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-czchf"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.183714 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67754df655-75x79"]
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184179 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8803f262-0a85-48aa-86f4-b01a2fc3692b" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184204 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8803f262-0a85-48aa-86f4-b01a2fc3692b" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184221 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb4e25a-01bd-46b0-8029-dde07c5bcfca" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184229 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb4e25a-01bd-46b0-8029-dde07c5bcfca" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184244 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184251 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184265 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aecf238-05ea-4d9c-8cac-58a9aaa8490d" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184274 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aecf238-05ea-4d9c-8cac-58a9aaa8490d" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184291 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a874a7-d303-4a3a-b765-5d3316ad5c2b" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184299 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a874a7-d303-4a3a-b765-5d3316ad5c2b" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184312 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3" containerName="oc"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184319 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3" containerName="oc"
Mar 13 15:24:06 crc kubenswrapper[4786]: E0313 15:24:06.184329 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50342ecd-90d8-411c-aef8-53c0337267a9" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184337 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="50342ecd-90d8-411c-aef8-53c0337267a9" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184531 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb4e25a-01bd-46b0-8029-dde07c5bcfca" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184549 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8803f262-0a85-48aa-86f4-b01a2fc3692b" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184568 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="50342ecd-90d8-411c-aef8-53c0337267a9" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184577 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aecf238-05ea-4d9c-8cac-58a9aaa8490d" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184587 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3" containerName="oc"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184610 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" containerName="mariadb-account-create-update"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.184624 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a874a7-d303-4a3a-b765-5d3316ad5c2b" containerName="mariadb-database-create"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.185760 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.187673 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.207756 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-75x79"]
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.258994 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsgtj\" (UniqueName: \"kubernetes.io/projected/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3-kube-api-access-qsgtj\") pod \"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3\" (UID: \"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3\") "
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.263622 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3-kube-api-access-qsgtj" (OuterVolumeSpecName: "kube-api-access-qsgtj") pod "48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3" (UID: "48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3"). InnerVolumeSpecName "kube-api-access-qsgtj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361090 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361137 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361159 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-svc\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361198 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-config\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361408 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvb7\" (UniqueName: \"kubernetes.io/projected/498308e1-0d01-4899-8b74-dd393b4cad4b-kube-api-access-vvvb7\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361555 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.361722 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsgtj\" (UniqueName: \"kubernetes.io/projected/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3-kube-api-access-qsgtj\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.463508 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.463575 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.463609 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-svc\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.463679 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-config\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.463786 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvvb7\" (UniqueName: \"kubernetes.io/projected/498308e1-0d01-4899-8b74-dd393b4cad4b-kube-api-access-vvvb7\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.463862 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.464647 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-svc\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.464653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-nb\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.464862 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-swift-storage-0\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.465024 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-config\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.465260 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-sb\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.482636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvvb7\" (UniqueName: \"kubernetes.io/projected/498308e1-0d01-4899-8b74-dd393b4cad4b-kube-api-access-vvvb7\") pod \"dnsmasq-dns-67754df655-75x79\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.557239 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.865037 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b29131e-a3b3-45c1-945c-b28cdfff2773" containerID="321bb7aa80b43c241267fd5f31403d05a9a52408255cd6c724b195c5bb6ef716" exitCode=0
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.865127 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n6qr2" event={"ID":"2b29131e-a3b3-45c1-945c-b28cdfff2773","Type":"ContainerDied","Data":"321bb7aa80b43c241267fd5f31403d05a9a52408255cd6c724b195c5bb6ef716"}
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.867140 4786 generic.go:334] "Generic (PLEG): container finished" podID="d6adeaf7-f94a-4c32-9069-822bcb8d31b8" containerID="c3378ad848becf43c536e9376944ca88cfa4805b8b8bfd9f3012155f1ebc4f1c" exitCode=0
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.867215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zg25m" event={"ID":"d6adeaf7-f94a-4c32-9069-822bcb8d31b8","Type":"ContainerDied","Data":"c3378ad848becf43c536e9376944ca88cfa4805b8b8bfd9f3012155f1ebc4f1c"}
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.868686 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-czchf"
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.868676 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-czchf" event={"ID":"48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3","Type":"ContainerDied","Data":"7a5eab161b1c5d0ecf52350bbdb68dd45a66fee6f2ca4332c1c93b511e968ba5"}
Mar 13 15:24:06 crc kubenswrapper[4786]: I0313 15:24:06.868846 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5eab161b1c5d0ecf52350bbdb68dd45a66fee6f2ca4332c1c93b511e968ba5"
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.005076 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67754df655-75x79"]
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.225595 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-kdwgv"]
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.233383 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-kdwgv"]
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.868223 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.868316 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.878440 4786 generic.go:334] "Generic (PLEG): container finished" podID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerID="d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5" exitCode=0
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.878486 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-75x79" event={"ID":"498308e1-0d01-4899-8b74-dd393b4cad4b","Type":"ContainerDied","Data":"d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5"}
Mar 13 15:24:07 crc kubenswrapper[4786]: I0313 15:24:07.878557 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-75x79" event={"ID":"498308e1-0d01-4899-8b74-dd393b4cad4b","Type":"ContainerStarted","Data":"bfdf1e66f0919b5c902725e210399772ebf43998736f5d16fa57ae5a6f54aea1"}
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.224711 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.310469 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zg25m"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402104 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-db-sync-config-data\") pod \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402216 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-combined-ca-bundle\") pod \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402271 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-combined-ca-bundle\") pod \"2b29131e-a3b3-45c1-945c-b28cdfff2773\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402304 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzg4p\" (UniqueName: \"kubernetes.io/projected/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-kube-api-access-vzg4p\") pod \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402388 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-config-data\") pod \"2b29131e-a3b3-45c1-945c-b28cdfff2773\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402455 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-config-data\") pod \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\" (UID: \"d6adeaf7-f94a-4c32-9069-822bcb8d31b8\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.402478 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67n7c\" (UniqueName: \"kubernetes.io/projected/2b29131e-a3b3-45c1-945c-b28cdfff2773-kube-api-access-67n7c\") pod \"2b29131e-a3b3-45c1-945c-b28cdfff2773\" (UID: \"2b29131e-a3b3-45c1-945c-b28cdfff2773\") "
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.408750 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b29131e-a3b3-45c1-945c-b28cdfff2773-kube-api-access-67n7c" (OuterVolumeSpecName: "kube-api-access-67n7c") pod "2b29131e-a3b3-45c1-945c-b28cdfff2773" (UID: "2b29131e-a3b3-45c1-945c-b28cdfff2773"). InnerVolumeSpecName "kube-api-access-67n7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.409225 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-kube-api-access-vzg4p" (OuterVolumeSpecName: "kube-api-access-vzg4p") pod "d6adeaf7-f94a-4c32-9069-822bcb8d31b8" (UID: "d6adeaf7-f94a-4c32-9069-822bcb8d31b8"). InnerVolumeSpecName "kube-api-access-vzg4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.411151 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d6adeaf7-f94a-4c32-9069-822bcb8d31b8" (UID: "d6adeaf7-f94a-4c32-9069-822bcb8d31b8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.429101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b29131e-a3b3-45c1-945c-b28cdfff2773" (UID: "2b29131e-a3b3-45c1-945c-b28cdfff2773"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.435644 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6adeaf7-f94a-4c32-9069-822bcb8d31b8" (UID: "d6adeaf7-f94a-4c32-9069-822bcb8d31b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.450138 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-config-data" (OuterVolumeSpecName: "config-data") pod "d6adeaf7-f94a-4c32-9069-822bcb8d31b8" (UID: "d6adeaf7-f94a-4c32-9069-822bcb8d31b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.460851 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-config-data" (OuterVolumeSpecName: "config-data") pod "2b29131e-a3b3-45c1-945c-b28cdfff2773" (UID: "2b29131e-a3b3-45c1-945c-b28cdfff2773"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504259 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504291 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504300 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzg4p\" (UniqueName: \"kubernetes.io/projected/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-kube-api-access-vzg4p\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504311 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b29131e-a3b3-45c1-945c-b28cdfff2773-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504320 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504328 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67n7c\" (UniqueName: \"kubernetes.io/projected/2b29131e-a3b3-45c1-945c-b28cdfff2773-kube-api-access-67n7c\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.504336 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6adeaf7-f94a-4c32-9069-822bcb8d31b8-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.561868 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b51b465-3b1c-48de-8f8c-672f96eb07c9" path="/var/lib/kubelet/pods/7b51b465-3b1c-48de-8f8c-672f96eb07c9/volumes"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.893350 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n6qr2" event={"ID":"2b29131e-a3b3-45c1-945c-b28cdfff2773","Type":"ContainerDied","Data":"c021e341a62768733b4024c3b74fc68a180ded77e89adc8396493aca3bca6b65"}
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.893391 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c021e341a62768733b4024c3b74fc68a180ded77e89adc8396493aca3bca6b65"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.895374 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n6qr2"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.897350 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zg25m"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.897472 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zg25m" event={"ID":"d6adeaf7-f94a-4c32-9069-822bcb8d31b8","Type":"ContainerDied","Data":"d7409927243a6a4975527cda43da1c4577f92f4435984931a5019e1f7df83318"}
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.897505 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7409927243a6a4975527cda43da1c4577f92f4435984931a5019e1f7df83318"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.902558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-75x79" event={"ID":"498308e1-0d01-4899-8b74-dd393b4cad4b","Type":"ContainerStarted","Data":"995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a"}
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.903136 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67754df655-75x79"
Mar 13 15:24:08 crc kubenswrapper[4786]: I0313 15:24:08.946622 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67754df655-75x79" podStartSLOduration=2.946605625 podStartE2EDuration="2.946605625s" podCreationTimestamp="2026-03-13 15:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:08.933887925 +0000 UTC m=+1279.097099736" watchObservedRunningTime="2026-03-13 15:24:08.946605625 +0000 UTC m=+1279.109817436"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.164806 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-75x79"]
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.233777 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r5jfk"]
Mar 13 15:24:09 crc kubenswrapper[4786]: E0313 15:24:09.234350 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6adeaf7-f94a-4c32-9069-822bcb8d31b8" containerName="glance-db-sync"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.234377 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6adeaf7-f94a-4c32-9069-822bcb8d31b8" containerName="glance-db-sync"
Mar 13 15:24:09 crc kubenswrapper[4786]: E0313 15:24:09.234412 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b29131e-a3b3-45c1-945c-b28cdfff2773" containerName="keystone-db-sync"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.234422 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b29131e-a3b3-45c1-945c-b28cdfff2773" containerName="keystone-db-sync"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.234645 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b29131e-a3b3-45c1-945c-b28cdfff2773" containerName="keystone-db-sync"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.234667 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6adeaf7-f94a-4c32-9069-822bcb8d31b8" containerName="glance-db-sync"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.235364 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.238650 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.238742 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.238818 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.238650 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.239197 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dsq8r"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.262141 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-wvqj6"]
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.263846 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.279611 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r5jfk"]
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.301396 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-wvqj6"]
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.321903 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt296\" (UniqueName: \"kubernetes.io/projected/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-kube-api-access-vt296\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.322085 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-config-data\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.322119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-credential-keys\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.322159 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-scripts\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.322201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-combined-ca-bundle\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.322243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-fernet-keys\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.401359 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nwdnc"]
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.402436 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nwdnc"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.404932 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.405098 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5vs2w"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.405207 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.418690 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nwdnc"]
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423335 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgjw7\" (UniqueName: \"kubernetes.io/projected/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-kube-api-access-jgjw7\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt296\" (UniqueName: \"kubernetes.io/projected/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-kube-api-access-vt296\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-config\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423483 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-sb\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423519 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-config-data\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423549 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-credential-keys\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-scripts\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423624 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-svc\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423667 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-swift-storage-0\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423690 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-combined-ca-bundle\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-fernet-keys\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.423761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-nb\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.437551 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-credential-keys\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.438536 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-scripts\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.440453 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-fernet-keys\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.442562 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-config-data\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.444228 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-combined-ca-bundle\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.453648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt296\" (UniqueName: \"kubernetes.io/projected/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-kube-api-access-vt296\") pod \"keystone-bootstrap-r5jfk\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.512916 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-wvqj6"]
Mar 13 15:24:09 crc kubenswrapper[4786]: E0313 15:24:09.515991 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted
volumes=[config dns-svc dns-swift-storage-0 kube-api-access-jgjw7 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" podUID="f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526742 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-db-sync-config-data\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526795 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-combined-ca-bundle\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526817 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-config-data\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526865 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-svc\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526912 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-etc-machine-id\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526943 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-swift-storage-0\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.526988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-nb\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.527025 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-scripts\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.527055 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgjw7\" (UniqueName: \"kubernetes.io/projected/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-kube-api-access-jgjw7\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.527078 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-config\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.527093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4x57\" (UniqueName: \"kubernetes.io/projected/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-kube-api-access-r4x57\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.527126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-sb\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.528223 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-sb\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.528740 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-config\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.528739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.529303 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-svc\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.529428 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-swift-storage-0\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.563295 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r5jfk" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.578663 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7tr8p"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.580958 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.605765 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n7hpw" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.606050 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.607202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgjw7\" (UniqueName: \"kubernetes.io/projected/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-kube-api-access-jgjw7\") pod \"dnsmasq-dns-b9fb8978c-wvqj6\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.610836 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7tr8p"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-scripts\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633407 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4x57\" (UniqueName: \"kubernetes.io/projected/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-kube-api-access-r4x57\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633686 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-db-sync-config-data\") pod \"cinder-db-sync-nwdnc\" (UID: 
\"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633718 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-combined-ca-bundle\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633748 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-config-data\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633818 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-etc-machine-id\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.633983 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-etc-machine-id\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.649186 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.652428 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-combined-ca-bundle\") pod 
\"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.657453 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-db-sync-config-data\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.658912 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.664100 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-scripts\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.669864 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-config-data\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.684946 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.696435 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4x57\" (UniqueName: \"kubernetes.io/projected/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-kube-api-access-r4x57\") pod \"cinder-db-sync-nwdnc\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.725958 4786 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.737834 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-combined-ca-bundle\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.737899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxkk\" (UniqueName: \"kubernetes.io/projected/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-kube-api-access-qmxkk\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.738022 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-db-sync-config-data\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.741646 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qqx9f"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.768223 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.772070 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.781835 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.782065 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gqv7f" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.782688 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.783411 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.790525 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.790851 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.816792 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846460 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846534 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-db-sync-config-data\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " 
pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846582 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-combined-ca-bundle\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxkk\" (UniqueName: \"kubernetes.io/projected/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-kube-api-access-qmxkk\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846649 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-config\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " 
pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846789 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkh5\" (UniqueName: \"kubernetes.io/projected/39a00422-6428-4d2c-bae7-0389d36d7492-kube-api-access-7nkh5\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.846849 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.856419 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qqx9f"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.862535 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-db-sync-config-data\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.875427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-combined-ca-bundle\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.914944 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ft8mk"] Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 
15:24:09.916230 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.917503 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxkk\" (UniqueName: \"kubernetes.io/projected/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-kube-api-access-qmxkk\") pod \"barbican-db-sync-7tr8p\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.921514 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ltrgl" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.922199 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.922391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.929818 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh"] Mar 13 15:24:09 crc kubenswrapper[4786]: E0313 15:24:09.930603 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-7nkh5 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" podUID="39a00422-6428-4d2c-bae7-0389d36d7492" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.952543 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: 
I0313 15:24:09.957956 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-svc\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.962760 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.967930 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968067 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968111 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-config\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968160 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-run-httpd\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkh5\" (UniqueName: \"kubernetes.io/projected/39a00422-6428-4d2c-bae7-0389d36d7492-kube-api-access-7nkh5\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clc8z\" (UniqueName: \"kubernetes.io/projected/5fb2cca9-aeef-4bce-9307-02429aae556d-kube-api-access-clc8z\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968360 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-log-httpd\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-combined-ca-bundle\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968483 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-config-data\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968519 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968552 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-config\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968596 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-scripts\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968623 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968679 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.968802 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpjc\" (UniqueName: \"kubernetes.io/projected/ed27af11-89ca-4d9c-b654-800740dfc742-kube-api-access-rlpjc\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.969701 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-config\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.970728 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.970783 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:09 crc kubenswrapper[4786]: I0313 15:24:09.972958 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ft8mk"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.018267 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.022935 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkh5\" (UniqueName: \"kubernetes.io/projected/39a00422-6428-4d2c-bae7-0389d36d7492-kube-api-access-7nkh5\") pod \"dnsmasq-dns-5cb4dcfdd7-z7kgh\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.066504 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.078801 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clc8z\" (UniqueName: \"kubernetes.io/projected/5fb2cca9-aeef-4bce-9307-02429aae556d-kube-api-access-clc8z\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.078969 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-log-httpd\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-combined-ca-bundle\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079106 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-combined-ca-bundle\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-config-data\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079163 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-config-data\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079234 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-logs\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-config\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079317 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-scripts\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " 
pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079487 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-scripts\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079524 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpjc\" (UniqueName: \"kubernetes.io/projected/ed27af11-89ca-4d9c-b654-800740dfc742-kube-api-access-rlpjc\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079614 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d96hw\" (UniqueName: \"kubernetes.io/projected/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-kube-api-access-d96hw\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079642 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.079691 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-run-httpd\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.085450 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-config\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.099653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-scripts\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.100526 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-run-httpd\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.101269 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-log-httpd\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.118813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 
13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.119739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-config-data\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.121188 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-bl29c"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.139608 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.140527 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.155993 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-combined-ca-bundle\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.158808 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-bl29c"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.161319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clc8z\" (UniqueName: \"kubernetes.io/projected/5fb2cca9-aeef-4bce-9307-02429aae556d-kube-api-access-clc8z\") pod \"ceilometer-0\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.174186 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpjc\" (UniqueName: \"kubernetes.io/projected/ed27af11-89ca-4d9c-b654-800740dfc742-kube-api-access-rlpjc\") pod \"neutron-db-sync-qqx9f\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.182539 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240583 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-scripts\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240657 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hpq\" (UniqueName: \"kubernetes.io/projected/91915175-6b85-4031-b7ba-6da229b76766-kube-api-access-n9hpq\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d96hw\" (UniqueName: \"kubernetes.io/projected/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-kube-api-access-d96hw\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240798 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: 
\"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240931 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-combined-ca-bundle\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240958 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-config-data\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.240981 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.241011 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.241035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-logs\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " 
pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.241067 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-svc\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.241116 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-config\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.244549 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-logs\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.246162 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-combined-ca-bundle\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.249037 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-config-data\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.256332 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-scripts\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.282615 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d96hw\" (UniqueName: \"kubernetes.io/projected/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-kube-api-access-d96hw\") pod \"placement-db-sync-ft8mk\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.341711 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.341754 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.341776 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-svc\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.341802 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-config\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.341855 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hpq\" (UniqueName: \"kubernetes.io/projected/91915175-6b85-4031-b7ba-6da229b76766-kube-api-access-n9hpq\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.341916 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.346745 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-swift-storage-0\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.347939 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-svc\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.350739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-sb\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.356784 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-config\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.373669 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hpq\" (UniqueName: \"kubernetes.io/projected/91915175-6b85-4031-b7ba-6da229b76766-kube-api-access-n9hpq\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.388677 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-nb\") pod \"dnsmasq-dns-759cc7f497-bl29c\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.413043 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.414913 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.431545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.431783 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.431949 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cd7b7" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.450885 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.456159 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.487257 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.514634 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r5jfk"] Mar 13 15:24:10 crc kubenswrapper[4786]: W0313 15:24:10.524784 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c380742_07d9_4f6f_a8c4_3e8f3593c3fa.slice/crio-277527eb80c7b0fd06be7a7b3ab64a32a731d0dad98a5b336432d8fa260f76e4 WatchSource:0}: Error finding container 277527eb80c7b0fd06be7a7b3ab64a32a731d0dad98a5b336432d8fa260f76e4: Status 404 returned error can't find the container with id 277527eb80c7b0fd06be7a7b3ab64a32a731d0dad98a5b336432d8fa260f76e4 Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.529776 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.542593 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.544519 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-config-data\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.544689 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq59g\" (UniqueName: \"kubernetes.io/projected/e937b394-6278-4f37-b898-9f7bf592a43c-kube-api-access-bq59g\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.544815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.544970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.545040 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-scripts\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.545119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.545192 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-logs\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.598463 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.602663 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.602810 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.605734 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.649497 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgjw7\" (UniqueName: \"kubernetes.io/projected/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-kube-api-access-jgjw7\") pod \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.650022 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-swift-storage-0\") pod \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.650089 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-config\") pod \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.650251 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-sb\") pod \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.650336 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-nb\") pod \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\" (UID: 
\"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.650406 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-svc\") pod \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\" (UID: \"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4\") " Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.650891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651072 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-scripts\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651130 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651182 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-logs\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651209 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" (UID: "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651431 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-config-data\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq59g\" (UniqueName: \"kubernetes.io/projected/e937b394-6278-4f37-b898-9f7bf592a43c-kube-api-access-bq59g\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" (UID: "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651635 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.651659 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-config" (OuterVolumeSpecName: "config") pod "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" (UID: "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.652270 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.652520 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-logs\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.653532 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" (UID: "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.653885 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" (UID: "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.654321 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.660986 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.663964 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-config-data\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.665481 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-kube-api-access-jgjw7" (OuterVolumeSpecName: "kube-api-access-jgjw7") pod "f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" (UID: 
"f4ef63c9-d6d3-4885-a4ac-ff54415c25f4"). InnerVolumeSpecName "kube-api-access-jgjw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.667206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-scripts\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.694834 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq59g\" (UniqueName: \"kubernetes.io/projected/e937b394-6278-4f37-b898-9f7bf592a43c-kube-api-access-bq59g\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.720022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752744 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752829 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752913 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvjh\" (UniqueName: \"kubernetes.io/projected/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-kube-api-access-xfvjh\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.752953 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.753011 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgjw7\" (UniqueName: \"kubernetes.io/projected/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-kube-api-access-jgjw7\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.753026 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.753038 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.753049 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.753060 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.759069 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nwdnc"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.781440 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7tr8p"] Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.855493 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.855687 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.855791 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.857967 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvjh\" (UniqueName: \"kubernetes.io/projected/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-kube-api-access-xfvjh\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.858068 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.858110 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.858149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.858456 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.858622 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.858835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.863019 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.865908 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.888167 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvjh\" (UniqueName: \"kubernetes.io/projected/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-kube-api-access-xfvjh\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.904564 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.905860 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.908562 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: 
I0313 15:24:10.939298 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:10 crc kubenswrapper[4786]: I0313 15:24:10.949629 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qqx9f"] Mar 13 15:24:10 crc kubenswrapper[4786]: W0313 15:24:10.967617 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded27af11_89ca_4d9c_b654_800740dfc742.slice/crio-a619a64e38dc5d974d66ed292bf67568d185dc6a3e6a611c7d663734ae77d97f WatchSource:0}: Error finding container a619a64e38dc5d974d66ed292bf67568d185dc6a3e6a611c7d663734ae77d97f: Status 404 returned error can't find the container with id a619a64e38dc5d974d66ed292bf67568d185dc6a3e6a611c7d663734ae77d97f Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.115365 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqx9f" event={"ID":"ed27af11-89ca-4d9c-b654-800740dfc742","Type":"ContainerStarted","Data":"a619a64e38dc5d974d66ed292bf67568d185dc6a3e6a611c7d663734ae77d97f"} Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.128112 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nwdnc" event={"ID":"fb8a98ec-82a3-418d-82ea-d0ff210dd78d","Type":"ContainerStarted","Data":"7a6f4224455dd3968dbd67b1dadfe6f1f8e5753a7bda2b9879fe9aea34f9679e"} Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.154297 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ft8mk"] Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.161396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r5jfk" event={"ID":"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa","Type":"ContainerStarted","Data":"b288097f55569918eb38ccdfc6bf25211a56db02df55dbab27d3c330c057a809"} Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.161442 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r5jfk" event={"ID":"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa","Type":"ContainerStarted","Data":"277527eb80c7b0fd06be7a7b3ab64a32a731d0dad98a5b336432d8fa260f76e4"} Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.166109 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.166686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7tr8p" event={"ID":"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118","Type":"ContainerStarted","Data":"800f75f62917be4211d5d60a245f29539c7d9f91f590687bd6c07ff1753ca4c7"} Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.166723 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9fb8978c-wvqj6" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.167693 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.167969 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67754df655-75x79" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerName="dnsmasq-dns" containerID="cri-o://995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a" gracePeriod=10 Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.220828 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r5jfk" podStartSLOduration=2.220809684 podStartE2EDuration="2.220809684s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:11.190668016 +0000 UTC m=+1281.353879827" watchObservedRunningTime="2026-03-13 15:24:11.220809684 +0000 UTC m=+1281.384021495" Mar 13 
15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.300508 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.306471 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-wvqj6"] Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.318306 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9fb8978c-wvqj6"] Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.344700 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-bl29c"] Mar 13 15:24:11 crc kubenswrapper[4786]: W0313 15:24:11.355781 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91915175_6b85_4031_b7ba_6da229b76766.slice/crio-cbfda679a3e4904761ee7ffc3ad75a9c682d33e272bc63f5b3a3b8fabdf97086 WatchSource:0}: Error finding container cbfda679a3e4904761ee7ffc3ad75a9c682d33e272bc63f5b3a3b8fabdf97086: Status 404 returned error can't find the container with id cbfda679a3e4904761ee7ffc3ad75a9c682d33e272bc63f5b3a3b8fabdf97086 Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.366671 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-sb\") pod \"39a00422-6428-4d2c-bae7-0389d36d7492\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.366734 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-svc\") pod \"39a00422-6428-4d2c-bae7-0389d36d7492\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.366751 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-config\") pod \"39a00422-6428-4d2c-bae7-0389d36d7492\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.366821 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-nb\") pod \"39a00422-6428-4d2c-bae7-0389d36d7492\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.366909 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-swift-storage-0\") pod \"39a00422-6428-4d2c-bae7-0389d36d7492\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.366973 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nkh5\" (UniqueName: \"kubernetes.io/projected/39a00422-6428-4d2c-bae7-0389d36d7492-kube-api-access-7nkh5\") pod \"39a00422-6428-4d2c-bae7-0389d36d7492\" (UID: \"39a00422-6428-4d2c-bae7-0389d36d7492\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.367342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39a00422-6428-4d2c-bae7-0389d36d7492" (UID: "39a00422-6428-4d2c-bae7-0389d36d7492"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.367412 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-config" (OuterVolumeSpecName: "config") pod "39a00422-6428-4d2c-bae7-0389d36d7492" (UID: "39a00422-6428-4d2c-bae7-0389d36d7492"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.368135 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39a00422-6428-4d2c-bae7-0389d36d7492" (UID: "39a00422-6428-4d2c-bae7-0389d36d7492"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.368646 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39a00422-6428-4d2c-bae7-0389d36d7492" (UID: "39a00422-6428-4d2c-bae7-0389d36d7492"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.373156 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39a00422-6428-4d2c-bae7-0389d36d7492" (UID: "39a00422-6428-4d2c-bae7-0389d36d7492"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.374200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a00422-6428-4d2c-bae7-0389d36d7492-kube-api-access-7nkh5" (OuterVolumeSpecName: "kube-api-access-7nkh5") pod "39a00422-6428-4d2c-bae7-0389d36d7492" (UID: "39a00422-6428-4d2c-bae7-0389d36d7492"). InnerVolumeSpecName "kube-api-access-7nkh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.469263 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.469747 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.469848 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.469973 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.470029 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a00422-6428-4d2c-bae7-0389d36d7492-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.470092 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nkh5\" (UniqueName: 
\"kubernetes.io/projected/39a00422-6428-4d2c-bae7-0389d36d7492-kube-api-access-7nkh5\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.705059 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-75x79" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.769691 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.774018 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-config\") pod \"498308e1-0d01-4899-8b74-dd393b4cad4b\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.774061 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-swift-storage-0\") pod \"498308e1-0d01-4899-8b74-dd393b4cad4b\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.774268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-svc\") pod \"498308e1-0d01-4899-8b74-dd393b4cad4b\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.774307 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvvb7\" (UniqueName: \"kubernetes.io/projected/498308e1-0d01-4899-8b74-dd393b4cad4b-kube-api-access-vvvb7\") pod \"498308e1-0d01-4899-8b74-dd393b4cad4b\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.774384 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-sb\") pod \"498308e1-0d01-4899-8b74-dd393b4cad4b\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.774441 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-nb\") pod \"498308e1-0d01-4899-8b74-dd393b4cad4b\" (UID: \"498308e1-0d01-4899-8b74-dd393b4cad4b\") " Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.802138 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498308e1-0d01-4899-8b74-dd393b4cad4b-kube-api-access-vvvb7" (OuterVolumeSpecName: "kube-api-access-vvvb7") pod "498308e1-0d01-4899-8b74-dd393b4cad4b" (UID: "498308e1-0d01-4899-8b74-dd393b4cad4b"). InnerVolumeSpecName "kube-api-access-vvvb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.857050 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "498308e1-0d01-4899-8b74-dd393b4cad4b" (UID: "498308e1-0d01-4899-8b74-dd393b4cad4b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.878533 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.878580 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvvb7\" (UniqueName: \"kubernetes.io/projected/498308e1-0d01-4899-8b74-dd393b4cad4b-kube-api-access-vvvb7\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.879877 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "498308e1-0d01-4899-8b74-dd393b4cad4b" (UID: "498308e1-0d01-4899-8b74-dd393b4cad4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.897676 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.908340 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "498308e1-0d01-4899-8b74-dd393b4cad4b" (UID: "498308e1-0d01-4899-8b74-dd393b4cad4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.931515 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "498308e1-0d01-4899-8b74-dd393b4cad4b" (UID: "498308e1-0d01-4899-8b74-dd393b4cad4b"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.931824 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-config" (OuterVolumeSpecName: "config") pod "498308e1-0d01-4899-8b74-dd393b4cad4b" (UID: "498308e1-0d01-4899-8b74-dd393b4cad4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.980633 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.980942 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.980960 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:11 crc kubenswrapper[4786]: I0313 15:24:11.980973 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/498308e1-0d01-4899-8b74-dd393b4cad4b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.178355 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.198808 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.236192 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"c9a088f5-c8ac-4467-95ce-aaefb7df9f76","Type":"ContainerStarted","Data":"95e226472fbe14308d7346a56324c16fc717fec6530b68149369d6a88a2dd2f0"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.239004 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqx9f" event={"ID":"ed27af11-89ca-4d9c-b654-800740dfc742","Type":"ContainerStarted","Data":"99d73e1c554b9b1b11403cbffe4ce8b73fd05d82195bf4be2a43decef93be656"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.243126 4786 generic.go:334] "Generic (PLEG): container finished" podID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerID="995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a" exitCode=0 Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.243199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-75x79" event={"ID":"498308e1-0d01-4899-8b74-dd393b4cad4b","Type":"ContainerDied","Data":"995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.243221 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67754df655-75x79" event={"ID":"498308e1-0d01-4899-8b74-dd393b4cad4b","Type":"ContainerDied","Data":"bfdf1e66f0919b5c902725e210399772ebf43998736f5d16fa57ae5a6f54aea1"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.243236 4786 scope.go:117] "RemoveContainer" containerID="995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.243378 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67754df655-75x79" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.258591 4786 generic.go:334] "Generic (PLEG): container finished" podID="91915175-6b85-4031-b7ba-6da229b76766" containerID="250c2eefdad7b0f67f6e4e296986c9f2f5f788bae99d2a0556226e954301a48d" exitCode=0 Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.258660 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" event={"ID":"91915175-6b85-4031-b7ba-6da229b76766","Type":"ContainerDied","Data":"250c2eefdad7b0f67f6e4e296986c9f2f5f788bae99d2a0556226e954301a48d"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.258684 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" event={"ID":"91915175-6b85-4031-b7ba-6da229b76766","Type":"ContainerStarted","Data":"cbfda679a3e4904761ee7ffc3ad75a9c682d33e272bc63f5b3a3b8fabdf97086"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.266293 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.277791 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerStarted","Data":"a556a7fdcfb5675178fd50b08207e97dd42a775d6478b4d6d77c2a3d5a52dbd1"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.280263 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e937b394-6278-4f37-b898-9f7bf592a43c","Type":"ContainerStarted","Data":"42c670f7e20e04c49fdcf1064cecf4467cb027efb5b52e21fd64bc9b6653ab02"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.282015 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.282688 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qqx9f" podStartSLOduration=3.282664962 podStartE2EDuration="3.282664962s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:12.261369907 +0000 UTC m=+1282.424581718" watchObservedRunningTime="2026-03-13 15:24:12.282664962 +0000 UTC m=+1282.445876773" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.282926 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ft8mk" event={"ID":"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae","Type":"ContainerStarted","Data":"47f58c4d0f9305cea33b36ff5a5a0b8436f38a2e30991bc881d7cc465e0e6aa8"} Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.338609 4786 scope.go:117] "RemoveContainer" containerID="d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.384241 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67754df655-75x79"] Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.395058 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67754df655-75x79"] Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.509107 4786 scope.go:117] "RemoveContainer" containerID="995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a" Mar 13 15:24:12 crc kubenswrapper[4786]: E0313 15:24:12.525045 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a\": container with ID starting with 995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a not found: ID 
does not exist" containerID="995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.525100 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a"} err="failed to get container status \"995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a\": rpc error: code = NotFound desc = could not find container \"995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a\": container with ID starting with 995d51c852283391badd3c843576c5d9c97f017623bf7113d4313c12dbee214a not found: ID does not exist" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.525135 4786 scope.go:117] "RemoveContainer" containerID="d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5" Mar 13 15:24:12 crc kubenswrapper[4786]: E0313 15:24:12.528037 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5\": container with ID starting with d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5 not found: ID does not exist" containerID="d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.528079 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5"} err="failed to get container status \"d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5\": rpc error: code = NotFound desc = could not find container \"d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5\": container with ID starting with d87f9ed54dfd83435737602d2d51c73959b29a27e0b72e5c13229cf5d101d4e5 not found: ID does not exist" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.542934 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh"] Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.567923 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" path="/var/lib/kubelet/pods/498308e1-0d01-4899-8b74-dd393b4cad4b/volumes" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.568991 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ef63c9-d6d3-4885-a4ac-ff54415c25f4" path="/var/lib/kubelet/pods/f4ef63c9-d6d3-4885-a4ac-ff54415c25f4/volumes" Mar 13 15:24:12 crc kubenswrapper[4786]: I0313 15:24:12.570630 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb4dcfdd7-z7kgh"] Mar 13 15:24:13 crc kubenswrapper[4786]: I0313 15:24:13.297813 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" event={"ID":"91915175-6b85-4031-b7ba-6da229b76766","Type":"ContainerStarted","Data":"efa8ceb87e42c5af0c21bddf580fc49cf21cb5991d662bea5fc37db325d0c177"} Mar 13 15:24:13 crc kubenswrapper[4786]: I0313 15:24:13.299602 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:13 crc kubenswrapper[4786]: I0313 15:24:13.302718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e937b394-6278-4f37-b898-9f7bf592a43c","Type":"ContainerStarted","Data":"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"} Mar 13 15:24:13 crc kubenswrapper[4786]: I0313 15:24:13.313048 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a088f5-c8ac-4467-95ce-aaefb7df9f76","Type":"ContainerStarted","Data":"5d1eb731fe90a2d7967613024c86cdc88087f71bf2c0f7ac579978edd63efcfb"} Mar 13 15:24:13 crc kubenswrapper[4786]: I0313 15:24:13.324383 4786 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" podStartSLOduration=4.324365973 podStartE2EDuration="4.324365973s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:13.321137852 +0000 UTC m=+1283.484349663" watchObservedRunningTime="2026-03-13 15:24:13.324365973 +0000 UTC m=+1283.487577784" Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.338729 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e937b394-6278-4f37-b898-9f7bf592a43c","Type":"ContainerStarted","Data":"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"} Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.338851 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-httpd" containerID="cri-o://a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229" gracePeriod=30 Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.338813 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-log" containerID="cri-o://8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb" gracePeriod=30 Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.347390 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a088f5-c8ac-4467-95ce-aaefb7df9f76","Type":"ContainerStarted","Data":"e063f03566777526887cc68e28abb25b31c947cbed49669016b5a1af97607695"} Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.347632 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-log" containerID="cri-o://5d1eb731fe90a2d7967613024c86cdc88087f71bf2c0f7ac579978edd63efcfb" gracePeriod=30 Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.347825 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-httpd" containerID="cri-o://e063f03566777526887cc68e28abb25b31c947cbed49669016b5a1af97607695" gracePeriod=30 Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.409603 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.409583599 podStartE2EDuration="5.409583599s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:14.380132839 +0000 UTC m=+1284.543344650" watchObservedRunningTime="2026-03-13 15:24:14.409583599 +0000 UTC m=+1284.572795400" Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.412494 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.412488852 podStartE2EDuration="5.412488852s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:14.402608334 +0000 UTC m=+1284.565820145" watchObservedRunningTime="2026-03-13 15:24:14.412488852 +0000 UTC m=+1284.575700663" Mar 13 15:24:14 crc kubenswrapper[4786]: I0313 15:24:14.567991 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a00422-6428-4d2c-bae7-0389d36d7492" path="/var/lib/kubelet/pods/39a00422-6428-4d2c-bae7-0389d36d7492/volumes" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.075420 4786 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262184 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-scripts\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262247 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq59g\" (UniqueName: \"kubernetes.io/projected/e937b394-6278-4f37-b898-9f7bf592a43c-kube-api-access-bq59g\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262279 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-config-data\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262303 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-logs\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262329 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-httpd-run\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262357 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-combined-ca-bundle\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.262417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e937b394-6278-4f37-b898-9f7bf592a43c\" (UID: \"e937b394-6278-4f37-b898-9f7bf592a43c\") " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.270276 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.270468 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-logs" (OuterVolumeSpecName: "logs") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.293154 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-scripts" (OuterVolumeSpecName: "scripts") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.293893 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e937b394-6278-4f37-b898-9f7bf592a43c-kube-api-access-bq59g" (OuterVolumeSpecName: "kube-api-access-bq59g") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "kube-api-access-bq59g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.293997 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.339102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.349756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-config-data" (OuterVolumeSpecName: "config-data") pod "e937b394-6278-4f37-b898-9f7bf592a43c" (UID: "e937b394-6278-4f37-b898-9f7bf592a43c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370006 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370090 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq59g\" (UniqueName: \"kubernetes.io/projected/e937b394-6278-4f37-b898-9f7bf592a43c-kube-api-access-bq59g\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370106 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370119 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370130 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e937b394-6278-4f37-b898-9f7bf592a43c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370141 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e937b394-6278-4f37-b898-9f7bf592a43c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.370172 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.379280 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerID="e063f03566777526887cc68e28abb25b31c947cbed49669016b5a1af97607695" exitCode=0 Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.379316 4786 generic.go:334] "Generic (PLEG): container finished" podID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerID="5d1eb731fe90a2d7967613024c86cdc88087f71bf2c0f7ac579978edd63efcfb" exitCode=143 Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.379393 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a088f5-c8ac-4467-95ce-aaefb7df9f76","Type":"ContainerDied","Data":"e063f03566777526887cc68e28abb25b31c947cbed49669016b5a1af97607695"} Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.379435 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a088f5-c8ac-4467-95ce-aaefb7df9f76","Type":"ContainerDied","Data":"5d1eb731fe90a2d7967613024c86cdc88087f71bf2c0f7ac579978edd63efcfb"} Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390425 4786 generic.go:334] "Generic (PLEG): container finished" podID="e937b394-6278-4f37-b898-9f7bf592a43c" containerID="a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229" exitCode=0 Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e937b394-6278-4f37-b898-9f7bf592a43c","Type":"ContainerDied","Data":"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"} Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390548 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e937b394-6278-4f37-b898-9f7bf592a43c","Type":"ContainerDied","Data":"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"} Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390495 4786 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390524 4786 generic.go:334] "Generic (PLEG): container finished" podID="e937b394-6278-4f37-b898-9f7bf592a43c" containerID="8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb" exitCode=143 Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390636 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e937b394-6278-4f37-b898-9f7bf592a43c","Type":"ContainerDied","Data":"42c670f7e20e04c49fdcf1064cecf4467cb027efb5b52e21fd64bc9b6653ab02"} Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.390688 4786 scope.go:117] "RemoveContainer" containerID="a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.398955 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.443091 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.453756 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.466538 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471472 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfvjh\" (UniqueName: \"kubernetes.io/projected/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-kube-api-access-xfvjh\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471531 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-scripts\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471663 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471707 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-combined-ca-bundle\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471752 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-httpd-run\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471821 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-logs\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.471935 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-config-data\") pod \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\" (UID: \"c9a088f5-c8ac-4467-95ce-aaefb7df9f76\") "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.472490 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.475424 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.479520 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-logs" (OuterVolumeSpecName: "logs") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.479765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.480738 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:24:15 crc kubenswrapper[4786]: E0313 15:24:15.481423 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-httpd"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.481531 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-httpd"
Mar 13 15:24:15 crc kubenswrapper[4786]: E0313 15:24:15.481614 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-httpd"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.481683 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-httpd"
Mar 13 15:24:15 crc kubenswrapper[4786]: E0313 15:24:15.481753 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-log"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.481813 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-log"
Mar 13 15:24:15 crc kubenswrapper[4786]: E0313 15:24:15.481900 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-log"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.481973 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-log"
Mar 13 15:24:15 crc kubenswrapper[4786]: E0313 15:24:15.482049 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerName="init"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.482113 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerName="init"
Mar 13 15:24:15 crc kubenswrapper[4786]: E0313 15:24:15.482588 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerName="dnsmasq-dns"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.482695 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerName="dnsmasq-dns"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.482967 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-httpd"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.483112 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="498308e1-0d01-4899-8b74-dd393b4cad4b" containerName="dnsmasq-dns"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.483189 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-httpd"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.483261 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" containerName="glance-log"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.483338 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" containerName="glance-log"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.484941 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.486950 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-scripts" (OuterVolumeSpecName: "scripts") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.487192 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-kube-api-access-xfvjh" (OuterVolumeSpecName: "kube-api-access-xfvjh") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "kube-api-access-xfvjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.489713 4786 scope.go:117] "RemoveContainer" containerID="8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.491471 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.509053 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.520875 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574784 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-config-data" (OuterVolumeSpecName: "config-data") pod "c9a088f5-c8ac-4467-95ce-aaefb7df9f76" (UID: "c9a088f5-c8ac-4467-95ce-aaefb7df9f76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574799 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574858 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574886 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfvjh\" (UniqueName: \"kubernetes.io/projected/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-kube-api-access-xfvjh\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574901 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574932 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.574953 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.599938 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.676779 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-config-data\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.676818 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.676887 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.678000 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-logs\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.678104 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.678189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-scripts\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.678224 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g7kf\" (UniqueName: \"kubernetes.io/projected/30b88754-9006-436b-b296-be043d373a34-kube-api-access-8g7kf\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.678313 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a088f5-c8ac-4467-95ce-aaefb7df9f76-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.678335 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780203 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-logs\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780281 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780321 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g7kf\" (UniqueName: \"kubernetes.io/projected/30b88754-9006-436b-b296-be043d373a34-kube-api-access-8g7kf\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780337 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-scripts\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780368 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-config-data\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780383 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780418 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.780760 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.781332 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-logs\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.781391 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.787305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.791090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-config-data\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.803915 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-scripts\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.810507 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g7kf\" (UniqueName: \"kubernetes.io/projected/30b88754-9006-436b-b296-be043d373a34-kube-api-access-8g7kf\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.827017 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:24:15 crc kubenswrapper[4786]: I0313 15:24:15.978293 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.403726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a088f5-c8ac-4467-95ce-aaefb7df9f76","Type":"ContainerDied","Data":"95e226472fbe14308d7346a56324c16fc717fec6530b68149369d6a88a2dd2f0"}
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.403766 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.408368 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" containerID="b288097f55569918eb38ccdfc6bf25211a56db02df55dbab27d3c330c057a809" exitCode=0
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.408412 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r5jfk" event={"ID":"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa","Type":"ContainerDied","Data":"b288097f55569918eb38ccdfc6bf25211a56db02df55dbab27d3c330c057a809"}
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.459894 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.479807 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.488448 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.491624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.496074 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.506728 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.562535 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a088f5-c8ac-4467-95ce-aaefb7df9f76" path="/var/lib/kubelet/pods/c9a088f5-c8ac-4467-95ce-aaefb7df9f76/volumes"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.563267 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e937b394-6278-4f37-b898-9f7bf592a43c" path="/var/lib/kubelet/pods/e937b394-6278-4f37-b898-9f7bf592a43c/volumes"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597206 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597323 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzj2v\" (UniqueName: \"kubernetes.io/projected/84776ef2-e0db-47aa-9135-e11f92c291b7-kube-api-access-rzj2v\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597398 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597444 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.597477 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699300 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699323 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699391 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699421 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzj2v\" (UniqueName: \"kubernetes.io/projected/84776ef2-e0db-47aa-9135-e11f92c291b7-kube-api-access-rzj2v\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.699467 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.700397 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.701323 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.704241 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.705391 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.706002 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.707796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.731208 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzj2v\" (UniqueName: \"kubernetes.io/projected/84776ef2-e0db-47aa-9135-e11f92c291b7-kube-api-access-rzj2v\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.743983 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:16 crc kubenswrapper[4786]: I0313 15:24:16.837991 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.893639 4786 scope.go:117] "RemoveContainer" containerID="a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"
Mar 13 15:24:17 crc kubenswrapper[4786]: E0313 15:24:17.895492 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229\": container with ID starting with a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229 not found: ID does not exist" containerID="a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.895544 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"} err="failed to get container status \"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229\": rpc error: code = NotFound desc = could not find container \"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229\": container with ID starting with a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229 not found: ID does not exist"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.895569 4786 scope.go:117] "RemoveContainer" containerID="8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"
Mar 13 15:24:17 crc kubenswrapper[4786]: E0313 15:24:17.895907 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb\": container with ID starting with 8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb not found: ID does not exist" containerID="8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.895936 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"} err="failed to get container status \"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb\": rpc error: code = NotFound desc = could not find container \"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb\": container with ID starting with 8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb not found: ID does not exist"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.895954 4786 scope.go:117] "RemoveContainer" containerID="a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.896381 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229"} err="failed to get container status \"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229\": rpc error: code = NotFound desc = could not find container \"a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229\": container with ID starting with a458de2e321adb71fce618d48324314c15f44bc41f38468f978932ecb1738229 not found: ID does not exist"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.896496 4786 scope.go:117] "RemoveContainer" containerID="8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.897692 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb"} err="failed to get container status \"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb\": rpc error: code = NotFound desc = could not find container \"8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb\": container with ID starting with 8c1a0a954f45a4d34cc202614a19ce5d44811166b3b42fca4d8a6c26e0291deb not found: ID does not exist"
Mar 13 15:24:17 crc kubenswrapper[4786]: I0313 15:24:17.897716 4786 scope.go:117] "RemoveContainer" containerID="e063f03566777526887cc68e28abb25b31c947cbed49669016b5a1af97607695"
Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.024508 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r5jfk"
Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.128480 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-combined-ca-bundle\") pod \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") "
Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.128781 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-fernet-keys\") pod \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") "
Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.129905 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-credential-keys\") pod \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") "
Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.129994 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-scripts\") pod \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") "
Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.130031 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"config-data\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-config-data\") pod \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.130116 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt296\" (UniqueName: \"kubernetes.io/projected/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-kube-api-access-vt296\") pod \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\" (UID: \"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa\") " Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.152133 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-scripts" (OuterVolumeSpecName: "scripts") pod "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" (UID: "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.166982 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" (UID: "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.179024 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" (UID: "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.179149 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-kube-api-access-vt296" (OuterVolumeSpecName: "kube-api-access-vt296") pod "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" (UID: "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa"). InnerVolumeSpecName "kube-api-access-vt296". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.182066 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-config-data" (OuterVolumeSpecName: "config-data") pod "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" (UID: "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.207013 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" (UID: "0c380742-07d9-4f6f-a8c4-3e8f3593c3fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.231618 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.231649 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.231659 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt296\" (UniqueName: \"kubernetes.io/projected/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-kube-api-access-vt296\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.231669 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.231678 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.231686 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.458449 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r5jfk" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.458950 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r5jfk" event={"ID":"0c380742-07d9-4f6f-a8c4-3e8f3593c3fa","Type":"ContainerDied","Data":"277527eb80c7b0fd06be7a7b3ab64a32a731d0dad98a5b336432d8fa260f76e4"} Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.460139 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="277527eb80c7b0fd06be7a7b3ab64a32a731d0dad98a5b336432d8fa260f76e4" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.575181 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r5jfk"] Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.584925 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r5jfk"] Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.663668 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cn57v"] Mar 13 15:24:18 crc kubenswrapper[4786]: E0313 15:24:18.664310 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" containerName="keystone-bootstrap" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.664334 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" containerName="keystone-bootstrap" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.664579 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" containerName="keystone-bootstrap" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.665336 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.667652 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.667926 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.668103 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dsq8r" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.668300 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.668647 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.672759 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cn57v"] Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.756755 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-credential-keys\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.756815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj9lz\" (UniqueName: \"kubernetes.io/projected/880a2246-5ead-4e58-b471-d6d006ee3053-kube-api-access-mj9lz\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.756878 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-combined-ca-bundle\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.758164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-scripts\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.758757 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-fernet-keys\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.759082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-config-data\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.860675 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-credential-keys\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.860720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj9lz\" (UniqueName: 
\"kubernetes.io/projected/880a2246-5ead-4e58-b471-d6d006ee3053-kube-api-access-mj9lz\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.860745 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-combined-ca-bundle\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.860790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-scripts\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.860813 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-fernet-keys\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.860894 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-config-data\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.865909 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-fernet-keys\") pod \"keystone-bootstrap-cn57v\" (UID: 
\"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.866353 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-config-data\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.867709 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-credential-keys\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.869449 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-combined-ca-bundle\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.876168 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-scripts\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 15:24:18.877276 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj9lz\" (UniqueName: \"kubernetes.io/projected/880a2246-5ead-4e58-b471-d6d006ee3053-kube-api-access-mj9lz\") pod \"keystone-bootstrap-cn57v\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:18 crc kubenswrapper[4786]: I0313 
15:24:18.993720 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:19 crc kubenswrapper[4786]: I0313 15:24:19.917854 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:19 crc kubenswrapper[4786]: I0313 15:24:19.988754 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:24:20 crc kubenswrapper[4786]: I0313 15:24:20.546127 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:24:20 crc kubenswrapper[4786]: I0313 15:24:20.567093 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c380742-07d9-4f6f-a8c4-3e8f3593c3fa" path="/var/lib/kubelet/pods/0c380742-07d9-4f6f-a8c4-3e8f3593c3fa/volumes" Mar 13 15:24:20 crc kubenswrapper[4786]: I0313 15:24:20.612834 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-hszcv"] Mar 13 15:24:20 crc kubenswrapper[4786]: I0313 15:24:20.613207 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="dnsmasq-dns" containerID="cri-o://a5ee6500159e95cd3b50a5f36867482dcb62b6e4af81c6c2bc25a4b78e2ac686" gracePeriod=10 Mar 13 15:24:21 crc kubenswrapper[4786]: I0313 15:24:21.484717 4786 generic.go:334] "Generic (PLEG): container finished" podID="39015b4e-70c0-48e9-aad2-14cc102da742" containerID="a5ee6500159e95cd3b50a5f36867482dcb62b6e4af81c6c2bc25a4b78e2ac686" exitCode=0 Mar 13 15:24:21 crc kubenswrapper[4786]: I0313 15:24:21.484818 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" event={"ID":"39015b4e-70c0-48e9-aad2-14cc102da742","Type":"ContainerDied","Data":"a5ee6500159e95cd3b50a5f36867482dcb62b6e4af81c6c2bc25a4b78e2ac686"} Mar 13 
15:24:30 crc kubenswrapper[4786]: I0313 15:24:30.623775 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: i/o timeout" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.210557 4786 scope.go:117] "RemoveContainer" containerID="5d1eb731fe90a2d7967613024c86cdc88087f71bf2c0f7ac579978edd63efcfb" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.295698 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.434117 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-dns-svc\") pod \"39015b4e-70c0-48e9-aad2-14cc102da742\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.434186 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6w87\" (UniqueName: \"kubernetes.io/projected/39015b4e-70c0-48e9-aad2-14cc102da742-kube-api-access-n6w87\") pod \"39015b4e-70c0-48e9-aad2-14cc102da742\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.434236 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-config\") pod \"39015b4e-70c0-48e9-aad2-14cc102da742\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.434387 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-nb\") pod 
\"39015b4e-70c0-48e9-aad2-14cc102da742\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.434431 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-sb\") pod \"39015b4e-70c0-48e9-aad2-14cc102da742\" (UID: \"39015b4e-70c0-48e9-aad2-14cc102da742\") " Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.440089 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39015b4e-70c0-48e9-aad2-14cc102da742-kube-api-access-n6w87" (OuterVolumeSpecName: "kube-api-access-n6w87") pod "39015b4e-70c0-48e9-aad2-14cc102da742" (UID: "39015b4e-70c0-48e9-aad2-14cc102da742"). InnerVolumeSpecName "kube-api-access-n6w87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.477431 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39015b4e-70c0-48e9-aad2-14cc102da742" (UID: "39015b4e-70c0-48e9-aad2-14cc102da742"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.490618 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-config" (OuterVolumeSpecName: "config") pod "39015b4e-70c0-48e9-aad2-14cc102da742" (UID: "39015b4e-70c0-48e9-aad2-14cc102da742"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.506463 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39015b4e-70c0-48e9-aad2-14cc102da742" (UID: "39015b4e-70c0-48e9-aad2-14cc102da742"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.513448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39015b4e-70c0-48e9-aad2-14cc102da742" (UID: "39015b4e-70c0-48e9-aad2-14cc102da742"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.536979 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.537011 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.537021 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.537032 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6w87\" (UniqueName: \"kubernetes.io/projected/39015b4e-70c0-48e9-aad2-14cc102da742-kube-api-access-n6w87\") on node \"crc\" DevicePath \"\"" Mar 13 
15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.537044 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39015b4e-70c0-48e9-aad2-14cc102da742-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.568514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" event={"ID":"39015b4e-70c0-48e9-aad2-14cc102da742","Type":"ContainerDied","Data":"aa1eebac2e81fd4b92eb0019d24ba92b1f1f5983093cecb9a98deee4250d3a08"} Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.568590 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.598869 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-hszcv"] Mar 13 15:24:32 crc kubenswrapper[4786]: I0313 15:24:32.605969 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9fd7d84c-hszcv"] Mar 13 15:24:33 crc kubenswrapper[4786]: I0313 15:24:33.336300 4786 scope.go:117] "RemoveContainer" containerID="a5ee6500159e95cd3b50a5f36867482dcb62b6e4af81c6c2bc25a4b78e2ac686" Mar 13 15:24:33 crc kubenswrapper[4786]: E0313 15:24:33.355735 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 13 15:24:33 crc kubenswrapper[4786]: E0313 15:24:33.356612 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4x57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nwdnc_openstack(fb8a98ec-82a3-418d-82ea-d0ff210dd78d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 13 15:24:33 crc kubenswrapper[4786]: E0313 15:24:33.357969 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nwdnc" podUID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d"
Mar 13 15:24:33 crc kubenswrapper[4786]: I0313 15:24:33.550913 4786 scope.go:117] "RemoveContainer" containerID="dab01b53dc911359bf134047f019339fcdd078355c6fa62d83d79ebf64888823"
Mar 13 15:24:33 crc kubenswrapper[4786]: E0313 15:24:33.608118 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-nwdnc" podUID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d"
Mar 13 15:24:33 crc kubenswrapper[4786]: I0313 15:24:33.891387 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cn57v"]
Mar 13 15:24:33 crc kubenswrapper[4786]: I0313 15:24:33.933541 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.026559 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.563322 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" path="/var/lib/kubelet/pods/39015b4e-70c0-48e9-aad2-14cc102da742/volumes"
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.615056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ft8mk" event={"ID":"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae","Type":"ContainerStarted","Data":"531d2a78c9146820b5a2df89d9ec1e84eb160d49878916f2704fc6e9e8fd25f6"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.617096 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30b88754-9006-436b-b296-be043d373a34","Type":"ContainerStarted","Data":"3f279d09e4c4b5c3d3a2ee8919d734190091b9592625025ae88089f95bfd84cf"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.623167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7tr8p" event={"ID":"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118","Type":"ContainerStarted","Data":"c56b988be8539efcb5b56a2c8c1a56c0540f0cc6b1d6619fa49940db16f80c4e"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.638646 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84776ef2-e0db-47aa-9135-e11f92c291b7","Type":"ContainerStarted","Data":"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.639299 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84776ef2-e0db-47aa-9135-e11f92c291b7","Type":"ContainerStarted","Data":"aa9e6bd6cee84547d7bbc0850357cb2a23ca807e974eac0915f99763c7fb231a"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.639218 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ft8mk" podStartSLOduration=3.534154582 podStartE2EDuration="25.639198398s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="2026-03-13 15:24:11.203451088 +0000 UTC m=+1281.366662899" lastFinishedPulling="2026-03-13 15:24:33.308494914 +0000 UTC m=+1303.471706715" observedRunningTime="2026-03-13 15:24:34.633878425 +0000 UTC m=+1304.797090236" watchObservedRunningTime="2026-03-13 15:24:34.639198398 +0000 UTC m=+1304.802410209"
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.644075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerStarted","Data":"34a3de5ae0339bade5ea43cf68217e18b65117c08c10bedbf883a4d025fffaba"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.656135 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cn57v" event={"ID":"880a2246-5ead-4e58-b471-d6d006ee3053","Type":"ContainerStarted","Data":"127103e9eac9cb6727d347c51ea0538ede6898b83fbb9348b73c4b1c3cb88cb9"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.656179 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cn57v" event={"ID":"880a2246-5ead-4e58-b471-d6d006ee3053","Type":"ContainerStarted","Data":"24bdfef5211b9b7989f872e8db095c9f75d56bbf499b3df2e979f98ad09aab5f"}
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.656689 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7tr8p" podStartSLOduration=3.082143881 podStartE2EDuration="25.656670437s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="2026-03-13 15:24:10.767114021 +0000 UTC m=+1280.930325832" lastFinishedPulling="2026-03-13 15:24:33.341640577 +0000 UTC m=+1303.504852388" observedRunningTime="2026-03-13 15:24:34.651806915 +0000 UTC m=+1304.815018726" watchObservedRunningTime="2026-03-13 15:24:34.656670437 +0000 UTC m=+1304.819882248"
Mar 13 15:24:34 crc kubenswrapper[4786]: I0313 15:24:34.684110 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cn57v" podStartSLOduration=16.684093327 podStartE2EDuration="16.684093327s" podCreationTimestamp="2026-03-13 15:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:34.67745962 +0000 UTC m=+1304.840671441" watchObservedRunningTime="2026-03-13 15:24:34.684093327 +0000 UTC m=+1304.847305138"
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.625529 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b9fd7d84c-hszcv" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: i/o timeout"
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.665393 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30b88754-9006-436b-b296-be043d373a34","Type":"ContainerStarted","Data":"557f9aa1c90b5a13bd1dc319a865a36bc8988547c99faf8329292c9df6a33e89"}
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.665444 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30b88754-9006-436b-b296-be043d373a34","Type":"ContainerStarted","Data":"ccf3a42e02455a4c5c2d40fc0dffe9dd73180cb033621483c1f76aaa02773c01"}
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.665536 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-log" containerID="cri-o://557f9aa1c90b5a13bd1dc319a865a36bc8988547c99faf8329292c9df6a33e89" gracePeriod=30
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.665725 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-httpd" containerID="cri-o://ccf3a42e02455a4c5c2d40fc0dffe9dd73180cb033621483c1f76aaa02773c01" gracePeriod=30
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.675005 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84776ef2-e0db-47aa-9135-e11f92c291b7","Type":"ContainerStarted","Data":"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"}
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.675186 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-log" containerID="cri-o://24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2" gracePeriod=30
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.675309 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-httpd" containerID="cri-o://5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2" gracePeriod=30
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.682722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerStarted","Data":"39e39b70e9ae5a6af51f20dbd41d61866f1bf24a116705d8f43525abe21e7766"}
Mar 13 15:24:35 crc kubenswrapper[4786]: I0313 15:24:35.692171 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.692143613 podStartE2EDuration="20.692143613s" podCreationTimestamp="2026-03-13 15:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:35.686703476 +0000 UTC m=+1305.849915287" watchObservedRunningTime="2026-03-13 15:24:35.692143613 +0000 UTC m=+1305.855355424"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.412704 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518462 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-httpd-run\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518499 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-scripts\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518530 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-logs\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518547 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518566 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-combined-ca-bundle\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518604 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-config-data\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.518627 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzj2v\" (UniqueName: \"kubernetes.io/projected/84776ef2-e0db-47aa-9135-e11f92c291b7-kube-api-access-rzj2v\") pod \"84776ef2-e0db-47aa-9135-e11f92c291b7\" (UID: \"84776ef2-e0db-47aa-9135-e11f92c291b7\") "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.519840 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-logs" (OuterVolumeSpecName: "logs") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.520195 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.528048 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.550259 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-scripts" (OuterVolumeSpecName: "scripts") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.550429 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84776ef2-e0db-47aa-9135-e11f92c291b7-kube-api-access-rzj2v" (OuterVolumeSpecName: "kube-api-access-rzj2v") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "kube-api-access-rzj2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.564355 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.605043 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-config-data" (OuterVolumeSpecName: "config-data") pod "84776ef2-e0db-47aa-9135-e11f92c291b7" (UID: "84776ef2-e0db-47aa-9135-e11f92c291b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621648 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621684 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621695 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84776ef2-e0db-47aa-9135-e11f92c291b7-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621734 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621746 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621758 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84776ef2-e0db-47aa-9135-e11f92c291b7-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.621768 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzj2v\" (UniqueName: \"kubernetes.io/projected/84776ef2-e0db-47aa-9135-e11f92c291b7-kube-api-access-rzj2v\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.650533 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.699275 4786 generic.go:334] "Generic (PLEG): container finished" podID="30b88754-9006-436b-b296-be043d373a34" containerID="ccf3a42e02455a4c5c2d40fc0dffe9dd73180cb033621483c1f76aaa02773c01" exitCode=0
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.699316 4786 generic.go:334] "Generic (PLEG): container finished" podID="30b88754-9006-436b-b296-be043d373a34" containerID="557f9aa1c90b5a13bd1dc319a865a36bc8988547c99faf8329292c9df6a33e89" exitCode=143
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.699381 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30b88754-9006-436b-b296-be043d373a34","Type":"ContainerDied","Data":"ccf3a42e02455a4c5c2d40fc0dffe9dd73180cb033621483c1f76aaa02773c01"}
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.699431 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30b88754-9006-436b-b296-be043d373a34","Type":"ContainerDied","Data":"557f9aa1c90b5a13bd1dc319a865a36bc8988547c99faf8329292c9df6a33e89"}
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703368 4786 generic.go:334] "Generic (PLEG): container finished" podID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerID="5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2" exitCode=0
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703392 4786 generic.go:334] "Generic (PLEG): container finished" podID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerID="24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2" exitCode=143
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703409 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84776ef2-e0db-47aa-9135-e11f92c291b7","Type":"ContainerDied","Data":"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"}
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703430 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84776ef2-e0db-47aa-9135-e11f92c291b7","Type":"ContainerDied","Data":"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"}
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84776ef2-e0db-47aa-9135-e11f92c291b7","Type":"ContainerDied","Data":"aa9e6bd6cee84547d7bbc0850357cb2a23ca807e974eac0915f99763c7fb231a"}
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703457 4786 scope.go:117] "RemoveContainer" containerID="5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.703586 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.725625 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.753275 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.772173 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.781577 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:36 crc kubenswrapper[4786]: E0313 15:24:36.782035 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-log"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782050 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-log"
Mar 13 15:24:36 crc kubenswrapper[4786]: E0313 15:24:36.782069 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="dnsmasq-dns"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782075 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="dnsmasq-dns"
Mar 13 15:24:36 crc kubenswrapper[4786]: E0313 15:24:36.782085 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="init"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782091 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="init"
Mar 13 15:24:36 crc kubenswrapper[4786]: E0313 15:24:36.782101 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-httpd"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782106 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-httpd"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782262 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-log"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782273 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" containerName="glance-httpd"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.782288 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="39015b4e-70c0-48e9-aad2-14cc102da742" containerName="dnsmasq-dns"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.783198 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.786284 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.786512 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.820286 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.826897 4786 scope.go:117] "RemoveContainer" containerID="24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.854289 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.870877 4786 scope.go:117] "RemoveContainer" containerID="5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"
Mar 13 15:24:36 crc kubenswrapper[4786]: E0313 15:24:36.871422 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2\": container with ID starting with 5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2 not found: ID does not exist" containerID="5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.871457 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"} err="failed to get container status \"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2\": rpc error: code = NotFound desc = could not find container \"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2\": container with ID starting with 5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2 not found: ID does not exist"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.871476 4786 scope.go:117] "RemoveContainer" containerID="24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"
Mar 13 15:24:36 crc kubenswrapper[4786]: E0313 15:24:36.871804 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2\": container with ID starting with 24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2 not found: ID does not exist" containerID="24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.871822 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"} err="failed to get container status \"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2\": rpc error: code = NotFound desc = could not find container \"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2\": container with ID starting with 24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2 not found: ID does not exist"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.871834 4786 scope.go:117] "RemoveContainer" containerID="5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.872221 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2"} err="failed to get container status \"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2\": rpc error: code = NotFound desc = could not find container \"5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2\": container with ID starting with 5b48eda2f3a9f17eb9ba088396aaecdea8bf9c9028069abcda6d1dcbdff5f2e2 not found: ID does not exist"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.872235 4786 scope.go:117] "RemoveContainer" containerID="24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.872375 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2"} err="failed to get container status \"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2\": rpc error: code = NotFound desc = could not find container \"24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2\": container with ID starting with 24863ef8948bcc1b7b0b0fae987334c663343c8570eb9ca2197200f943d097a2 not found: ID does not exist"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.936792 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.936867 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.936899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.937018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kl2l\" (UniqueName: \"kubernetes.io/projected/24def402-fa10-4192-a42c-fb38e387247c-kube-api-access-6kl2l\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.937145 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.937189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.937218 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-logs\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:36 crc kubenswrapper[4786]: I0313 15:24:36.937298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.038658 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-scripts\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.038757 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-config-data\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.038936 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-combined-ca-bundle\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039075 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-logs\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039101 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g7kf\" (UniqueName: \"kubernetes.io/projected/30b88754-9006-436b-b296-be043d373a34-kube-api-access-8g7kf\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039172 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-httpd-run\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039237 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"30b88754-9006-436b-b296-be043d373a34\" (UID: \"30b88754-9006-436b-b296-be043d373a34\") "
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039556 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039607 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-logs\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039636 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-logs" (OuterVolumeSpecName: "logs") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039657 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039920 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.039967 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.040125 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kl2l\" (UniqueName: \"kubernetes.io/projected/24def402-fa10-4192-a42c-fb38e387247c-kube-api-access-6kl2l\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.040173 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-logs\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.040386 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.040401 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30b88754-9006-436b-b296-be043d373a34-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.040494 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.043189 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-scripts" (OuterVolumeSpecName: "scripts") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.043677 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34").
InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.044069 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.044192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.045794 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.047089 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.048223 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 
15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.055901 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b88754-9006-436b-b296-be043d373a34-kube-api-access-8g7kf" (OuterVolumeSpecName: "kube-api-access-8g7kf") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34"). InnerVolumeSpecName "kube-api-access-8g7kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.064758 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kl2l\" (UniqueName: \"kubernetes.io/projected/24def402-fa10-4192-a42c-fb38e387247c-kube-api-access-6kl2l\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.071795 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.080151 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.089107 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-config-data" (OuterVolumeSpecName: "config-data") pod "30b88754-9006-436b-b296-be043d373a34" (UID: "30b88754-9006-436b-b296-be043d373a34"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.125080 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.142968 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.143005 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.143020 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b88754-9006-436b-b296-be043d373a34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.143035 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g7kf\" (UniqueName: \"kubernetes.io/projected/30b88754-9006-436b-b296-be043d373a34-kube-api-access-8g7kf\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.143150 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.160950 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.246615 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.640775 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.713779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30b88754-9006-436b-b296-be043d373a34","Type":"ContainerDied","Data":"3f279d09e4c4b5c3d3a2ee8919d734190091b9592625025ae88089f95bfd84cf"} Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.713826 4786 scope.go:117] "RemoveContainer" containerID="ccf3a42e02455a4c5c2d40fc0dffe9dd73180cb033621483c1f76aaa02773c01" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.713879 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.717289 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24def402-fa10-4192-a42c-fb38e387247c","Type":"ContainerStarted","Data":"2f8d27bcaec4964450f42a81d712977e0da415a522514d51baee575f33adb275"} Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.722813 4786 generic.go:334] "Generic (PLEG): container finished" podID="880a2246-5ead-4e58-b471-d6d006ee3053" containerID="127103e9eac9cb6727d347c51ea0538ede6898b83fbb9348b73c4b1c3cb88cb9" exitCode=0 Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.722885 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cn57v" event={"ID":"880a2246-5ead-4e58-b471-d6d006ee3053","Type":"ContainerDied","Data":"127103e9eac9cb6727d347c51ea0538ede6898b83fbb9348b73c4b1c3cb88cb9"} Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.725582 4786 generic.go:334] "Generic (PLEG): container finished" podID="b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" 
containerID="531d2a78c9146820b5a2df89d9ec1e84eb160d49878916f2704fc6e9e8fd25f6" exitCode=0 Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.725623 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ft8mk" event={"ID":"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae","Type":"ContainerDied","Data":"531d2a78c9146820b5a2df89d9ec1e84eb160d49878916f2704fc6e9e8fd25f6"} Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.753558 4786 scope.go:117] "RemoveContainer" containerID="557f9aa1c90b5a13bd1dc319a865a36bc8988547c99faf8329292c9df6a33e89" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.793698 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.811982 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.829468 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:37 crc kubenswrapper[4786]: E0313 15:24:37.829888 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-log" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.829903 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-log" Mar 13 15:24:37 crc kubenswrapper[4786]: E0313 15:24:37.829916 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-httpd" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.829923 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-httpd" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.830142 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-httpd" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.830174 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b88754-9006-436b-b296-be043d373a34" containerName="glance-log" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.831242 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.836416 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.836665 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.837940 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.868526 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.868589 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959190 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959258 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-scripts\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-logs\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959366 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959560 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zdc\" (UniqueName: \"kubernetes.io/projected/eeb9deff-99f7-4425-a84c-a520d67430ed-kube-api-access-r8zdc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:37 crc kubenswrapper[4786]: I0313 15:24:37.959671 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-config-data\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.061454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-scripts\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.061524 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-logs\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.061571 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-public-tls-certs\") pod \"glance-default-external-api-0\" 
(UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.061981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-logs\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.061590 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.062056 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.062081 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8zdc\" (UniqueName: \"kubernetes.io/projected/eeb9deff-99f7-4425-a84c-a520d67430ed-kube-api-access-r8zdc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.062154 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.062684 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-config-data\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.062974 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.063847 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.067520 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-scripts\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.067981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-config-data\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.068752 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.080526 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.096560 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zdc\" (UniqueName: \"kubernetes.io/projected/eeb9deff-99f7-4425-a84c-a520d67430ed-kube-api-access-r8zdc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.099585 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.159291 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.567890 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b88754-9006-436b-b296-be043d373a34" path="/var/lib/kubelet/pods/30b88754-9006-436b-b296-be043d373a34/volumes" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.569261 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84776ef2-e0db-47aa-9135-e11f92c291b7" path="/var/lib/kubelet/pods/84776ef2-e0db-47aa-9135-e11f92c291b7/volumes" Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.727697 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.738600 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24def402-fa10-4192-a42c-fb38e387247c","Type":"ContainerStarted","Data":"df580d52b6d0ead148d878f686b085d8186767fa428863bb7af2712c22cea3ab"} Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.741095 4786 generic.go:334] "Generic (PLEG): container finished" podID="8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" containerID="c56b988be8539efcb5b56a2c8c1a56c0540f0cc6b1d6619fa49940db16f80c4e" exitCode=0 Mar 13 15:24:38 crc kubenswrapper[4786]: I0313 15:24:38.741233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7tr8p" event={"ID":"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118","Type":"ContainerDied","Data":"c56b988be8539efcb5b56a2c8c1a56c0540f0cc6b1d6619fa49940db16f80c4e"} Mar 13 15:24:39 crc kubenswrapper[4786]: I0313 15:24:39.754234 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24def402-fa10-4192-a42c-fb38e387247c","Type":"ContainerStarted","Data":"093d9f552c30bdec922a9a62149d9ced8460871a61712767fba2e030b08a33c1"} Mar 13 15:24:39 crc kubenswrapper[4786]: I0313 15:24:39.790760 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.790739735 podStartE2EDuration="3.790739735s" podCreationTimestamp="2026-03-13 15:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:39.779845041 +0000 UTC m=+1309.943056852" watchObservedRunningTime="2026-03-13 15:24:39.790739735 +0000 UTC m=+1309.953951546" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.137304 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.144780 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.156003 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.230845 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-scripts\") pod \"880a2246-5ead-4e58-b471-d6d006ee3053\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.230923 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmxkk\" (UniqueName: \"kubernetes.io/projected/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-kube-api-access-qmxkk\") pod \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.230974 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-config-data\") pod \"880a2246-5ead-4e58-b471-d6d006ee3053\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231024 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-fernet-keys\") pod \"880a2246-5ead-4e58-b471-d6d006ee3053\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231046 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-combined-ca-bundle\") pod \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231794 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-combined-ca-bundle\") pod \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231876 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-config-data\") pod \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231912 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj9lz\" (UniqueName: \"kubernetes.io/projected/880a2246-5ead-4e58-b471-d6d006ee3053-kube-api-access-mj9lz\") pod \"880a2246-5ead-4e58-b471-d6d006ee3053\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 
15:24:41.231940 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-credential-keys\") pod \"880a2246-5ead-4e58-b471-d6d006ee3053\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231963 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-db-sync-config-data\") pod \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\" (UID: \"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.231993 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d96hw\" (UniqueName: \"kubernetes.io/projected/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-kube-api-access-d96hw\") pod \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.232021 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-scripts\") pod \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.232048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-combined-ca-bundle\") pod \"880a2246-5ead-4e58-b471-d6d006ee3053\" (UID: \"880a2246-5ead-4e58-b471-d6d006ee3053\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.232080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-logs\") pod 
\"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\" (UID: \"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae\") " Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.232970 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-logs" (OuterVolumeSpecName: "logs") pod "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" (UID: "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.242798 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-scripts" (OuterVolumeSpecName: "scripts") pod "880a2246-5ead-4e58-b471-d6d006ee3053" (UID: "880a2246-5ead-4e58-b471-d6d006ee3053"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.245647 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "880a2246-5ead-4e58-b471-d6d006ee3053" (UID: "880a2246-5ead-4e58-b471-d6d006ee3053"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.255543 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-kube-api-access-qmxkk" (OuterVolumeSpecName: "kube-api-access-qmxkk") pod "8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" (UID: "8e4376c9-9b7f-4c5b-b2bc-91f1390dc118"). InnerVolumeSpecName "kube-api-access-qmxkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.266978 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "880a2246-5ead-4e58-b471-d6d006ee3053" (UID: "880a2246-5ead-4e58-b471-d6d006ee3053"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.267991 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880a2246-5ead-4e58-b471-d6d006ee3053-kube-api-access-mj9lz" (OuterVolumeSpecName: "kube-api-access-mj9lz") pod "880a2246-5ead-4e58-b471-d6d006ee3053" (UID: "880a2246-5ead-4e58-b471-d6d006ee3053"). InnerVolumeSpecName "kube-api-access-mj9lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.280159 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-kube-api-access-d96hw" (OuterVolumeSpecName: "kube-api-access-d96hw") pod "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" (UID: "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae"). InnerVolumeSpecName "kube-api-access-d96hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.283035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" (UID: "8e4376c9-9b7f-4c5b-b2bc-91f1390dc118"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.291052 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-scripts" (OuterVolumeSpecName: "scripts") pod "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" (UID: "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.295611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-config-data" (OuterVolumeSpecName: "config-data") pod "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" (UID: "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.304194 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" (UID: "8e4376c9-9b7f-4c5b-b2bc-91f1390dc118"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.310515 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" (UID: "b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.314011 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "880a2246-5ead-4e58-b471-d6d006ee3053" (UID: "880a2246-5ead-4e58-b471-d6d006ee3053"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.314102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-config-data" (OuterVolumeSpecName: "config-data") pod "880a2246-5ead-4e58-b471-d6d006ee3053" (UID: "880a2246-5ead-4e58-b471-d6d006ee3053"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333764 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333805 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333821 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj9lz\" (UniqueName: \"kubernetes.io/projected/880a2246-5ead-4e58-b471-d6d006ee3053-kube-api-access-mj9lz\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333835 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-credential-keys\") on node \"crc\" 
DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333847 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333875 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d96hw\" (UniqueName: \"kubernetes.io/projected/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-kube-api-access-d96hw\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333887 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333898 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333908 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333919 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333930 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmxkk\" (UniqueName: \"kubernetes.io/projected/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-kube-api-access-qmxkk\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333943 4786 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333954 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.333964 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/880a2246-5ead-4e58-b471-d6d006ee3053-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.770312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7tr8p" event={"ID":"8e4376c9-9b7f-4c5b-b2bc-91f1390dc118","Type":"ContainerDied","Data":"800f75f62917be4211d5d60a245f29539c7d9f91f590687bd6c07ff1753ca4c7"} Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.770736 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800f75f62917be4211d5d60a245f29539c7d9f91f590687bd6c07ff1753ca4c7" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.770358 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7tr8p" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.771765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eeb9deff-99f7-4425-a84c-a520d67430ed","Type":"ContainerStarted","Data":"01124ab6c992a23e424e39162e51555f9ad606ca75fce6a2ce1d8b36aa46df44"} Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.773182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ft8mk" event={"ID":"b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae","Type":"ContainerDied","Data":"47f58c4d0f9305cea33b36ff5a5a0b8436f38a2e30991bc881d7cc465e0e6aa8"} Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.773206 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47f58c4d0f9305cea33b36ff5a5a0b8436f38a2e30991bc881d7cc465e0e6aa8" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.773366 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ft8mk" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.774729 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cn57v" event={"ID":"880a2246-5ead-4e58-b471-d6d006ee3053","Type":"ContainerDied","Data":"24bdfef5211b9b7989f872e8db095c9f75d56bbf499b3df2e979f98ad09aab5f"} Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.774752 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24bdfef5211b9b7989f872e8db095c9f75d56bbf499b3df2e979f98ad09aab5f" Mar 13 15:24:41 crc kubenswrapper[4786]: I0313 15:24:41.774782 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cn57v" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.370563 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78cf8bc498-pb2xv"] Mar 13 15:24:42 crc kubenswrapper[4786]: E0313 15:24:42.370996 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880a2246-5ead-4e58-b471-d6d006ee3053" containerName="keystone-bootstrap" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.371013 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="880a2246-5ead-4e58-b471-d6d006ee3053" containerName="keystone-bootstrap" Mar 13 15:24:42 crc kubenswrapper[4786]: E0313 15:24:42.371033 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" containerName="barbican-db-sync" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.371041 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" containerName="barbican-db-sync" Mar 13 15:24:42 crc kubenswrapper[4786]: E0313 15:24:42.371049 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" containerName="placement-db-sync" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.371057 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" containerName="placement-db-sync" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.371262 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="880a2246-5ead-4e58-b471-d6d006ee3053" containerName="keystone-bootstrap" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.371279 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" containerName="placement-db-sync" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.371306 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" containerName="barbican-db-sync" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.372308 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.376424 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ltrgl" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.376574 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.376601 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.376753 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.376903 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.386804 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bd8dddbcb-wmbct"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.388311 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.396377 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.396604 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.396739 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.396878 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dsq8r" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.397047 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.397149 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.408059 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78cf8bc498-pb2xv"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.418442 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd8dddbcb-wmbct"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452103 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-credential-keys\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452388 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-scripts\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452455 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-internal-tls-certs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452531 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-public-tls-certs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5td5\" (UniqueName: \"kubernetes.io/projected/9f459571-980e-439d-9dc2-72c0461a20c9-kube-api-access-q5td5\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-config-data\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452746 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-combined-ca-bundle\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452818 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-fernet-keys\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452901 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-scripts\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.452975 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-internal-tls-certs\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.453044 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-combined-ca-bundle\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.453155 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52da1f88-a7fd-4c07-9db5-6651531c94a2-logs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.453314 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppv5q\" (UniqueName: \"kubernetes.io/projected/52da1f88-a7fd-4c07-9db5-6651531c94a2-kube-api-access-ppv5q\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.453352 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-config-data\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.453369 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-public-tls-certs\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.523793 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-679bcd6995-7xnv6"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.538782 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5cd697874d-c2gtp"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.544710 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.538980 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.549087 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.549376 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n7hpw" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.549506 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.554763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52da1f88-a7fd-4c07-9db5-6651531c94a2-logs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555000 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppv5q\" (UniqueName: \"kubernetes.io/projected/52da1f88-a7fd-4c07-9db5-6651531c94a2-kube-api-access-ppv5q\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555093 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-config-data\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555157 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-public-tls-certs\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-credential-keys\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555519 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-scripts\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555581 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-internal-tls-certs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-public-tls-certs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555712 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q5td5\" (UniqueName: \"kubernetes.io/projected/9f459571-980e-439d-9dc2-72c0461a20c9-kube-api-access-q5td5\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555776 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-config-data\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555842 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-combined-ca-bundle\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.555938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-fernet-keys\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.557305 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-scripts\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.557410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-internal-tls-certs\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.557517 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-combined-ca-bundle\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.559300 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.560404 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52da1f88-a7fd-4c07-9db5-6651531c94a2-logs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.562222 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-scripts\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.568553 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-scripts\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.569663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-config-data\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.569750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-internal-tls-certs\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.570067 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-combined-ca-bundle\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.570506 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-config-data\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.571189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-combined-ca-bundle\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.572006 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-credential-keys\") pod 
\"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.578404 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-fernet-keys\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.588282 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-679bcd6995-7xnv6"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.591543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-internal-tls-certs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.592055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-public-tls-certs\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.592453 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-public-tls-certs\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.597735 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5td5\" (UniqueName: 
\"kubernetes.io/projected/9f459571-980e-439d-9dc2-72c0461a20c9-kube-api-access-q5td5\") pod \"keystone-7bd8dddbcb-wmbct\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") " pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.598609 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5cd697874d-c2gtp"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.599662 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppv5q\" (UniqueName: \"kubernetes.io/projected/52da1f88-a7fd-4c07-9db5-6651531c94a2-kube-api-access-ppv5q\") pod \"placement-78cf8bc498-pb2xv\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.675726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676078 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdx8f\" (UniqueName: \"kubernetes.io/projected/dea0a108-dc5a-4700-a956-563674797beb-kube-api-access-pdx8f\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-combined-ca-bundle\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " 
pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676141 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea0a108-dc5a-4700-a956-563674797beb-logs\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676168 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwl5h\" (UniqueName: \"kubernetes.io/projected/1063e48c-fed7-49b9-89f2-186b4627caea-kube-api-access-bwl5h\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676219 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-combined-ca-bundle\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676261 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data-custom\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676279 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676296 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1063e48c-fed7-49b9-89f2-186b4627caea-logs\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.676321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data-custom\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.704829 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.718719 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.737434 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79fd7f986f-ff2xn"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.738927 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.754515 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fd7f986f-ff2xn"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.777969 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data-custom\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778163 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1063e48c-fed7-49b9-89f2-186b4627caea-logs\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data-custom\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778392 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778498 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-svc\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778579 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdx8f\" (UniqueName: \"kubernetes.io/projected/dea0a108-dc5a-4700-a956-563674797beb-kube-api-access-pdx8f\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778649 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-swift-storage-0\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778722 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-nb\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778795 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-combined-ca-bundle\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778893 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea0a108-dc5a-4700-a956-563674797beb-logs\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.778963 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-config\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.779031 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwl5h\" (UniqueName: \"kubernetes.io/projected/1063e48c-fed7-49b9-89f2-186b4627caea-kube-api-access-bwl5h\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.779111 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-combined-ca-bundle\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.779176 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcwk5\" (UniqueName: \"kubernetes.io/projected/460f99f9-139a-4c17-8ee1-6beaecb83af5-kube-api-access-zcwk5\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.779252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-sb\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.799105 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea0a108-dc5a-4700-a956-563674797beb-logs\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.801196 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.801712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data-custom\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: 
I0313 15:24:42.801713 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.803653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1063e48c-fed7-49b9-89f2-186b4627caea-logs\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.806519 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data-custom\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.812437 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-combined-ca-bundle\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.828446 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-combined-ca-bundle\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.844290 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwl5h\" (UniqueName: \"kubernetes.io/projected/1063e48c-fed7-49b9-89f2-186b4627caea-kube-api-access-bwl5h\") pod \"barbican-keystone-listener-5cd697874d-c2gtp\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.847804 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdx8f\" (UniqueName: \"kubernetes.io/projected/dea0a108-dc5a-4700-a956-563674797beb-kube-api-access-pdx8f\") pod \"barbican-worker-679bcd6995-7xnv6\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") " pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.873928 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7579f6547f-hnpzx"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.875522 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.882264 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-svc\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.882318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-swift-storage-0\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.882344 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-nb\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.882378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-config\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.882406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcwk5\" (UniqueName: \"kubernetes.io/projected/460f99f9-139a-4c17-8ee1-6beaecb83af5-kube-api-access-zcwk5\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " 
pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.882429 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-sb\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.883219 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-sb\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.883707 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-svc\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.884224 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-swift-storage-0\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.884719 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-nb\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: 
I0313 15:24:42.885959 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-config\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.908844 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.910253 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.932885 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcwk5\" (UniqueName: \"kubernetes.io/projected/460f99f9-139a-4c17-8ee1-6beaecb83af5-kube-api-access-zcwk5\") pod \"dnsmasq-dns-79fd7f986f-ff2xn\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") " pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.966911 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d5545f5f4-6p7nq"] Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.968321 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:42 crc kubenswrapper[4786]: I0313 15:24:42.997220 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.998620 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7579f6547f-hnpzx"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70dc1403-e7e9-4200-9a87-e3538a17c350-logs\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-combined-ca-bundle\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-combined-ca-bundle\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: 
\"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31158646-2c0c-4098-bd3e-ea307fa78716-logs\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999682 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999733 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfn6l\" (UniqueName: \"kubernetes.io/projected/70dc1403-e7e9-4200-9a87-e3538a17c350-kube-api-access-kfn6l\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999759 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data-custom\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999780 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data-custom\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:42.999821 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xnb7\" (UniqueName: \"kubernetes.io/projected/31158646-2c0c-4098-bd3e-ea307fa78716-kube-api-access-9xnb7\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.006099 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.006706 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d5545f5f4-6p7nq"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.015651 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.038452 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.057091 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-865786b7bb-9cnjb"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.058376 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.074464 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-865786b7bb-9cnjb"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.074926 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.133775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.133872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70dc1403-e7e9-4200-9a87-e3538a17c350-logs\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.133947 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-combined-ca-bundle\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.133995 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-combined-ca-bundle\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 
13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134049 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-combined-ca-bundle\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31158646-2c0c-4098-bd3e-ea307fa78716-logs\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134112 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134172 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfn6l\" (UniqueName: \"kubernetes.io/projected/70dc1403-e7e9-4200-9a87-e3538a17c350-kube-api-access-kfn6l\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134208 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data-custom\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " 
pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134235 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data-custom\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134266 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwslc\" (UniqueName: \"kubernetes.io/projected/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-kube-api-access-gwslc\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134299 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-logs\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134336 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xnb7\" (UniqueName: \"kubernetes.io/projected/31158646-2c0c-4098-bd3e-ea307fa78716-kube-api-access-9xnb7\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" 
(UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.134414 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data-custom\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.141590 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70dc1403-e7e9-4200-9a87-e3538a17c350-logs\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.150364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31158646-2c0c-4098-bd3e-ea307fa78716-logs\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.165212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-combined-ca-bundle\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.173336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: 
\"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.197541 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.211898 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-combined-ca-bundle\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.214710 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfn6l\" (UniqueName: \"kubernetes.io/projected/70dc1403-e7e9-4200-9a87-e3538a17c350-kube-api-access-kfn6l\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.235030 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data-custom\") pod \"barbican-worker-7579f6547f-hnpzx\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.236792 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-combined-ca-bundle\") pod 
\"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.236925 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-combined-ca-bundle\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237000 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt99p\" (UniqueName: \"kubernetes.io/projected/826794b9-41ec-4cab-bc85-d426d8e2a38b-kube-api-access-zt99p\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-config-data\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237183 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/826794b9-41ec-4cab-bc85-d426d8e2a38b-logs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237271 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-public-tls-certs\") pod 
\"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237360 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-scripts\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwslc\" (UniqueName: \"kubernetes.io/projected/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-kube-api-access-gwslc\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237515 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-logs\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237609 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-internal-tls-certs\") pod \"placement-865786b7bb-9cnjb\" (UID: 
\"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.237794 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data-custom\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.238987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-logs\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.245188 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data-custom\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.252058 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data-custom\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.270807 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-combined-ca-bundle\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " 
pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.272459 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xnb7\" (UniqueName: \"kubernetes.io/projected/31158646-2c0c-4098-bd3e-ea307fa78716-kube-api-access-9xnb7\") pod \"barbican-keystone-listener-57d8bd5bb-fsm9r\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") " pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.290077 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.301281 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwslc\" (UniqueName: \"kubernetes.io/projected/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-kube-api-access-gwslc\") pod \"barbican-api-7d5545f5f4-6p7nq\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") " pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.331628 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b6bd986b8-jdtlc"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.349111 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.349133 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b6bd986b8-jdtlc"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.351815 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/826794b9-41ec-4cab-bc85-d426d8e2a38b-logs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.351966 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-public-tls-certs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.352038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-scripts\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.352099 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-internal-tls-certs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.352210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-combined-ca-bundle\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.352256 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt99p\" (UniqueName: \"kubernetes.io/projected/826794b9-41ec-4cab-bc85-d426d8e2a38b-kube-api-access-zt99p\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.352332 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-config-data\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.353266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/826794b9-41ec-4cab-bc85-d426d8e2a38b-logs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.358822 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-public-tls-certs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.366029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-config-data\") pod \"placement-865786b7bb-9cnjb\" 
(UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.366386 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-scripts\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.367708 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-internal-tls-certs\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.368847 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-combined-ca-bundle\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.399816 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.426119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt99p\" (UniqueName: \"kubernetes.io/projected/826794b9-41ec-4cab-bc85-d426d8e2a38b-kube-api-access-zt99p\") pod \"placement-865786b7bb-9cnjb\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.453757 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data-custom\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.453844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq2ds\" (UniqueName: \"kubernetes.io/projected/50f3dfe4-74a4-4d75-83a2-0109a8dda909-kube-api-access-fq2ds\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.454128 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-combined-ca-bundle\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.458340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f3dfe4-74a4-4d75-83a2-0109a8dda909-logs\") pod 
\"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.458376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.515007 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7579f6547f-hnpzx" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.556606 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.560578 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2ds\" (UniqueName: \"kubernetes.io/projected/50f3dfe4-74a4-4d75-83a2-0109a8dda909-kube-api-access-fq2ds\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.560626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-combined-ca-bundle\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.560665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f3dfe4-74a4-4d75-83a2-0109a8dda909-logs\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: 
\"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.560682 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.560767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data-custom\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.561784 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f3dfe4-74a4-4d75-83a2-0109a8dda909-logs\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.567212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-combined-ca-bundle\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.569610 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data-custom\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " 
pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.572267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.594300 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd8dddbcb-wmbct"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.594750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq2ds\" (UniqueName: \"kubernetes.io/projected/50f3dfe4-74a4-4d75-83a2-0109a8dda909-kube-api-access-fq2ds\") pod \"barbican-api-6b6bd986b8-jdtlc\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.628043 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.701945 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.745102 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78cf8bc498-pb2xv"] Mar 13 15:24:43 crc kubenswrapper[4786]: W0313 15:24:43.770222 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52da1f88_a7fd_4c07_9db5_6651531c94a2.slice/crio-5ad552dae743cb431681da6d6371026673f4c330d48ac7da71d3966f92d025a8 WatchSource:0}: Error finding container 5ad552dae743cb431681da6d6371026673f4c330d48ac7da71d3966f92d025a8: Status 404 returned error can't find the container with id 5ad552dae743cb431681da6d6371026673f4c330d48ac7da71d3966f92d025a8 Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.827649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eeb9deff-99f7-4425-a84c-a520d67430ed","Type":"ContainerStarted","Data":"7bfe1d57d354079630c40a517042fed84f22086574bff47d637020a997e365c0"} Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.848339 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cf8bc498-pb2xv" event={"ID":"52da1f88-a7fd-4c07-9db5-6651531c94a2","Type":"ContainerStarted","Data":"5ad552dae743cb431681da6d6371026673f4c330d48ac7da71d3966f92d025a8"} Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.862912 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd8dddbcb-wmbct" event={"ID":"9f459571-980e-439d-9dc2-72c0461a20c9","Type":"ContainerStarted","Data":"cfa37fbb41c94d30c55519c33ef449409328ab30bb5ccdb5d0c677e069ab5d33"} Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.873329 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5cd697874d-c2gtp"] Mar 13 15:24:43 crc kubenswrapper[4786]: I0313 15:24:43.984014 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-worker-679bcd6995-7xnv6"] Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.096942 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d5545f5f4-6p7nq"] Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.105895 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fd7f986f-ff2xn"] Mar 13 15:24:44 crc kubenswrapper[4786]: W0313 15:24:44.106512 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod460f99f9_139a_4c17_8ee1_6beaecb83af5.slice/crio-56744cd66a325bc4406e5e297ceffae2cb6b01e26bf8025f092cfd7ea4495602 WatchSource:0}: Error finding container 56744cd66a325bc4406e5e297ceffae2cb6b01e26bf8025f092cfd7ea4495602: Status 404 returned error can't find the container with id 56744cd66a325bc4406e5e297ceffae2cb6b01e26bf8025f092cfd7ea4495602 Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.301652 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"] Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.314550 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7579f6547f-hnpzx"] Mar 13 15:24:44 crc kubenswrapper[4786]: W0313 15:24:44.381394 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31158646_2c0c_4098_bd3e_ea307fa78716.slice/crio-1627c51a27d60b2c85430bf40f8807202c1ec2d2b8cc26ed99cdfaf0fb58beed WatchSource:0}: Error finding container 1627c51a27d60b2c85430bf40f8807202c1ec2d2b8cc26ed99cdfaf0fb58beed: Status 404 returned error can't find the container with id 1627c51a27d60b2c85430bf40f8807202c1ec2d2b8cc26ed99cdfaf0fb58beed Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.447576 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b6bd986b8-jdtlc"] Mar 13 
15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.457057 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-865786b7bb-9cnjb"] Mar 13 15:24:44 crc kubenswrapper[4786]: W0313 15:24:44.476807 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50f3dfe4_74a4_4d75_83a2_0109a8dda909.slice/crio-0331ed221dc34d922ae87c6ab10cf4423462bccd75ac827e47ee6d5c3abded23 WatchSource:0}: Error finding container 0331ed221dc34d922ae87c6ab10cf4423462bccd75ac827e47ee6d5c3abded23: Status 404 returned error can't find the container with id 0331ed221dc34d922ae87c6ab10cf4423462bccd75ac827e47ee6d5c3abded23 Mar 13 15:24:44 crc kubenswrapper[4786]: W0313 15:24:44.522848 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod826794b9_41ec_4cab_bc85_d426d8e2a38b.slice/crio-011b25deaef5769111da43bda3c7b349931dde136cd29387f48e5f16283a07da WatchSource:0}: Error finding container 011b25deaef5769111da43bda3c7b349931dde136cd29387f48e5f16283a07da: Status 404 returned error can't find the container with id 011b25deaef5769111da43bda3c7b349931dde136cd29387f48e5f16283a07da Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.879805 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" event={"ID":"1063e48c-fed7-49b9-89f2-186b4627caea","Type":"ContainerStarted","Data":"6ebb37d204b7d57efc1a565452a94fc943aa32d22e889c150727ef28f081ad31"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.883664 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679bcd6995-7xnv6" event={"ID":"dea0a108-dc5a-4700-a956-563674797beb","Type":"ContainerStarted","Data":"576b4bea50e3b02bb64d7f78f21b6343aee734fed3bab91b49757abb0935b2bf"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.886091 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-78cf8bc498-pb2xv" event={"ID":"52da1f88-a7fd-4c07-9db5-6651531c94a2","Type":"ContainerStarted","Data":"46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.886114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cf8bc498-pb2xv" event={"ID":"52da1f88-a7fd-4c07-9db5-6651531c94a2","Type":"ContainerStarted","Data":"6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.887238 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.887259 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.888440 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5545f5f4-6p7nq" event={"ID":"f10b99b0-adf8-4e02-a0f7-d551c8b4c748","Type":"ContainerStarted","Data":"6043cccaaf2e268e4940676e68304b5eac90355dcf2078f37620763889987707"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.891588 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579f6547f-hnpzx" event={"ID":"70dc1403-e7e9-4200-9a87-e3538a17c350","Type":"ContainerStarted","Data":"0a0d673bcf1c7d505deff9320e646f50ec818815f39bf1e9f9a5801612c21ae9"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.906712 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" event={"ID":"31158646-2c0c-4098-bd3e-ea307fa78716","Type":"ContainerStarted","Data":"1627c51a27d60b2c85430bf40f8807202c1ec2d2b8cc26ed99cdfaf0fb58beed"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.913194 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-78cf8bc498-pb2xv" podStartSLOduration=2.913177099 podStartE2EDuration="2.913177099s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:44.9124146 +0000 UTC m=+1315.075626411" watchObservedRunningTime="2026-03-13 15:24:44.913177099 +0000 UTC m=+1315.076388910" Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.928071 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eeb9deff-99f7-4425-a84c-a520d67430ed","Type":"ContainerStarted","Data":"0399f2db3f72283f5c6cc3ac379a8523fcbcc9d2bccb605cb969c082a8cacc8f"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.941319 4786 generic.go:334] "Generic (PLEG): container finished" podID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerID="01a645eac9a46ffa0fa581fe0d32400425ac783a642a5a07582375c1c3522791" exitCode=0 Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.941417 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" event={"ID":"460f99f9-139a-4c17-8ee1-6beaecb83af5","Type":"ContainerDied","Data":"01a645eac9a46ffa0fa581fe0d32400425ac783a642a5a07582375c1c3522791"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.941448 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" event={"ID":"460f99f9-139a-4c17-8ee1-6beaecb83af5","Type":"ContainerStarted","Data":"56744cd66a325bc4406e5e297ceffae2cb6b01e26bf8025f092cfd7ea4495602"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.946653 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd8dddbcb-wmbct" event={"ID":"9f459571-980e-439d-9dc2-72c0461a20c9","Type":"ContainerStarted","Data":"9ac0d8df2995d1804d0af60083262cacb5da80d9832a9709ff3c01990e8e9cea"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 
15:24:44.947515 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.948772 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6bd986b8-jdtlc" event={"ID":"50f3dfe4-74a4-4d75-83a2-0109a8dda909","Type":"ContainerStarted","Data":"0331ed221dc34d922ae87c6ab10cf4423462bccd75ac827e47ee6d5c3abded23"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.949985 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-865786b7bb-9cnjb" event={"ID":"826794b9-41ec-4cab-bc85-d426d8e2a38b","Type":"ContainerStarted","Data":"011b25deaef5769111da43bda3c7b349931dde136cd29387f48e5f16283a07da"} Mar 13 15:24:44 crc kubenswrapper[4786]: I0313 15:24:44.957414 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.95739433 podStartE2EDuration="7.95739433s" podCreationTimestamp="2026-03-13 15:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:44.956354364 +0000 UTC m=+1315.119566175" watchObservedRunningTime="2026-03-13 15:24:44.95739433 +0000 UTC m=+1315.120606141" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.011226 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bd8dddbcb-wmbct" podStartSLOduration=3.011206723 podStartE2EDuration="3.011206723s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:45.00912552 +0000 UTC m=+1315.172337331" watchObservedRunningTime="2026-03-13 15:24:45.011206723 +0000 UTC m=+1315.174418534" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.962912 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-api-6b6bd986b8-jdtlc" event={"ID":"50f3dfe4-74a4-4d75-83a2-0109a8dda909","Type":"ContainerStarted","Data":"63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.963515 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6bd986b8-jdtlc" event={"ID":"50f3dfe4-74a4-4d75-83a2-0109a8dda909","Type":"ContainerStarted","Data":"878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.963579 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.963605 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.974616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerStarted","Data":"c2b0b2ac11b40e7fb06e5cffc1a3823448827504835d20bd965320283d7e4736"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.977517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5545f5f4-6p7nq" event={"ID":"f10b99b0-adf8-4e02-a0f7-d551c8b4c748","Type":"ContainerStarted","Data":"e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.977556 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5545f5f4-6p7nq" event={"ID":"f10b99b0-adf8-4e02-a0f7-d551c8b4c748","Type":"ContainerStarted","Data":"e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.977688 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 
15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.979496 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-865786b7bb-9cnjb" event={"ID":"826794b9-41ec-4cab-bc85-d426d8e2a38b","Type":"ContainerStarted","Data":"d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.979551 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-865786b7bb-9cnjb" event={"ID":"826794b9-41ec-4cab-bc85-d426d8e2a38b","Type":"ContainerStarted","Data":"241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.979655 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.979677 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.981212 4786 generic.go:334] "Generic (PLEG): container finished" podID="ed27af11-89ca-4d9c-b654-800740dfc742" containerID="99d73e1c554b9b1b11403cbffe4ce8b73fd05d82195bf4be2a43decef93be656" exitCode=0 Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.981263 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqx9f" event={"ID":"ed27af11-89ca-4d9c-b654-800740dfc742","Type":"ContainerDied","Data":"99d73e1c554b9b1b11403cbffe4ce8b73fd05d82195bf4be2a43decef93be656"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.984595 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" event={"ID":"460f99f9-139a-4c17-8ee1-6beaecb83af5","Type":"ContainerStarted","Data":"8e521c44f16b5d0cb30b4346c240c3602de70cccd11ce3d203cbbef4d12c108e"} Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.984718 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" Mar 13 15:24:45 crc kubenswrapper[4786]: I0313 15:24:45.994972 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b6bd986b8-jdtlc" podStartSLOduration=2.9949569780000003 podStartE2EDuration="2.994956978s" podCreationTimestamp="2026-03-13 15:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:45.98865266 +0000 UTC m=+1316.151864471" watchObservedRunningTime="2026-03-13 15:24:45.994956978 +0000 UTC m=+1316.158168789" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.039547 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d5545f5f4-6p7nq" podStartSLOduration=4.039531118 podStartE2EDuration="4.039531118s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:46.036967094 +0000 UTC m=+1316.200178895" watchObservedRunningTime="2026-03-13 15:24:46.039531118 +0000 UTC m=+1316.202742929" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.076633 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-865786b7bb-9cnjb" podStartSLOduration=4.076619521 podStartE2EDuration="4.076619521s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:46.065406989 +0000 UTC m=+1316.228618800" watchObservedRunningTime="2026-03-13 15:24:46.076619521 +0000 UTC m=+1316.239831332" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.106237 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" podStartSLOduration=4.106215914 
podStartE2EDuration="4.106215914s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:46.098824779 +0000 UTC m=+1316.262036590" watchObservedRunningTime="2026-03-13 15:24:46.106215914 +0000 UTC m=+1316.269427725" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.317357 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d5545f5f4-6p7nq"] Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.348124 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54bc4948fd-47bbp"] Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.350002 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.351808 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.351994 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.364200 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54bc4948fd-47bbp"] Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424025 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-combined-ca-bundle\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424079 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data-custom\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-internal-tls-certs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424205 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424237 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7qd\" (UniqueName: \"kubernetes.io/projected/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-kube-api-access-jd7qd\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424271 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-public-tls-certs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.424292 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-logs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.525826 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7qd\" (UniqueName: \"kubernetes.io/projected/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-kube-api-access-jd7qd\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.525913 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-public-tls-certs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.525939 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-logs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.525986 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-combined-ca-bundle\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.526012 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data-custom\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.526062 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-internal-tls-certs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.526117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.526507 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-logs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.531832 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-public-tls-certs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.532731 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data-custom\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.533537 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.535244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-combined-ca-bundle\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.548153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-internal-tls-certs\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.560389 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7qd\" (UniqueName: \"kubernetes.io/projected/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-kube-api-access-jd7qd\") pod \"barbican-api-54bc4948fd-47bbp\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") " pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:46 crc kubenswrapper[4786]: I0313 15:24:46.670602 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.005123 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d5545f5f4-6p7nq" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.126287 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.126627 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.200339 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.218179 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.428830 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54bc4948fd-47bbp"] Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.580118 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.650281 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlpjc\" (UniqueName: \"kubernetes.io/projected/ed27af11-89ca-4d9c-b654-800740dfc742-kube-api-access-rlpjc\") pod \"ed27af11-89ca-4d9c-b654-800740dfc742\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.650470 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-combined-ca-bundle\") pod \"ed27af11-89ca-4d9c-b654-800740dfc742\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.650656 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-config\") pod \"ed27af11-89ca-4d9c-b654-800740dfc742\" (UID: \"ed27af11-89ca-4d9c-b654-800740dfc742\") " Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.661052 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed27af11-89ca-4d9c-b654-800740dfc742-kube-api-access-rlpjc" (OuterVolumeSpecName: "kube-api-access-rlpjc") pod "ed27af11-89ca-4d9c-b654-800740dfc742" (UID: "ed27af11-89ca-4d9c-b654-800740dfc742"). InnerVolumeSpecName "kube-api-access-rlpjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.726046 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-config" (OuterVolumeSpecName: "config") pod "ed27af11-89ca-4d9c-b654-800740dfc742" (UID: "ed27af11-89ca-4d9c-b654-800740dfc742"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.758536 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.758565 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlpjc\" (UniqueName: \"kubernetes.io/projected/ed27af11-89ca-4d9c-b654-800740dfc742-kube-api-access-rlpjc\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.806713 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed27af11-89ca-4d9c-b654-800740dfc742" (UID: "ed27af11-89ca-4d9c-b654-800740dfc742"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:24:47 crc kubenswrapper[4786]: I0313 15:24:47.861987 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed27af11-89ca-4d9c-b654-800740dfc742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.017902 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" event={"ID":"1063e48c-fed7-49b9-89f2-186b4627caea","Type":"ContainerStarted","Data":"60a51f9d609a82f2f63e61bdebf44e301bc76385b779ef4d3c4e833b60346e07"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.019183 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" event={"ID":"1063e48c-fed7-49b9-89f2-186b4627caea","Type":"ContainerStarted","Data":"a4f3f4569c5f0cc8dfdcbbd90eb9b6d5c153327bfd23daa7ca118edda5baf004"} Mar 13 15:24:48 crc 
kubenswrapper[4786]: I0313 15:24:48.024665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679bcd6995-7xnv6" event={"ID":"dea0a108-dc5a-4700-a956-563674797beb","Type":"ContainerStarted","Data":"e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.024707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679bcd6995-7xnv6" event={"ID":"dea0a108-dc5a-4700-a956-563674797beb","Type":"ContainerStarted","Data":"bdb0179cb2a6bc6617c488f820d28e7e3a40df645ba79cc6d779129a882afa34"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.040880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqx9f" event={"ID":"ed27af11-89ca-4d9c-b654-800740dfc742","Type":"ContainerDied","Data":"a619a64e38dc5d974d66ed292bf67568d185dc6a3e6a611c7d663734ae77d97f"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.040914 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a619a64e38dc5d974d66ed292bf67568d185dc6a3e6a611c7d663734ae77d97f" Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.041225 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqx9f" Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.048789 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54bc4948fd-47bbp" event={"ID":"94c381c8-c97e-4159-9bb4-3ede8f12d6e0","Type":"ContainerStarted","Data":"72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.048843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54bc4948fd-47bbp" event={"ID":"94c381c8-c97e-4159-9bb4-3ede8f12d6e0","Type":"ContainerStarted","Data":"62377fae63cb2c4eef5f4e3219777937337e511debb2239e060c7b16f2d22b75"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.049749 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.049784 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54bc4948fd-47bbp" Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.052489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579f6547f-hnpzx" event={"ID":"70dc1403-e7e9-4200-9a87-e3538a17c350","Type":"ContainerStarted","Data":"26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.052523 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579f6547f-hnpzx" event={"ID":"70dc1403-e7e9-4200-9a87-e3538a17c350","Type":"ContainerStarted","Data":"1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46"} Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.069661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" 
event={"ID":"31158646-2c0c-4098-bd3e-ea307fa78716","Type":"ContainerStarted","Data":"3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f"}
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.071473 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" event={"ID":"31158646-2c0c-4098-bd3e-ea307fa78716","Type":"ContainerStarted","Data":"fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e"}
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.072213 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.072350 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.070222 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d5545f5f4-6p7nq" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api" containerID="cri-o://e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f" gracePeriod=30
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.069721 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d5545f5f4-6p7nq" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api-log" containerID="cri-o://e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f" gracePeriod=30
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.091779 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" podStartSLOduration=3.077798338 podStartE2EDuration="6.091761129s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="2026-03-13 15:24:43.872203956 +0000 UTC m=+1314.035415767" lastFinishedPulling="2026-03-13 15:24:46.886166737 +0000 UTC m=+1317.049378558" observedRunningTime="2026-03-13 15:24:48.035478924 +0000 UTC m=+1318.198690745" watchObservedRunningTime="2026-03-13 15:24:48.091761129 +0000 UTC m=+1318.254972940"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.114076 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-679bcd6995-7xnv6" podStartSLOduration=3.217615151 podStartE2EDuration="6.114051929s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="2026-03-13 15:24:43.990611002 +0000 UTC m=+1314.153822813" lastFinishedPulling="2026-03-13 15:24:46.88704778 +0000 UTC m=+1317.050259591" observedRunningTime="2026-03-13 15:24:48.063259472 +0000 UTC m=+1318.226471293" watchObservedRunningTime="2026-03-13 15:24:48.114051929 +0000 UTC m=+1318.277263740"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.136651 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54bc4948fd-47bbp" podStartSLOduration=2.136626576 podStartE2EDuration="2.136626576s" podCreationTimestamp="2026-03-13 15:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:48.090693232 +0000 UTC m=+1318.253905063" watchObservedRunningTime="2026-03-13 15:24:48.136626576 +0000 UTC m=+1318.299838387"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.145582 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7579f6547f-hnpzx" podStartSLOduration=3.649496186 podStartE2EDuration="6.145556581s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="2026-03-13 15:24:44.390108063 +0000 UTC m=+1314.553319874" lastFinishedPulling="2026-03-13 15:24:46.886168458 +0000 UTC m=+1317.049380269" observedRunningTime="2026-03-13 15:24:48.114432819 +0000 UTC m=+1318.277644630" watchObservedRunningTime="2026-03-13 15:24:48.145556581 +0000 UTC m=+1318.308768392"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.163793 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.164973 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.168803 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" podStartSLOduration=3.672796933 podStartE2EDuration="6.168781715s" podCreationTimestamp="2026-03-13 15:24:42 +0000 UTC" firstStartedPulling="2026-03-13 15:24:44.390143454 +0000 UTC m=+1314.553355265" lastFinishedPulling="2026-03-13 15:24:46.886128236 +0000 UTC m=+1317.049340047" observedRunningTime="2026-03-13 15:24:48.138558355 +0000 UTC m=+1318.301770166" watchObservedRunningTime="2026-03-13 15:24:48.168781715 +0000 UTC m=+1318.331993516"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.191771 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-679bcd6995-7xnv6"]
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.205017 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5cd697874d-c2gtp"]
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.217703 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.266998 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.339587 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79fd7f986f-ff2xn"]
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.342494 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerName="dnsmasq-dns" containerID="cri-o://8e521c44f16b5d0cb30b4346c240c3602de70cccd11ce3d203cbbef4d12c108e" gracePeriod=10
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.398514 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-htkvf"]
Mar 13 15:24:48 crc kubenswrapper[4786]: E0313 15:24:48.400719 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed27af11-89ca-4d9c-b654-800740dfc742" containerName="neutron-db-sync"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.400749 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed27af11-89ca-4d9c-b654-800740dfc742" containerName="neutron-db-sync"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.404365 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed27af11-89ca-4d9c-b654-800740dfc742" containerName="neutron-db-sync"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.407611 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.416728 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-htkvf"]
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.490236 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.490309 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65h6\" (UniqueName: \"kubernetes.io/projected/e2a499a4-86b9-4122-aa89-cb5a77be5227-kube-api-access-k65h6\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.490342 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.490386 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.490488 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-config\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.490513 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.541585 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6555f55d84-r89pv"]
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.543023 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.553412 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.553781 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.554070 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.554097 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gqv7f"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598425 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-ovndb-tls-certs\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598479 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-config\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598498 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598522 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-combined-ca-bundle\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598588 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65h6\" (UniqueName: \"kubernetes.io/projected/e2a499a4-86b9-4122-aa89-cb5a77be5227-kube-api-access-k65h6\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598635 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598701 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-config\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598769 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-httpd-config\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.598820 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8jj\" (UniqueName: \"kubernetes.io/projected/20b9e933-6f29-489e-92df-ac8ed12ae33d-kube-api-access-8b8jj\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.599812 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-config\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.600351 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.600880 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-svc\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.602424 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.603547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.610832 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6555f55d84-r89pv"]
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.641793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k65h6\" (UniqueName: \"kubernetes.io/projected/e2a499a4-86b9-4122-aa89-cb5a77be5227-kube-api-access-k65h6\") pod \"dnsmasq-dns-7fc46d7df7-htkvf\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") " pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.699993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-httpd-config\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.700099 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8jj\" (UniqueName: \"kubernetes.io/projected/20b9e933-6f29-489e-92df-ac8ed12ae33d-kube-api-access-8b8jj\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.700154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-ovndb-tls-certs\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.700206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-combined-ca-bundle\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.700282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-config\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.710557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-config\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.729648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-combined-ca-bundle\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.734348 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-ovndb-tls-certs\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.749673 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-httpd-config\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.775877 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8jj\" (UniqueName: \"kubernetes.io/projected/20b9e933-6f29-489e-92df-ac8ed12ae33d-kube-api-access-8b8jj\") pod \"neutron-6555f55d84-r89pv\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.827415 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:48 crc kubenswrapper[4786]: I0313 15:24:48.868265 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6555f55d84-r89pv"
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.102328 4786 generic.go:334] "Generic (PLEG): container finished" podID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerID="e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f" exitCode=143
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.102686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5545f5f4-6p7nq" event={"ID":"f10b99b0-adf8-4e02-a0f7-d551c8b4c748","Type":"ContainerDied","Data":"e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f"}
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.130945 4786 generic.go:334] "Generic (PLEG): container finished" podID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerID="8e521c44f16b5d0cb30b4346c240c3602de70cccd11ce3d203cbbef4d12c108e" exitCode=0
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.131028 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" event={"ID":"460f99f9-139a-4c17-8ee1-6beaecb83af5","Type":"ContainerDied","Data":"8e521c44f16b5d0cb30b4346c240c3602de70cccd11ce3d203cbbef4d12c108e"}
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.134777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54bc4948fd-47bbp" event={"ID":"94c381c8-c97e-4159-9bb4-3ede8f12d6e0","Type":"ContainerStarted","Data":"ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87"}
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.136379 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.136464 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.420888 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn"
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.432940 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-swift-storage-0\") pod \"460f99f9-139a-4c17-8ee1-6beaecb83af5\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.432994 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcwk5\" (UniqueName: \"kubernetes.io/projected/460f99f9-139a-4c17-8ee1-6beaecb83af5-kube-api-access-zcwk5\") pod \"460f99f9-139a-4c17-8ee1-6beaecb83af5\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.433015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-sb\") pod \"460f99f9-139a-4c17-8ee1-6beaecb83af5\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.433174 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-svc\") pod \"460f99f9-139a-4c17-8ee1-6beaecb83af5\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.433224 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-nb\") pod \"460f99f9-139a-4c17-8ee1-6beaecb83af5\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.433258 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-config\") pod \"460f99f9-139a-4c17-8ee1-6beaecb83af5\" (UID: \"460f99f9-139a-4c17-8ee1-6beaecb83af5\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.461133 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460f99f9-139a-4c17-8ee1-6beaecb83af5-kube-api-access-zcwk5" (OuterVolumeSpecName: "kube-api-access-zcwk5") pod "460f99f9-139a-4c17-8ee1-6beaecb83af5" (UID: "460f99f9-139a-4c17-8ee1-6beaecb83af5"). InnerVolumeSpecName "kube-api-access-zcwk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.512627 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "460f99f9-139a-4c17-8ee1-6beaecb83af5" (UID: "460f99f9-139a-4c17-8ee1-6beaecb83af5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.516434 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "460f99f9-139a-4c17-8ee1-6beaecb83af5" (UID: "460f99f9-139a-4c17-8ee1-6beaecb83af5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.532676 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "460f99f9-139a-4c17-8ee1-6beaecb83af5" (UID: "460f99f9-139a-4c17-8ee1-6beaecb83af5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.537057 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.537094 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcwk5\" (UniqueName: \"kubernetes.io/projected/460f99f9-139a-4c17-8ee1-6beaecb83af5-kube-api-access-zcwk5\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.537105 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.537114 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: E0313 15:24:49.545133 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10b99b0_adf8_4e02_a0f7_d551c8b4c748.slice/crio-e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10b99b0_adf8_4e02_a0f7_d551c8b4c748.slice/crio-conmon-e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.579051 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-config" (OuterVolumeSpecName: "config") pod "460f99f9-139a-4c17-8ee1-6beaecb83af5" (UID: "460f99f9-139a-4c17-8ee1-6beaecb83af5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.580097 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "460f99f9-139a-4c17-8ee1-6beaecb83af5" (UID: "460f99f9-139a-4c17-8ee1-6beaecb83af5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.641740 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.642026 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460f99f9-139a-4c17-8ee1-6beaecb83af5-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.753476 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d5545f5f4-6p7nq"
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.847430 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-combined-ca-bundle\") pod \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.847511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data-custom\") pod \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.847557 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwslc\" (UniqueName: \"kubernetes.io/projected/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-kube-api-access-gwslc\") pod \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.847596 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data\") pod \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.847617 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-logs\") pod \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\" (UID: \"f10b99b0-adf8-4e02-a0f7-d551c8b4c748\") "
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.848355 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-logs" (OuterVolumeSpecName: "logs") pod "f10b99b0-adf8-4e02-a0f7-d551c8b4c748" (UID: "f10b99b0-adf8-4e02-a0f7-d551c8b4c748"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.868522 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f10b99b0-adf8-4e02-a0f7-d551c8b4c748" (UID: "f10b99b0-adf8-4e02-a0f7-d551c8b4c748"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.868558 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-kube-api-access-gwslc" (OuterVolumeSpecName: "kube-api-access-gwslc") pod "f10b99b0-adf8-4e02-a0f7-d551c8b4c748" (UID: "f10b99b0-adf8-4e02-a0f7-d551c8b4c748"). InnerVolumeSpecName "kube-api-access-gwslc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.894922 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f10b99b0-adf8-4e02-a0f7-d551c8b4c748" (UID: "f10b99b0-adf8-4e02-a0f7-d551c8b4c748"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.906776 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-htkvf"]
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.915992 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data" (OuterVolumeSpecName: "config-data") pod "f10b99b0-adf8-4e02-a0f7-d551c8b4c748" (UID: "f10b99b0-adf8-4e02-a0f7-d551c8b4c748"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.949619 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.949655 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwslc\" (UniqueName: \"kubernetes.io/projected/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-kube-api-access-gwslc\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.949669 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.949683 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:49 crc kubenswrapper[4786]: I0313 15:24:49.949693 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10b99b0-adf8-4e02-a0f7-d551c8b4c748-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.003494 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6555f55d84-r89pv"]
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.215319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" event={"ID":"e2a499a4-86b9-4122-aa89-cb5a77be5227","Type":"ContainerStarted","Data":"0b30cd48cd2bcf8db03c790e2a9745e6657f7f88cfd1fdaa043b29731c0e28db"}
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.219884 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6555f55d84-r89pv" event={"ID":"20b9e933-6f29-489e-92df-ac8ed12ae33d","Type":"ContainerStarted","Data":"ba78f3604cd999b75741463b390584086cde93602d009ffb6a7133fbe710ffe6"}
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.231050 4786 generic.go:334] "Generic (PLEG): container finished" podID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerID="e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f" exitCode=0
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.231147 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d5545f5f4-6p7nq"
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.231163 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5545f5f4-6p7nq" event={"ID":"f10b99b0-adf8-4e02-a0f7-d551c8b4c748","Type":"ContainerDied","Data":"e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f"}
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.231225 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5545f5f4-6p7nq" event={"ID":"f10b99b0-adf8-4e02-a0f7-d551c8b4c748","Type":"ContainerDied","Data":"6043cccaaf2e268e4940676e68304b5eac90355dcf2078f37620763889987707"}
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.231245 4786 scope.go:117] "RemoveContainer" containerID="e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f"
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.295473 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn"
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.297214 4786 scope.go:117] "RemoveContainer" containerID="e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f"
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.297326 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fd7f986f-ff2xn" event={"ID":"460f99f9-139a-4c17-8ee1-6beaecb83af5","Type":"ContainerDied","Data":"56744cd66a325bc4406e5e297ceffae2cb6b01e26bf8025f092cfd7ea4495602"}
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.297358 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d5545f5f4-6p7nq"]
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.297913 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.297923 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.298012 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener-log" containerID="cri-o://a4f3f4569c5f0cc8dfdcbbd90eb9b6d5c153327bfd23daa7ca118edda5baf004" gracePeriod=30
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.298072 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener" containerID="cri-o://60a51f9d609a82f2f63e61bdebf44e301bc76385b779ef4d3c4e833b60346e07" gracePeriod=30
Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.298196 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-679bcd6995-7xnv6"
podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker-log" containerID="cri-o://bdb0179cb2a6bc6617c488f820d28e7e3a40df645ba79cc6d779129a882afa34" gracePeriod=30 Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.298334 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-679bcd6995-7xnv6" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker" containerID="cri-o://e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd" gracePeriod=30 Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.305879 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7d5545f5f4-6p7nq"] Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.372721 4786 scope.go:117] "RemoveContainer" containerID="e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f" Mar 13 15:24:50 crc kubenswrapper[4786]: E0313 15:24:50.374049 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f\": container with ID starting with e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f not found: ID does not exist" containerID="e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.374083 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f"} err="failed to get container status \"e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f\": rpc error: code = NotFound desc = could not find container \"e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f\": container with ID starting with e0c8b1ce883cf68dbe8cd0a4ad3fa4d0a39b64b919a9f2cdefdf475de44fce0f not found: ID does not exist" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 
15:24:50.374102 4786 scope.go:117] "RemoveContainer" containerID="e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f" Mar 13 15:24:50 crc kubenswrapper[4786]: E0313 15:24:50.374568 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f\": container with ID starting with e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f not found: ID does not exist" containerID="e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.374589 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f"} err="failed to get container status \"e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f\": rpc error: code = NotFound desc = could not find container \"e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f\": container with ID starting with e14ca7a069029e9ca5f7e68060e1835c33fc6e3ede7d51155babb7cea541271f not found: ID does not exist" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.374601 4786 scope.go:117] "RemoveContainer" containerID="8e521c44f16b5d0cb30b4346c240c3602de70cccd11ce3d203cbbef4d12c108e" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.519118 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79fd7f986f-ff2xn"] Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.520654 4786 scope.go:117] "RemoveContainer" containerID="01a645eac9a46ffa0fa581fe0d32400425ac783a642a5a07582375c1c3522791" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.543205 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79fd7f986f-ff2xn"] Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.635827 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" path="/var/lib/kubelet/pods/460f99f9-139a-4c17-8ee1-6beaecb83af5/volumes" Mar 13 15:24:50 crc kubenswrapper[4786]: I0313 15:24:50.636815 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" path="/var/lib/kubelet/pods/f10b99b0-adf8-4e02-a0f7-d551c8b4c748/volumes" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.366315 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nwdnc" event={"ID":"fb8a98ec-82a3-418d-82ea-d0ff210dd78d","Type":"ContainerStarted","Data":"8bbd1bdba96d406cca8c0659cce4f0f05a674440ee5ba65657adc3d6eb97a621"} Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.392190 4786 generic.go:334] "Generic (PLEG): container finished" podID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerID="2e43c3ba6da42e6f49b59d157e8257b6f1b3141ac02eac3859694d691b7b31c1" exitCode=0 Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.392274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" event={"ID":"e2a499a4-86b9-4122-aa89-cb5a77be5227","Type":"ContainerDied","Data":"2e43c3ba6da42e6f49b59d157e8257b6f1b3141ac02eac3859694d691b7b31c1"} Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.403734 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nwdnc" podStartSLOduration=3.9770342149999998 podStartE2EDuration="42.40371306s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="2026-03-13 15:24:10.764355872 +0000 UTC m=+1280.927567683" lastFinishedPulling="2026-03-13 15:24:49.191034717 +0000 UTC m=+1319.354246528" observedRunningTime="2026-03-13 15:24:51.396918019 +0000 UTC m=+1321.560129830" watchObservedRunningTime="2026-03-13 15:24:51.40371306 +0000 UTC m=+1321.566924871" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.411626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6555f55d84-r89pv" event={"ID":"20b9e933-6f29-489e-92df-ac8ed12ae33d","Type":"ContainerStarted","Data":"a1ebd20687d6a618027792e71e4c3c2956017515d7c8b4635255e2dd1bd90cea"} Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.411917 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6555f55d84-r89pv" event={"ID":"20b9e933-6f29-489e-92df-ac8ed12ae33d","Type":"ContainerStarted","Data":"de3445a1fdb5e7bffe16b21eca305829708edd7a5fbf7c071f4f2f3e27ebbf3c"} Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.412788 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6555f55d84-r89pv" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.428812 4786 generic.go:334] "Generic (PLEG): container finished" podID="1063e48c-fed7-49b9-89f2-186b4627caea" containerID="a4f3f4569c5f0cc8dfdcbbd90eb9b6d5c153327bfd23daa7ca118edda5baf004" exitCode=143 Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.428893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" event={"ID":"1063e48c-fed7-49b9-89f2-186b4627caea","Type":"ContainerDied","Data":"a4f3f4569c5f0cc8dfdcbbd90eb9b6d5c153327bfd23daa7ca118edda5baf004"} Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.447108 4786 generic.go:334] "Generic (PLEG): container finished" podID="dea0a108-dc5a-4700-a956-563674797beb" containerID="bdb0179cb2a6bc6617c488f820d28e7e3a40df645ba79cc6d779129a882afa34" exitCode=143 Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.447205 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.447214 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.447991 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679bcd6995-7xnv6" 
event={"ID":"dea0a108-dc5a-4700-a956-563674797beb","Type":"ContainerDied","Data":"bdb0179cb2a6bc6617c488f820d28e7e3a40df645ba79cc6d779129a882afa34"} Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.489247 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6555f55d84-r89pv" podStartSLOduration=3.489227519 podStartE2EDuration="3.489227519s" podCreationTimestamp="2026-03-13 15:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:51.466521248 +0000 UTC m=+1321.629733059" watchObservedRunningTime="2026-03-13 15:24:51.489227519 +0000 UTC m=+1321.652439330" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.975527 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:51 crc kubenswrapper[4786]: I0313 15:24:51.976018 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.046895 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.161266 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.168977 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.386480 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bd4f4c6c-zg28d"] Mar 13 15:24:52 crc kubenswrapper[4786]: E0313 15:24:52.387079 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 
15:24:52.387095 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api" Mar 13 15:24:52 crc kubenswrapper[4786]: E0313 15:24:52.387123 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api-log" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.387130 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api-log" Mar 13 15:24:52 crc kubenswrapper[4786]: E0313 15:24:52.387142 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerName="init" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.387150 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerName="init" Mar 13 15:24:52 crc kubenswrapper[4786]: E0313 15:24:52.387163 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerName="dnsmasq-dns" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.387170 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerName="dnsmasq-dns" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.387375 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="460f99f9-139a-4c17-8ee1-6beaecb83af5" containerName="dnsmasq-dns" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.387400 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.387433 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10b99b0-adf8-4e02-a0f7-d551c8b4c748" containerName="barbican-api-log" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.392514 4786 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.395354 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.395484 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.419329 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd4f4c6c-zg28d"] Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.465793 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" event={"ID":"e2a499a4-86b9-4122-aa89-cb5a77be5227","Type":"ContainerStarted","Data":"115cc988ac6213615130a52b40951fa9cbd17fe3ad5b7bc5c3c5aa52c5d086a4"} Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.498145 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" podStartSLOduration=4.498066484 podStartE2EDuration="4.498066484s" podCreationTimestamp="2026-03-13 15:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:24:52.486658967 +0000 UTC m=+1322.649870778" watchObservedRunningTime="2026-03-13 15:24:52.498066484 +0000 UTC m=+1322.661278295" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.514569 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-ovndb-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.514645 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-config\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.514718 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-httpd-config\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.515102 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-combined-ca-bundle\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.515322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-internal-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.515350 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdlvj\" (UniqueName: \"kubernetes.io/projected/21658ad3-b8e8-4743-b2c7-da4782850abc-kube-api-access-fdlvj\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.515432 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-public-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.617711 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-combined-ca-bundle\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.617972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-internal-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.617996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdlvj\" (UniqueName: \"kubernetes.io/projected/21658ad3-b8e8-4743-b2c7-da4782850abc-kube-api-access-fdlvj\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.618087 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-public-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.618166 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-ovndb-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.618223 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-config\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.620499 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-httpd-config\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.627591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-combined-ca-bundle\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.628681 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-config\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.632408 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-ovndb-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: 
\"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.632456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-public-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.632894 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-internal-tls-certs\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.637529 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-httpd-config\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.642255 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdlvj\" (UniqueName: \"kubernetes.io/projected/21658ad3-b8e8-4743-b2c7-da4782850abc-kube-api-access-fdlvj\") pod \"neutron-6bd4f4c6c-zg28d\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") " pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:52 crc kubenswrapper[4786]: I0313 15:24:52.713019 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:24:53 crc kubenswrapper[4786]: I0313 15:24:53.312597 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd4f4c6c-zg28d"] Mar 13 15:24:53 crc kubenswrapper[4786]: I0313 15:24:53.476509 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd4f4c6c-zg28d" event={"ID":"21658ad3-b8e8-4743-b2c7-da4782850abc","Type":"ContainerStarted","Data":"f35831a82cadd9eced7b3e06ee89906804990645daeb4e203ee4130dad3126e1"} Mar 13 15:24:53 crc kubenswrapper[4786]: I0313 15:24:53.477801 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" Mar 13 15:24:54 crc kubenswrapper[4786]: I0313 15:24:54.499809 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd4f4c6c-zg28d" event={"ID":"21658ad3-b8e8-4743-b2c7-da4782850abc","Type":"ContainerStarted","Data":"92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060"} Mar 13 15:24:55 crc kubenswrapper[4786]: I0313 15:24:55.510195 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:55 crc kubenswrapper[4786]: I0313 15:24:55.557208 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:24:56 crc kubenswrapper[4786]: I0313 15:24:56.519293 4786 generic.go:334] "Generic (PLEG): container finished" podID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d" containerID="8bbd1bdba96d406cca8c0659cce4f0f05a674440ee5ba65657adc3d6eb97a621" exitCode=0 Mar 13 15:24:56 crc kubenswrapper[4786]: I0313 15:24:56.520321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nwdnc" event={"ID":"fb8a98ec-82a3-418d-82ea-d0ff210dd78d","Type":"ContainerDied","Data":"8bbd1bdba96d406cca8c0659cce4f0f05a674440ee5ba65657adc3d6eb97a621"} Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 
15:24:58.022734 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nwdnc" Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135222 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-combined-ca-bundle\") pod \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135263 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-config-data\") pod \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-db-sync-config-data\") pod \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135393 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-etc-machine-id\") pod \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-scripts\") pod \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") " Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135496 4786 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fb8a98ec-82a3-418d-82ea-d0ff210dd78d" (UID: "fb8a98ec-82a3-418d-82ea-d0ff210dd78d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.135703 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4x57\" (UniqueName: \"kubernetes.io/projected/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-kube-api-access-r4x57\") pod \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\" (UID: \"fb8a98ec-82a3-418d-82ea-d0ff210dd78d\") "
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.136514 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.142981 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fb8a98ec-82a3-418d-82ea-d0ff210dd78d" (UID: "fb8a98ec-82a3-418d-82ea-d0ff210dd78d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.143314 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-kube-api-access-r4x57" (OuterVolumeSpecName: "kube-api-access-r4x57") pod "fb8a98ec-82a3-418d-82ea-d0ff210dd78d" (UID: "fb8a98ec-82a3-418d-82ea-d0ff210dd78d"). InnerVolumeSpecName "kube-api-access-r4x57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.145438 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-scripts" (OuterVolumeSpecName: "scripts") pod "fb8a98ec-82a3-418d-82ea-d0ff210dd78d" (UID: "fb8a98ec-82a3-418d-82ea-d0ff210dd78d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.168956 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb8a98ec-82a3-418d-82ea-d0ff210dd78d" (UID: "fb8a98ec-82a3-418d-82ea-d0ff210dd78d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.194810 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-config-data" (OuterVolumeSpecName: "config-data") pod "fb8a98ec-82a3-418d-82ea-d0ff210dd78d" (UID: "fb8a98ec-82a3-418d-82ea-d0ff210dd78d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.238606 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4x57\" (UniqueName: \"kubernetes.io/projected/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-kube-api-access-r4x57\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.238643 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.238654 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.238662 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.238670 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8a98ec-82a3-418d-82ea-d0ff210dd78d-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.296955 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54bc4948fd-47bbp"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.301715 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54bc4948fd-47bbp"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.432938 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b6bd986b8-jdtlc"]
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.433144 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b6bd986b8-jdtlc" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api-log" containerID="cri-o://878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9" gracePeriod=30
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.433539 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b6bd986b8-jdtlc" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api" containerID="cri-o://63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e" gracePeriod=30
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.542561 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nwdnc"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.542728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nwdnc" event={"ID":"fb8a98ec-82a3-418d-82ea-d0ff210dd78d","Type":"ContainerDied","Data":"7a6f4224455dd3968dbd67b1dadfe6f1f8e5753a7bda2b9879fe9aea34f9679e"}
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.542990 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a6f4224455dd3968dbd67b1dadfe6f1f8e5753a7bda2b9879fe9aea34f9679e"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.801305 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 15:24:58 crc kubenswrapper[4786]: E0313 15:24:58.802670 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d" containerName="cinder-db-sync"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.802691 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d" containerName="cinder-db-sync"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.803096 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d" containerName="cinder-db-sync"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.804331 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.809544 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5vs2w"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.809724 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.809823 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.810357 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.834029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.858114 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.864712 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhdr\" (UniqueName: \"kubernetes.io/projected/281f0a05-c7f3-4c5f-aad4-f953a8521233-kube-api-access-sbhdr\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.864880 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.865054 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.865103 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/281f0a05-c7f3-4c5f-aad4-f953a8521233-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.865362 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-scripts\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.865426 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.940372 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-htkvf"]
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.967004 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.967067 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/281f0a05-c7f3-4c5f-aad4-f953a8521233-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.967156 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-scripts\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.967177 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.967219 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbhdr\" (UniqueName: \"kubernetes.io/projected/281f0a05-c7f3-4c5f-aad4-f953a8521233-kube-api-access-sbhdr\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.967282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.971668 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/281f0a05-c7f3-4c5f-aad4-f953a8521233-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.976204 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.976989 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.977526 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-4jq58"]
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.979327 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.984155 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.986906 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-scripts\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:58 crc kubenswrapper[4786]: I0313 15:24:58.995578 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbhdr\" (UniqueName: \"kubernetes.io/projected/281f0a05-c7f3-4c5f-aad4-f953a8521233-kube-api-access-sbhdr\") pod \"cinder-scheduler-0\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " pod="openstack/cinder-scheduler-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.015207 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-4jq58"]
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.073836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.077147 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.077217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.077430 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-config\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.077508 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr5f\" (UniqueName: \"kubernetes.io/projected/ab472ede-43a0-40ac-8e23-81798838d0dc-kube-api-access-fcr5f\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.077755 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.102811 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.104179 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.111252 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.141358 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.151951 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180266 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180371 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a6781ef-43aa-4084-9e32-3667cd7a7d18-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180460 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180536 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-scripts\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180596 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180745 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6781ef-43aa-4084-9e32-3667cd7a7d18-logs\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180812 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-config\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180898 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr5f\" (UniqueName: \"kubernetes.io/projected/ab472ede-43a0-40ac-8e23-81798838d0dc-kube-api-access-fcr5f\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.180949 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllfj\" (UniqueName: \"kubernetes.io/projected/1a6781ef-43aa-4084-9e32-3667cd7a7d18-kube-api-access-rllfj\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.181014 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.181067 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.181183 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.181762 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-config\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.182160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-swift-storage-0\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.182814 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-sb\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.183151 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-svc\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.183296 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-nb\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.198712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr5f\" (UniqueName: \"kubernetes.io/projected/ab472ede-43a0-40ac-8e23-81798838d0dc-kube-api-access-fcr5f\") pod \"dnsmasq-dns-58b85ccffc-4jq58\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.282786 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.282892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.283597 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.283626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a6781ef-43aa-4084-9e32-3667cd7a7d18-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.283939 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-scripts\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.284002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6781ef-43aa-4084-9e32-3667cd7a7d18-logs\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.284011 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a6781ef-43aa-4084-9e32-3667cd7a7d18-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.284047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllfj\" (UniqueName: \"kubernetes.io/projected/1a6781ef-43aa-4084-9e32-3667cd7a7d18-kube-api-access-rllfj\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.284566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6781ef-43aa-4084-9e32-3667cd7a7d18-logs\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.288190 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-scripts\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.297321 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.298114 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.298513 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data-custom\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.313602 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllfj\" (UniqueName: \"kubernetes.io/projected/1a6781ef-43aa-4084-9e32-3667cd7a7d18-kube-api-access-rllfj\") pod \"cinder-api-0\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.379511 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.425941 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.555983 4786 generic.go:334] "Generic (PLEG): container finished" podID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerID="878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9" exitCode=143
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.556160 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerName="dnsmasq-dns" containerID="cri-o://115cc988ac6213615130a52b40951fa9cbd17fe3ad5b7bc5c3c5aa52c5d086a4" gracePeriod=10
Mar 13 15:24:59 crc kubenswrapper[4786]: I0313 15:24:59.556410 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6bd986b8-jdtlc" event={"ID":"50f3dfe4-74a4-4d75-83a2-0109a8dda909","Type":"ContainerDied","Data":"878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9"}
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.629156 4786 generic.go:334] "Generic (PLEG): container finished" podID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerID="115cc988ac6213615130a52b40951fa9cbd17fe3ad5b7bc5c3c5aa52c5d086a4" exitCode=0
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.629439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" event={"ID":"e2a499a4-86b9-4122-aa89-cb5a77be5227","Type":"ContainerDied","Data":"115cc988ac6213615130a52b40951fa9cbd17fe3ad5b7bc5c3c5aa52c5d086a4"}
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.836369 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf"
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.930950 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-swift-storage-0\") pod \"e2a499a4-86b9-4122-aa89-cb5a77be5227\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") "
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.931275 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-sb\") pod \"e2a499a4-86b9-4122-aa89-cb5a77be5227\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") "
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.931510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-config\") pod \"e2a499a4-86b9-4122-aa89-cb5a77be5227\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") "
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.931584 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-nb\") pod \"e2a499a4-86b9-4122-aa89-cb5a77be5227\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") "
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.931624 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k65h6\" (UniqueName: \"kubernetes.io/projected/e2a499a4-86b9-4122-aa89-cb5a77be5227-kube-api-access-k65h6\") pod \"e2a499a4-86b9-4122-aa89-cb5a77be5227\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") "
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.931671 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-svc\") pod \"e2a499a4-86b9-4122-aa89-cb5a77be5227\" (UID: \"e2a499a4-86b9-4122-aa89-cb5a77be5227\") "
Mar 13 15:25:00 crc kubenswrapper[4786]: I0313 15:25:00.959396 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a499a4-86b9-4122-aa89-cb5a77be5227-kube-api-access-k65h6" (OuterVolumeSpecName: "kube-api-access-k65h6") pod "e2a499a4-86b9-4122-aa89-cb5a77be5227" (UID: "e2a499a4-86b9-4122-aa89-cb5a77be5227"). InnerVolumeSpecName "kube-api-access-k65h6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.001554 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-config" (OuterVolumeSpecName: "config") pod "e2a499a4-86b9-4122-aa89-cb5a77be5227" (UID: "e2a499a4-86b9-4122-aa89-cb5a77be5227"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.032803 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e2a499a4-86b9-4122-aa89-cb5a77be5227" (UID: "e2a499a4-86b9-4122-aa89-cb5a77be5227"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.033820 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.033900 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.033917 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k65h6\" (UniqueName: \"kubernetes.io/projected/e2a499a4-86b9-4122-aa89-cb5a77be5227-kube-api-access-k65h6\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.049557 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2a499a4-86b9-4122-aa89-cb5a77be5227" (UID: "e2a499a4-86b9-4122-aa89-cb5a77be5227"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.049756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2a499a4-86b9-4122-aa89-cb5a77be5227" (UID: "e2a499a4-86b9-4122-aa89-cb5a77be5227"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.061610 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2a499a4-86b9-4122-aa89-cb5a77be5227" (UID: "e2a499a4-86b9-4122-aa89-cb5a77be5227"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.135497 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.135522 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.135531 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2a499a4-86b9-4122-aa89-cb5a77be5227-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.174123 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 15:25:01 crc kubenswrapper[4786]: W0313 15:25:01.178287 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a6781ef_43aa_4084_9e32_3667cd7a7d18.slice/crio-9c7aa0b5317bfe1c96e01132591e36aa4ee6accaf8bff15868ccea3e0e637eaf WatchSource:0}: Error finding container 9c7aa0b5317bfe1c96e01132591e36aa4ee6accaf8bff15868ccea3e0e637eaf: Status 404 returned error can't find the container with id 9c7aa0b5317bfe1c96e01132591e36aa4ee6accaf8bff15868ccea3e0e637eaf
Mar 13 15:25:01 crc
kubenswrapper[4786]: I0313 15:25:01.329499 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-4jq58"] Mar 13 15:25:01 crc kubenswrapper[4786]: W0313 15:25:01.330748 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab472ede_43a0_40ac_8e23_81798838d0dc.slice/crio-9378f6bfc7849ff1a7c5f0242a0eec3f150a7b7938c428264fe5aaf9feaa980f WatchSource:0}: Error finding container 9378f6bfc7849ff1a7c5f0242a0eec3f150a7b7938c428264fe5aaf9feaa980f: Status 404 returned error can't find the container with id 9378f6bfc7849ff1a7c5f0242a0eec3f150a7b7938c428264fe5aaf9feaa980f Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.435653 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:01 crc kubenswrapper[4786]: W0313 15:25:01.452299 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281f0a05_c7f3_4c5f_aad4_f953a8521233.slice/crio-d882f59342db2b2780ac86137e504474fc9d1eb058a7ab9e3be2fc2b793433e7 WatchSource:0}: Error finding container d882f59342db2b2780ac86137e504474fc9d1eb058a7ab9e3be2fc2b793433e7: Status 404 returned error can't find the container with id d882f59342db2b2780ac86137e504474fc9d1eb058a7ab9e3be2fc2b793433e7 Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.494556 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.608883 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b6bd986b8-jdtlc" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:42902->10.217.0.164:9311: read: connection reset by peer" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.609144 4786 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b6bd986b8-jdtlc" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:42918->10.217.0.164:9311: read: connection reset by peer" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.639893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerStarted","Data":"6a480a63377b2a42240127977482d6c92b7eb86f63e21712328bd6c72b540ffa"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.640011 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-central-agent" containerID="cri-o://34a3de5ae0339bade5ea43cf68217e18b65117c08c10bedbf883a4d025fffaba" gracePeriod=30 Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.640033 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="proxy-httpd" containerID="cri-o://6a480a63377b2a42240127977482d6c92b7eb86f63e21712328bd6c72b540ffa" gracePeriod=30 Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.640043 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.640079 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="sg-core" containerID="cri-o://c2b0b2ac11b40e7fb06e5cffc1a3823448827504835d20bd965320283d7e4736" gracePeriod=30 Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.640132 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-notification-agent" containerID="cri-o://39e39b70e9ae5a6af51f20dbd41d61866f1bf24a116705d8f43525abe21e7766" gracePeriod=30 Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.648592 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"281f0a05-c7f3-4c5f-aad4-f953a8521233","Type":"ContainerStarted","Data":"d882f59342db2b2780ac86137e504474fc9d1eb058a7ab9e3be2fc2b793433e7"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.661199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd4f4c6c-zg28d" event={"ID":"21658ad3-b8e8-4743-b2c7-da4782850abc","Type":"ContainerStarted","Data":"84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.662059 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.664249 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a6781ef-43aa-4084-9e32-3667cd7a7d18","Type":"ContainerStarted","Data":"9c7aa0b5317bfe1c96e01132591e36aa4ee6accaf8bff15868ccea3e0e637eaf"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.667587 4786 generic.go:334] "Generic (PLEG): container finished" podID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerID="2825a787b40c80d07e8d382d6f8fdf4a59b0f22bd3cbc70aa4ef72572e2f218d" exitCode=0 Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.667650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" event={"ID":"ab472ede-43a0-40ac-8e23-81798838d0dc","Type":"ContainerDied","Data":"2825a787b40c80d07e8d382d6f8fdf4a59b0f22bd3cbc70aa4ef72572e2f218d"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.667678 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" event={"ID":"ab472ede-43a0-40ac-8e23-81798838d0dc","Type":"ContainerStarted","Data":"9378f6bfc7849ff1a7c5f0242a0eec3f150a7b7938c428264fe5aaf9feaa980f"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.673186 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" event={"ID":"e2a499a4-86b9-4122-aa89-cb5a77be5227","Type":"ContainerDied","Data":"0b30cd48cd2bcf8db03c790e2a9745e6657f7f88cfd1fdaa043b29731c0e28db"} Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.673250 4786 scope.go:117] "RemoveContainer" containerID="115cc988ac6213615130a52b40951fa9cbd17fe3ad5b7bc5c3c5aa52c5d086a4" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.673248 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc46d7df7-htkvf" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.680616 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.164750369 podStartE2EDuration="52.680601383s" podCreationTimestamp="2026-03-13 15:24:09 +0000 UTC" firstStartedPulling="2026-03-13 15:24:11.204679259 +0000 UTC m=+1281.367891070" lastFinishedPulling="2026-03-13 15:25:00.720530273 +0000 UTC m=+1330.883742084" observedRunningTime="2026-03-13 15:25:01.678447609 +0000 UTC m=+1331.841659420" watchObservedRunningTime="2026-03-13 15:25:01.680601383 +0000 UTC m=+1331.843813194" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.709590 4786 scope.go:117] "RemoveContainer" containerID="120ce3515ded4e6a9af9413eac717dc3b1eb01ded30c3453e2ae4e06dc5b62ba" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.744666 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bd4f4c6c-zg28d" podStartSLOduration=9.744650013 podStartE2EDuration="9.744650013s" podCreationTimestamp="2026-03-13 15:24:52 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:01.739271148 +0000 UTC m=+1331.902482959" watchObservedRunningTime="2026-03-13 15:25:01.744650013 +0000 UTC m=+1331.907861824" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.756678 4786 scope.go:117] "RemoveContainer" containerID="2e43c3ba6da42e6f49b59d157e8257b6f1b3141ac02eac3859694d691b7b31c1" Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.779469 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-htkvf"] Mar 13 15:25:01 crc kubenswrapper[4786]: I0313 15:25:01.798414 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc46d7df7-htkvf"] Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.168282 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.261518 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data-custom\") pod \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.262063 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq2ds\" (UniqueName: \"kubernetes.io/projected/50f3dfe4-74a4-4d75-83a2-0109a8dda909-kube-api-access-fq2ds\") pod \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.262137 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f3dfe4-74a4-4d75-83a2-0109a8dda909-logs\") pod \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " Mar 
13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.262171 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-combined-ca-bundle\") pod \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.262245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data\") pod \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\" (UID: \"50f3dfe4-74a4-4d75-83a2-0109a8dda909\") " Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.264114 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f3dfe4-74a4-4d75-83a2-0109a8dda909-logs" (OuterVolumeSpecName: "logs") pod "50f3dfe4-74a4-4d75-83a2-0109a8dda909" (UID: "50f3dfe4-74a4-4d75-83a2-0109a8dda909"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.286512 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f3dfe4-74a4-4d75-83a2-0109a8dda909-kube-api-access-fq2ds" (OuterVolumeSpecName: "kube-api-access-fq2ds") pod "50f3dfe4-74a4-4d75-83a2-0109a8dda909" (UID: "50f3dfe4-74a4-4d75-83a2-0109a8dda909"). InnerVolumeSpecName "kube-api-access-fq2ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.292181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50f3dfe4-74a4-4d75-83a2-0109a8dda909" (UID: "50f3dfe4-74a4-4d75-83a2-0109a8dda909"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.326212 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50f3dfe4-74a4-4d75-83a2-0109a8dda909" (UID: "50f3dfe4-74a4-4d75-83a2-0109a8dda909"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.335719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data" (OuterVolumeSpecName: "config-data") pod "50f3dfe4-74a4-4d75-83a2-0109a8dda909" (UID: "50f3dfe4-74a4-4d75-83a2-0109a8dda909"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.365265 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.365293 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq2ds\" (UniqueName: \"kubernetes.io/projected/50f3dfe4-74a4-4d75-83a2-0109a8dda909-kube-api-access-fq2ds\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.365304 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f3dfe4-74a4-4d75-83a2-0109a8dda909-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.365312 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.365320 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f3dfe4-74a4-4d75-83a2-0109a8dda909-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.566526 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" path="/var/lib/kubelet/pods/e2a499a4-86b9-4122-aa89-cb5a77be5227/volumes" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.696451 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" event={"ID":"ab472ede-43a0-40ac-8e23-81798838d0dc","Type":"ContainerStarted","Data":"1a635506d88c5049e300f4b6bd956300aa9a7758cc6f9b567b6c714e5f08cbd1"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.696825 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.709110 4786 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerID="6a480a63377b2a42240127977482d6c92b7eb86f63e21712328bd6c72b540ffa" exitCode=0 Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.709181 4786 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerID="c2b0b2ac11b40e7fb06e5cffc1a3823448827504835d20bd965320283d7e4736" exitCode=2 Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.709192 4786 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerID="34a3de5ae0339bade5ea43cf68217e18b65117c08c10bedbf883a4d025fffaba" exitCode=0 Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.709343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerDied","Data":"6a480a63377b2a42240127977482d6c92b7eb86f63e21712328bd6c72b540ffa"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.709374 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerDied","Data":"c2b0b2ac11b40e7fb06e5cffc1a3823448827504835d20bd965320283d7e4736"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.709408 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerDied","Data":"34a3de5ae0339bade5ea43cf68217e18b65117c08c10bedbf883a4d025fffaba"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.719699 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" podStartSLOduration=4.7196799689999995 podStartE2EDuration="4.719679969s" podCreationTimestamp="2026-03-13 15:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:02.713929454 +0000 UTC m=+1332.877141275" watchObservedRunningTime="2026-03-13 15:25:02.719679969 +0000 UTC m=+1332.882891780" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.723438 4786 generic.go:334] "Generic (PLEG): container finished" podID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerID="63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e" exitCode=0 Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.723516 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b6bd986b8-jdtlc" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.723534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6bd986b8-jdtlc" event={"ID":"50f3dfe4-74a4-4d75-83a2-0109a8dda909","Type":"ContainerDied","Data":"63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.723563 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b6bd986b8-jdtlc" event={"ID":"50f3dfe4-74a4-4d75-83a2-0109a8dda909","Type":"ContainerDied","Data":"0331ed221dc34d922ae87c6ab10cf4423462bccd75ac827e47ee6d5c3abded23"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.723583 4786 scope.go:117] "RemoveContainer" containerID="63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.732131 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api-log" containerID="cri-o://6eb31899fcb44ea87d84b174734072a3750561c394aa977ab3b130b0cdf09a59" gracePeriod=30 Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.732731 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a6781ef-43aa-4084-9e32-3667cd7a7d18","Type":"ContainerStarted","Data":"149ae29761fcea916eb11613dc8da76bbebea576e3bc59e445903d136869a7db"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.732772 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a6781ef-43aa-4084-9e32-3667cd7a7d18","Type":"ContainerStarted","Data":"6eb31899fcb44ea87d84b174734072a3750561c394aa977ab3b130b0cdf09a59"} Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.732800 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 15:25:02 
crc kubenswrapper[4786]: I0313 15:25:02.732831 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api" containerID="cri-o://149ae29761fcea916eb11613dc8da76bbebea576e3bc59e445903d136869a7db" gracePeriod=30 Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.754658 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b6bd986b8-jdtlc"] Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.759153 4786 scope.go:117] "RemoveContainer" containerID="878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.766824 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b6bd986b8-jdtlc"] Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.767183 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.767160562 podStartE2EDuration="3.767160562s" podCreationTimestamp="2026-03-13 15:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:02.763289875 +0000 UTC m=+1332.926501706" watchObservedRunningTime="2026-03-13 15:25:02.767160562 +0000 UTC m=+1332.930372373" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.785355 4786 scope.go:117] "RemoveContainer" containerID="63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e" Mar 13 15:25:02 crc kubenswrapper[4786]: E0313 15:25:02.785916 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e\": container with ID starting with 63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e not found: ID does not exist" 
containerID="63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.785968 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e"} err="failed to get container status \"63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e\": rpc error: code = NotFound desc = could not find container \"63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e\": container with ID starting with 63ac53cf633b59972e6625f7b7c0dad754203dbde3a0a26244390ad97efc5a6e not found: ID does not exist" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.785991 4786 scope.go:117] "RemoveContainer" containerID="878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9" Mar 13 15:25:02 crc kubenswrapper[4786]: E0313 15:25:02.786247 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9\": container with ID starting with 878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9 not found: ID does not exist" containerID="878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9" Mar 13 15:25:02 crc kubenswrapper[4786]: I0313 15:25:02.786269 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9"} err="failed to get container status \"878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9\": rpc error: code = NotFound desc = could not find container \"878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9\": container with ID starting with 878d0b021bd4b6c509487aac69dd0573c3499857a04bca35fe82fc8aba46cfa9 not found: ID does not exist" Mar 13 15:25:03 crc kubenswrapper[4786]: I0313 15:25:03.743086 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerID="6eb31899fcb44ea87d84b174734072a3750561c394aa977ab3b130b0cdf09a59" exitCode=143 Mar 13 15:25:03 crc kubenswrapper[4786]: I0313 15:25:03.743158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a6781ef-43aa-4084-9e32-3667cd7a7d18","Type":"ContainerDied","Data":"6eb31899fcb44ea87d84b174734072a3750561c394aa977ab3b130b0cdf09a59"} Mar 13 15:25:03 crc kubenswrapper[4786]: I0313 15:25:03.766865 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"281f0a05-c7f3-4c5f-aad4-f953a8521233","Type":"ContainerStarted","Data":"9f0d56f2e8d225747d7a951d745bf3d3e955cfece7e60e266614992e46dd5acf"} Mar 13 15:25:03 crc kubenswrapper[4786]: I0313 15:25:03.766921 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"281f0a05-c7f3-4c5f-aad4-f953a8521233","Type":"ContainerStarted","Data":"3167d63404c6ac1fc813e053bca5f8c82d03c5e7782555575aafa93a76431795"} Mar 13 15:25:03 crc kubenswrapper[4786]: I0313 15:25:03.787362 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.974643876 podStartE2EDuration="5.787345983s" podCreationTimestamp="2026-03-13 15:24:58 +0000 UTC" firstStartedPulling="2026-03-13 15:25:01.465461906 +0000 UTC m=+1331.628673717" lastFinishedPulling="2026-03-13 15:25:02.278164013 +0000 UTC m=+1332.441375824" observedRunningTime="2026-03-13 15:25:03.783898767 +0000 UTC m=+1333.947110578" watchObservedRunningTime="2026-03-13 15:25:03.787345983 +0000 UTC m=+1333.950557794" Mar 13 15:25:04 crc kubenswrapper[4786]: I0313 15:25:04.142547 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 15:25:04 crc kubenswrapper[4786]: I0313 15:25:04.569641 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" path="/var/lib/kubelet/pods/50f3dfe4-74a4-4d75-83a2-0109a8dda909/volumes" Mar 13 15:25:05 crc kubenswrapper[4786]: I0313 15:25:05.787992 4786 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerID="39e39b70e9ae5a6af51f20dbd41d61866f1bf24a116705d8f43525abe21e7766" exitCode=0 Mar 13 15:25:05 crc kubenswrapper[4786]: I0313 15:25:05.788089 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerDied","Data":"39e39b70e9ae5a6af51f20dbd41d61866f1bf24a116705d8f43525abe21e7766"} Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.615730 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708394 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-run-httpd\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708492 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-combined-ca-bundle\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708565 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-scripts\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708617 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-config-data\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708669 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clc8z\" (UniqueName: \"kubernetes.io/projected/5fb2cca9-aeef-4bce-9307-02429aae556d-kube-api-access-clc8z\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-log-httpd\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.708838 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-sg-core-conf-yaml\") pod \"5fb2cca9-aeef-4bce-9307-02429aae556d\" (UID: \"5fb2cca9-aeef-4bce-9307-02429aae556d\") " Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.709565 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.709696 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.714928 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-scripts" (OuterVolumeSpecName: "scripts") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.715331 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb2cca9-aeef-4bce-9307-02429aae556d-kube-api-access-clc8z" (OuterVolumeSpecName: "kube-api-access-clc8z") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "kube-api-access-clc8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.735122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.798160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-config-data" (OuterVolumeSpecName: "config-data") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.804646 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fb2cca9-aeef-4bce-9307-02429aae556d" (UID: "5fb2cca9-aeef-4bce-9307-02429aae556d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.811092 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.811192 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.811205 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.811230 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc 
kubenswrapper[4786]: I0313 15:25:07.811239 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clc8z\" (UniqueName: \"kubernetes.io/projected/5fb2cca9-aeef-4bce-9307-02429aae556d-kube-api-access-clc8z\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.811248 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cca9-aeef-4bce-9307-02429aae556d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.811256 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fb2cca9-aeef-4bce-9307-02429aae556d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.829248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fb2cca9-aeef-4bce-9307-02429aae556d","Type":"ContainerDied","Data":"a556a7fdcfb5675178fd50b08207e97dd42a775d6478b4d6d77c2a3d5a52dbd1"} Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.829297 4786 scope.go:117] "RemoveContainer" containerID="6a480a63377b2a42240127977482d6c92b7eb86f63e21712328bd6c72b540ffa" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.829297 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.867405 4786 scope.go:117] "RemoveContainer" containerID="c2b0b2ac11b40e7fb06e5cffc1a3823448827504835d20bd965320283d7e4736" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.868743 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.868825 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.868887 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.869006 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.870501 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cedf575f572a8d2fa7d4acff7bfb9c6086d44a2d58bd68733b103bae3b833d49"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.870610 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://cedf575f572a8d2fa7d4acff7bfb9c6086d44a2d58bd68733b103bae3b833d49" gracePeriod=600 Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.889139 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.896697 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900400 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-central-agent" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900451 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-central-agent" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900495 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900504 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900519 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="sg-core" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900526 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="sg-core" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900547 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api-log" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900554 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api-log" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900573 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerName="dnsmasq-dns" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900580 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerName="dnsmasq-dns" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900598 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerName="init" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900606 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerName="init" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900620 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="proxy-httpd" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900629 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="proxy-httpd" Mar 13 15:25:07 crc kubenswrapper[4786]: E0313 15:25:07.900640 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-notification-agent" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900648 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-notification-agent" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900963 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-central-agent" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.900986 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e2a499a4-86b9-4122-aa89-cb5a77be5227" containerName="dnsmasq-dns" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.901002 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api-log" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.901014 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="proxy-httpd" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.901038 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="ceilometer-notification-agent" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.901051 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f3dfe4-74a4-4d75-83a2-0109a8dda909" containerName="barbican-api" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.901062 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" containerName="sg-core" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.903759 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.905236 4786 scope.go:117] "RemoveContainer" containerID="39e39b70e9ae5a6af51f20dbd41d61866f1bf24a116705d8f43525abe21e7766" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.907664 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.907932 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.908982 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:07 crc kubenswrapper[4786]: I0313 15:25:07.944127 4786 scope.go:117] "RemoveContainer" containerID="34a3de5ae0339bade5ea43cf68217e18b65117c08c10bedbf883a4d025fffaba" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.014371 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-scripts\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.014434 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drvp\" (UniqueName: \"kubernetes.io/projected/043063bf-c580-4ded-9e1f-8351b6f5f600-kube-api-access-2drvp\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.014826 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " 
pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.015028 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-run-httpd\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.015180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.015273 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-log-httpd\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.015434 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-config-data\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.117585 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-run-httpd\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.117907 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.118158 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-log-httpd\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.118264 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-config-data\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.118351 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-run-httpd\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.118367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-scripts\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.118469 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2drvp\" (UniqueName: \"kubernetes.io/projected/043063bf-c580-4ded-9e1f-8351b6f5f600-kube-api-access-2drvp\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc 
kubenswrapper[4786]: I0313 15:25:08.118718 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.119648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-log-httpd\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.124550 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.127355 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.128106 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-config-data\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.130635 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-scripts\") pod \"ceilometer-0\" (UID: 
\"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.140611 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drvp\" (UniqueName: \"kubernetes.io/projected/043063bf-c580-4ded-9e1f-8351b6f5f600-kube-api-access-2drvp\") pod \"ceilometer-0\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") " pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.241289 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.564160 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fb2cca9-aeef-4bce-9307-02429aae556d" path="/var/lib/kubelet/pods/5fb2cca9-aeef-4bce-9307-02429aae556d/volumes" Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.718788 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.837685 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerStarted","Data":"cbef2ba03ecf90a2cbc0cc92e956e9bd2d498eb92a9259ed8b7a9d94a403efeb"} Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.841178 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="cedf575f572a8d2fa7d4acff7bfb9c6086d44a2d58bd68733b103bae3b833d49" exitCode=0 Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.841210 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"cedf575f572a8d2fa7d4acff7bfb9c6086d44a2d58bd68733b103bae3b833d49"} Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.841230 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"a5ef85bd6fdd298e112745a45310d0fedfe424a8161d80698615752d010dc319"} Mar 13 15:25:08 crc kubenswrapper[4786]: I0313 15:25:08.841250 4786 scope.go:117] "RemoveContainer" containerID="a926826d2fa94d740a03a1a08b36f6e48f1ce5a8cc37acd7e0fef98af56e6473" Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.370230 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.381085 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.440359 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.510820 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-bl29c"] Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.511071 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" podUID="91915175-6b85-4031-b7ba-6da229b76766" containerName="dnsmasq-dns" containerID="cri-o://efa8ceb87e42c5af0c21bddf580fc49cf21cb5991d662bea5fc37db325d0c177" gracePeriod=10 Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.907225 4786 generic.go:334] "Generic (PLEG): container finished" podID="91915175-6b85-4031-b7ba-6da229b76766" containerID="efa8ceb87e42c5af0c21bddf580fc49cf21cb5991d662bea5fc37db325d0c177" exitCode=0 Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.907401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" 
event={"ID":"91915175-6b85-4031-b7ba-6da229b76766","Type":"ContainerDied","Data":"efa8ceb87e42c5af0c21bddf580fc49cf21cb5991d662bea5fc37db325d0c177"} Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.908128 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="cinder-scheduler" containerID="cri-o://3167d63404c6ac1fc813e053bca5f8c82d03c5e7782555575aafa93a76431795" gracePeriod=30 Mar 13 15:25:09 crc kubenswrapper[4786]: I0313 15:25:09.908561 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="probe" containerID="cri-o://9f0d56f2e8d225747d7a951d745bf3d3e955cfece7e60e266614992e46dd5acf" gracePeriod=30 Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.093528 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.162317 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-sb\") pod \"91915175-6b85-4031-b7ba-6da229b76766\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.162409 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-svc\") pod \"91915175-6b85-4031-b7ba-6da229b76766\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.162431 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-nb\") pod 
\"91915175-6b85-4031-b7ba-6da229b76766\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.162485 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9hpq\" (UniqueName: \"kubernetes.io/projected/91915175-6b85-4031-b7ba-6da229b76766-kube-api-access-n9hpq\") pod \"91915175-6b85-4031-b7ba-6da229b76766\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.162509 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-config\") pod \"91915175-6b85-4031-b7ba-6da229b76766\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.162582 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-swift-storage-0\") pod \"91915175-6b85-4031-b7ba-6da229b76766\" (UID: \"91915175-6b85-4031-b7ba-6da229b76766\") " Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.171005 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91915175-6b85-4031-b7ba-6da229b76766-kube-api-access-n9hpq" (OuterVolumeSpecName: "kube-api-access-n9hpq") pod "91915175-6b85-4031-b7ba-6da229b76766" (UID: "91915175-6b85-4031-b7ba-6da229b76766"). InnerVolumeSpecName "kube-api-access-n9hpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.207221 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "91915175-6b85-4031-b7ba-6da229b76766" (UID: "91915175-6b85-4031-b7ba-6da229b76766"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.211879 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "91915175-6b85-4031-b7ba-6da229b76766" (UID: "91915175-6b85-4031-b7ba-6da229b76766"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.215338 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-config" (OuterVolumeSpecName: "config") pod "91915175-6b85-4031-b7ba-6da229b76766" (UID: "91915175-6b85-4031-b7ba-6da229b76766"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.215781 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "91915175-6b85-4031-b7ba-6da229b76766" (UID: "91915175-6b85-4031-b7ba-6da229b76766"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.238842 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91915175-6b85-4031-b7ba-6da229b76766" (UID: "91915175-6b85-4031-b7ba-6da229b76766"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.265536 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.265575 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.265587 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9hpq\" (UniqueName: \"kubernetes.io/projected/91915175-6b85-4031-b7ba-6da229b76766-kube-api-access-n9hpq\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.265597 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.265605 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.265614 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91915175-6b85-4031-b7ba-6da229b76766-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.917883 4786 generic.go:334] "Generic (PLEG): container finished" podID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerID="9f0d56f2e8d225747d7a951d745bf3d3e955cfece7e60e266614992e46dd5acf" exitCode=0 Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.917981 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"281f0a05-c7f3-4c5f-aad4-f953a8521233","Type":"ContainerDied","Data":"9f0d56f2e8d225747d7a951d745bf3d3e955cfece7e60e266614992e46dd5acf"} Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.922823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerStarted","Data":"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f"} Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.922892 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerStarted","Data":"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de"} Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.926054 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" event={"ID":"91915175-6b85-4031-b7ba-6da229b76766","Type":"ContainerDied","Data":"cbfda679a3e4904761ee7ffc3ad75a9c682d33e272bc63f5b3a3b8fabdf97086"} Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.926110 4786 scope.go:117] "RemoveContainer" containerID="efa8ceb87e42c5af0c21bddf580fc49cf21cb5991d662bea5fc37db325d0c177" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.926264 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759cc7f497-bl29c" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.962564 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-bl29c"] Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.965094 4786 scope.go:117] "RemoveContainer" containerID="250c2eefdad7b0f67f6e4e296986c9f2f5f788bae99d2a0556226e954301a48d" Mar 13 15:25:10 crc kubenswrapper[4786]: I0313 15:25:10.975156 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759cc7f497-bl29c"] Mar 13 15:25:11 crc kubenswrapper[4786]: I0313 15:25:11.637230 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 15:25:11 crc kubenswrapper[4786]: I0313 15:25:11.939505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerStarted","Data":"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c"} Mar 13 15:25:12 crc kubenswrapper[4786]: I0313 15:25:12.562236 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91915175-6b85-4031-b7ba-6da229b76766" path="/var/lib/kubelet/pods/91915175-6b85-4031-b7ba-6da229b76766/volumes" Mar 13 15:25:13 crc kubenswrapper[4786]: I0313 15:25:13.812352 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:25:13 crc kubenswrapper[4786]: I0313 15:25:13.963717 4786 generic.go:334] "Generic (PLEG): container finished" podID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerID="3167d63404c6ac1fc813e053bca5f8c82d03c5e7782555575aafa93a76431795" exitCode=0 Mar 13 15:25:13 crc kubenswrapper[4786]: I0313 15:25:13.963774 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"281f0a05-c7f3-4c5f-aad4-f953a8521233","Type":"ContainerDied","Data":"3167d63404c6ac1fc813e053bca5f8c82d03c5e7782555575aafa93a76431795"} Mar 13 15:25:13 crc kubenswrapper[4786]: I0313 15:25:13.967625 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerStarted","Data":"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc"} Mar 13 15:25:13 crc kubenswrapper[4786]: I0313 15:25:13.967775 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 15:25:13 crc kubenswrapper[4786]: I0313 15:25:13.988603 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.106517672 podStartE2EDuration="6.988584625s" podCreationTimestamp="2026-03-13 15:25:07 +0000 UTC" firstStartedPulling="2026-03-13 15:25:08.712252203 +0000 UTC m=+1338.875464014" lastFinishedPulling="2026-03-13 15:25:13.594319166 +0000 UTC m=+1343.757530967" observedRunningTime="2026-03-13 15:25:13.986521483 +0000 UTC m=+1344.149733294" watchObservedRunningTime="2026-03-13 15:25:13.988584625 +0000 UTC m=+1344.151796436" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.010815 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.310974 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.457443 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-scripts\") pod \"281f0a05-c7f3-4c5f-aad4-f953a8521233\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.457784 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-combined-ca-bundle\") pod \"281f0a05-c7f3-4c5f-aad4-f953a8521233\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.457940 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data-custom\") pod \"281f0a05-c7f3-4c5f-aad4-f953a8521233\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.458139 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data\") pod \"281f0a05-c7f3-4c5f-aad4-f953a8521233\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.458260 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbhdr\" (UniqueName: \"kubernetes.io/projected/281f0a05-c7f3-4c5f-aad4-f953a8521233-kube-api-access-sbhdr\") pod \"281f0a05-c7f3-4c5f-aad4-f953a8521233\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.458382 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/281f0a05-c7f3-4c5f-aad4-f953a8521233-etc-machine-id\") pod \"281f0a05-c7f3-4c5f-aad4-f953a8521233\" (UID: \"281f0a05-c7f3-4c5f-aad4-f953a8521233\") " Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.459098 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/281f0a05-c7f3-4c5f-aad4-f953a8521233-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "281f0a05-c7f3-4c5f-aad4-f953a8521233" (UID: "281f0a05-c7f3-4c5f-aad4-f953a8521233"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.464885 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "281f0a05-c7f3-4c5f-aad4-f953a8521233" (UID: "281f0a05-c7f3-4c5f-aad4-f953a8521233"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.467165 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-scripts" (OuterVolumeSpecName: "scripts") pod "281f0a05-c7f3-4c5f-aad4-f953a8521233" (UID: "281f0a05-c7f3-4c5f-aad4-f953a8521233"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.473014 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281f0a05-c7f3-4c5f-aad4-f953a8521233-kube-api-access-sbhdr" (OuterVolumeSpecName: "kube-api-access-sbhdr") pod "281f0a05-c7f3-4c5f-aad4-f953a8521233" (UID: "281f0a05-c7f3-4c5f-aad4-f953a8521233"). InnerVolumeSpecName "kube-api-access-sbhdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.512402 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "281f0a05-c7f3-4c5f-aad4-f953a8521233" (UID: "281f0a05-c7f3-4c5f-aad4-f953a8521233"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.524491 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.564493 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/281f0a05-c7f3-4c5f-aad4-f953a8521233-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.564543 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.564552 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.564560 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.564570 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbhdr\" (UniqueName: \"kubernetes.io/projected/281f0a05-c7f3-4c5f-aad4-f953a8521233-kube-api-access-sbhdr\") on 
node \"crc\" DevicePath \"\"" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.592191 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data" (OuterVolumeSpecName: "config-data") pod "281f0a05-c7f3-4c5f-aad4-f953a8521233" (UID: "281f0a05-c7f3-4c5f-aad4-f953a8521233"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.666848 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281f0a05-c7f3-4c5f-aad4-f953a8521233-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.977443 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"281f0a05-c7f3-4c5f-aad4-f953a8521233","Type":"ContainerDied","Data":"d882f59342db2b2780ac86137e504474fc9d1eb058a7ab9e3be2fc2b793433e7"} Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.978673 4786 scope.go:117] "RemoveContainer" containerID="9f0d56f2e8d225747d7a951d745bf3d3e955cfece7e60e266614992e46dd5acf" Mar 13 15:25:14 crc kubenswrapper[4786]: I0313 15:25:14.977583 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.003219 4786 scope.go:117] "RemoveContainer" containerID="3167d63404c6ac1fc813e053bca5f8c82d03c5e7782555575aafa93a76431795" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.025574 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.039994 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.050828 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.058913 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:15 crc kubenswrapper[4786]: E0313 15:25:15.059339 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91915175-6b85-4031-b7ba-6da229b76766" containerName="init" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059361 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="91915175-6b85-4031-b7ba-6da229b76766" containerName="init" Mar 13 15:25:15 crc kubenswrapper[4786]: E0313 15:25:15.059381 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="probe" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059389 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="probe" Mar 13 15:25:15 crc kubenswrapper[4786]: E0313 15:25:15.059405 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91915175-6b85-4031-b7ba-6da229b76766" containerName="dnsmasq-dns" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059412 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="91915175-6b85-4031-b7ba-6da229b76766" 
containerName="dnsmasq-dns" Mar 13 15:25:15 crc kubenswrapper[4786]: E0313 15:25:15.059434 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="cinder-scheduler" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059442 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="cinder-scheduler" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059635 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="cinder-scheduler" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059656 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="91915175-6b85-4031-b7ba-6da229b76766" containerName="dnsmasq-dns" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.059677 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" containerName="probe" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.060790 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.063092 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.077079 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.082118 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.146363 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78cf8bc498-pb2xv"] Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.146641 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78cf8bc498-pb2xv" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-log" containerID="cri-o://6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3" gracePeriod=30 Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.146777 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78cf8bc498-pb2xv" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-api" containerID="cri-o://46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058" gracePeriod=30 Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.181824 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.181890 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-scripts\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.181921 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.182098 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4vh6\" (UniqueName: \"kubernetes.io/projected/82f2e6fd-58ee-4002-b167-096b3b715233-kube-api-access-n4vh6\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.182303 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.182342 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82f2e6fd-58ee-4002-b167-096b3b715233-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.283701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.283751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-scripts\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.283785 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.283878 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4vh6\" (UniqueName: \"kubernetes.io/projected/82f2e6fd-58ee-4002-b167-096b3b715233-kube-api-access-n4vh6\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.283958 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.283991 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82f2e6fd-58ee-4002-b167-096b3b715233-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " 
pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.284113 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82f2e6fd-58ee-4002-b167-096b3b715233-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.295153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-scripts\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.295328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.298794 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.306260 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.309366 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4vh6\" (UniqueName: 
\"kubernetes.io/projected/82f2e6fd-58ee-4002-b167-096b3b715233-kube-api-access-n4vh6\") pod \"cinder-scheduler-0\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.387080 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.870051 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.996957 4786 generic.go:334] "Generic (PLEG): container finished" podID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerID="6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3" exitCode=143 Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.997098 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cf8bc498-pb2xv" event={"ID":"52da1f88-a7fd-4c07-9db5-6651531c94a2","Type":"ContainerDied","Data":"6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3"} Mar 13 15:25:15 crc kubenswrapper[4786]: I0313 15:25:15.998775 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82f2e6fd-58ee-4002-b167-096b3b715233","Type":"ContainerStarted","Data":"5a6d50ecfb812efe49c91ee04726931e914a525e77cb44e45e64c17eff233e52"} Mar 13 15:25:16 crc kubenswrapper[4786]: I0313 15:25:16.563336 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281f0a05-c7f3-4c5f-aad4-f953a8521233" path="/var/lib/kubelet/pods/281f0a05-c7f3-4c5f-aad4-f953a8521233/volumes" Mar 13 15:25:17 crc kubenswrapper[4786]: I0313 15:25:17.011647 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82f2e6fd-58ee-4002-b167-096b3b715233","Type":"ContainerStarted","Data":"d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7"} Mar 13 15:25:18 crc kubenswrapper[4786]: 
I0313 15:25:18.021885 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82f2e6fd-58ee-4002-b167-096b3b715233","Type":"ContainerStarted","Data":"c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57"} Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.050927 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.050903875 podStartE2EDuration="3.050903875s" podCreationTimestamp="2026-03-13 15:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:18.046619468 +0000 UTC m=+1348.209831289" watchObservedRunningTime="2026-03-13 15:25:18.050903875 +0000 UTC m=+1348.214115686" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.771337 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cf8bc498-pb2xv" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848215 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppv5q\" (UniqueName: \"kubernetes.io/projected/52da1f88-a7fd-4c07-9db5-6651531c94a2-kube-api-access-ppv5q\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848265 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-combined-ca-bundle\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848298 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-internal-tls-certs\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848377 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-scripts\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848426 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-public-tls-certs\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848468 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52da1f88-a7fd-4c07-9db5-6651531c94a2-logs\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.848516 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-config-data\") pod \"52da1f88-a7fd-4c07-9db5-6651531c94a2\" (UID: \"52da1f88-a7fd-4c07-9db5-6651531c94a2\") " Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.849193 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52da1f88-a7fd-4c07-9db5-6651531c94a2-logs" (OuterVolumeSpecName: "logs") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.849378 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52da1f88-a7fd-4c07-9db5-6651531c94a2-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.854678 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52da1f88-a7fd-4c07-9db5-6651531c94a2-kube-api-access-ppv5q" (OuterVolumeSpecName: "kube-api-access-ppv5q") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "kube-api-access-ppv5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.857406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-scripts" (OuterVolumeSpecName: "scripts") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.884118 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6555f55d84-r89pv" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.911172 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-config-data" (OuterVolumeSpecName: "config-data") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.948564 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.955990 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.956213 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppv5q\" (UniqueName: \"kubernetes.io/projected/52da1f88-a7fd-4c07-9db5-6651531c94a2-kube-api-access-ppv5q\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.956305 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.956380 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:18 crc kubenswrapper[4786]: I0313 15:25:18.995060 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.009261 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52da1f88-a7fd-4c07-9db5-6651531c94a2" (UID: "52da1f88-a7fd-4c07-9db5-6651531c94a2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.033024 4786 generic.go:334] "Generic (PLEG): container finished" podID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerID="46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058" exitCode=0
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.033090 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cf8bc498-pb2xv"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.033125 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cf8bc498-pb2xv" event={"ID":"52da1f88-a7fd-4c07-9db5-6651531c94a2","Type":"ContainerDied","Data":"46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058"}
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.033151 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cf8bc498-pb2xv" event={"ID":"52da1f88-a7fd-4c07-9db5-6651531c94a2","Type":"ContainerDied","Data":"5ad552dae743cb431681da6d6371026673f4c330d48ac7da71d3966f92d025a8"}
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.033178 4786 scope.go:117] "RemoveContainer" containerID="46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.059996 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.060030 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52da1f88-a7fd-4c07-9db5-6651531c94a2-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.060274 4786 scope.go:117] "RemoveContainer" containerID="6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.079140 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78cf8bc498-pb2xv"]
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.088155 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78cf8bc498-pb2xv"]
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.088323 4786 scope.go:117] "RemoveContainer" containerID="46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058"
Mar 13 15:25:19 crc kubenswrapper[4786]: E0313 15:25:19.088784 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058\": container with ID starting with 46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058 not found: ID does not exist" containerID="46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.088814 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058"} err="failed to get container status \"46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058\": rpc error: code = NotFound desc = could not find container \"46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058\": container with ID starting with 46a781a951b04848141af5261d41fa97866cf79cfe99bb5a36f2c4929b556058 not found: ID does not exist"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.088838 4786 scope.go:117] "RemoveContainer" containerID="6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3"
Mar 13 15:25:19 crc kubenswrapper[4786]: E0313 15:25:19.093569 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3\": container with ID starting with 6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3 not found: ID does not exist" containerID="6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.093611 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3"} err="failed to get container status \"6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3\": rpc error: code = NotFound desc = could not find container \"6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3\": container with ID starting with 6ce91e74135d6042239d22a3080628b98de040eb1c5de98324675d84634e35e3 not found: ID does not exist"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.408002 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 13 15:25:19 crc kubenswrapper[4786]: E0313 15:25:19.408528 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-log"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.408551 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-log"
Mar 13 15:25:19 crc kubenswrapper[4786]: E0313 15:25:19.408565 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-api"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.408572 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-api"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.408784 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-log"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.408828 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" containerName="placement-api"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.409638 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.411910 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.412364 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9c6mp"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.412600 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.435366 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.569469 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.569647 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kt7\" (UniqueName: \"kubernetes.io/projected/bc33fb6e-0b09-479a-9825-3f7dfb100f37-kube-api-access-82kt7\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.569765 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.569810 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.671354 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.671466 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82kt7\" (UniqueName: \"kubernetes.io/projected/bc33fb6e-0b09-479a-9825-3f7dfb100f37-kube-api-access-82kt7\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.671536 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.671568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.673135 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.676478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.676826 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config-secret\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.689374 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kt7\" (UniqueName: \"kubernetes.io/projected/bc33fb6e-0b09-479a-9825-3f7dfb100f37-kube-api-access-82kt7\") pod \"openstackclient\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.739016 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.778156 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.778410 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-central-agent" containerID="cri-o://b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de" gracePeriod=30
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.778793 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="proxy-httpd" containerID="cri-o://a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc" gracePeriod=30
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.778847 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="sg-core" containerID="cri-o://54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c" gracePeriod=30
Mar 13 15:25:19 crc kubenswrapper[4786]: I0313 15:25:19.778972 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-notification-agent" containerID="cri-o://709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f" gracePeriod=30
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.048272 4786 generic.go:334] "Generic (PLEG): container finished" podID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerID="a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc"
exitCode=0
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.048673 4786 generic.go:334] "Generic (PLEG): container finished" podID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerID="54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c" exitCode=2
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.048751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerDied","Data":"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc"}
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.048784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerDied","Data":"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c"}
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.270013 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-58644c46cc-wt6m2"]
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.271567 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.275314 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.275482 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.275583 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.306346 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.314304 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58644c46cc-wt6m2"]
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.382919 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-etc-swift\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383011 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-internal-tls-certs\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-config-data\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-public-tls-certs\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383125 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k54n6\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-kube-api-access-k54n6\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-combined-ca-bundle\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-log-httpd\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.383194 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-run-httpd\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.387740 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 13 15:25:20 crc kubenswrapper[4786]: E0313 15:25:20.423174 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea0a108_dc5a_4700_a956_563674797beb.slice/crio-e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043063bf_c580_4ded_9e1f_8351b6f5f600.slice/crio-b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043063bf_c580_4ded_9e1f_8351b6f5f600.slice/crio-conmon-709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043063bf_c580_4ded_9e1f_8351b6f5f600.slice/crio-conmon-b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea0a108_dc5a_4700_a956_563674797beb.slice/crio-conmon-e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043063bf_c580_4ded_9e1f_8351b6f5f600.slice/crio-709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484673 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-internal-tls-certs\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-config-data\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484786 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-public-tls-certs\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k54n6\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-kube-api-access-k54n6\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484871 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-combined-ca-bundle\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484915 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-log-httpd\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.484938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-run-httpd\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.485002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-etc-swift\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.486620 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-run-httpd\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.486900 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-log-httpd\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.491978 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-etc-swift\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.503626 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-internal-tls-certs\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.504336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-public-tls-certs\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.505782 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-config-data\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.505963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k54n6\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-kube-api-access-k54n6\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.507722 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-combined-ca-bundle\") pod \"swift-proxy-58644c46cc-wt6m2\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.606576 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58644c46cc-wt6m2"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.618068 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52da1f88-a7fd-4c07-9db5-6651531c94a2" path="/var/lib/kubelet/pods/52da1f88-a7fd-4c07-9db5-6651531c94a2/volumes"
Mar 13 15:25:20 crc kubenswrapper[4786]: I0313 15:25:20.868555 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.024695 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-log-httpd\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.024763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-sg-core-conf-yaml\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.024818 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2drvp\" (UniqueName: \"kubernetes.io/projected/043063bf-c580-4ded-9e1f-8351b6f5f600-kube-api-access-2drvp\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.024850 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-combined-ca-bundle\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.025011 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-scripts\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.025098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-run-httpd\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.025123 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-config-data\") pod \"043063bf-c580-4ded-9e1f-8351b6f5f600\" (UID: \"043063bf-c580-4ded-9e1f-8351b6f5f600\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.026875 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.027178 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.040304 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043063bf-c580-4ded-9e1f-8351b6f5f600-kube-api-access-2drvp" (OuterVolumeSpecName: "kube-api-access-2drvp") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "kube-api-access-2drvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.051342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-scripts" (OuterVolumeSpecName: "scripts") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.110518 4786 generic.go:334] "Generic (PLEG): container finished" podID="dea0a108-dc5a-4700-a956-563674797beb" containerID="e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd" exitCode=137
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.110590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679bcd6995-7xnv6" event={"ID":"dea0a108-dc5a-4700-a956-563674797beb","Type":"ContainerDied","Data":"e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd"}
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.127105 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bc33fb6e-0b09-479a-9825-3f7dfb100f37","Type":"ContainerStarted","Data":"9b55e41466f1dbf69b4cae8fe2a29a2ea3b1d6575e1f287a28802007d8eb9929"}
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.128270 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.128299 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2drvp\" (UniqueName: \"kubernetes.io/projected/043063bf-c580-4ded-9e1f-8351b6f5f600-kube-api-access-2drvp\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.128310 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.128318 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/043063bf-c580-4ded-9e1f-8351b6f5f600-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.137158 4786 generic.go:334] "Generic (PLEG): container finished" podID="1063e48c-fed7-49b9-89f2-186b4627caea" containerID="60a51f9d609a82f2f63e61bdebf44e301bc76385b779ef4d3c4e833b60346e07" exitCode=137
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.137267 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" event={"ID":"1063e48c-fed7-49b9-89f2-186b4627caea","Type":"ContainerDied","Data":"60a51f9d609a82f2f63e61bdebf44e301bc76385b779ef4d3c4e833b60346e07"}
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.141983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.145484 4786 generic.go:334] "Generic (PLEG): container finished" podID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerID="709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f" exitCode=0
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.145580 4786 generic.go:334] "Generic (PLEG): container finished" podID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerID="b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de" exitCode=0
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.145654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerDied","Data":"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f"}
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.145730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerDied","Data":"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de"}
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.145808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"043063bf-c580-4ded-9e1f-8351b6f5f600","Type":"ContainerDied","Data":"cbef2ba03ecf90a2cbc0cc92e956e9bd2d498eb92a9259ed8b7a9d94a403efeb"}
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.145903 4786 scope.go:117] "RemoveContainer" containerID="a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc"
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.146169 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.166129 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679bcd6995-7xnv6"
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.186139 4786 scope.go:117] "RemoveContainer" containerID="54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c"
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.192485 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp"
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.195969 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.229815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data-custom\") pod \"dea0a108-dc5a-4700-a956-563674797beb\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.230027 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-combined-ca-bundle\") pod \"dea0a108-dc5a-4700-a956-563674797beb\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.230723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data\") pod \"dea0a108-dc5a-4700-a956-563674797beb\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.230815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdx8f\" (UniqueName: \"kubernetes.io/projected/dea0a108-dc5a-4700-a956-563674797beb-kube-api-access-pdx8f\") pod \"dea0a108-dc5a-4700-a956-563674797beb\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.230838 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea0a108-dc5a-4700-a956-563674797beb-logs\") pod \"dea0a108-dc5a-4700-a956-563674797beb\" (UID: \"dea0a108-dc5a-4700-a956-563674797beb\") "
Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.231338 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName:
\"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.231355 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.232101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea0a108-dc5a-4700-a956-563674797beb-logs" (OuterVolumeSpecName: "logs") pod "dea0a108-dc5a-4700-a956-563674797beb" (UID: "dea0a108-dc5a-4700-a956-563674797beb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.232144 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-config-data" (OuterVolumeSpecName: "config-data") pod "043063bf-c580-4ded-9e1f-8351b6f5f600" (UID: "043063bf-c580-4ded-9e1f-8351b6f5f600"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.234234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea0a108-dc5a-4700-a956-563674797beb-kube-api-access-pdx8f" (OuterVolumeSpecName: "kube-api-access-pdx8f") pod "dea0a108-dc5a-4700-a956-563674797beb" (UID: "dea0a108-dc5a-4700-a956-563674797beb"). InnerVolumeSpecName "kube-api-access-pdx8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.234443 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dea0a108-dc5a-4700-a956-563674797beb" (UID: "dea0a108-dc5a-4700-a956-563674797beb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.240140 4786 scope.go:117] "RemoveContainer" containerID="709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.261754 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dea0a108-dc5a-4700-a956-563674797beb" (UID: "dea0a108-dc5a-4700-a956-563674797beb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.274975 4786 scope.go:117] "RemoveContainer" containerID="b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.281284 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data" (OuterVolumeSpecName: "config-data") pod "dea0a108-dc5a-4700-a956-563674797beb" (UID: "dea0a108-dc5a-4700-a956-563674797beb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.300384 4786 scope.go:117] "RemoveContainer" containerID="a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.300771 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc\": container with ID starting with a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc not found: ID does not exist" containerID="a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.300893 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc"} err="failed to get container status \"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc\": rpc error: code = NotFound desc = could not find container \"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc\": container with ID starting with a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.300970 4786 scope.go:117] "RemoveContainer" containerID="54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.301988 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c\": container with ID starting with 54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c not found: ID does not exist" containerID="54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.302090 
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c"} err="failed to get container status \"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c\": rpc error: code = NotFound desc = could not find container \"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c\": container with ID starting with 54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.302177 4786 scope.go:117] "RemoveContainer" containerID="709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.302782 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f\": container with ID starting with 709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f not found: ID does not exist" containerID="709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.302892 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f"} err="failed to get container status \"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f\": rpc error: code = NotFound desc = could not find container \"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f\": container with ID starting with 709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.302959 4786 scope.go:117] "RemoveContainer" containerID="b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 
15:25:21.303406 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de\": container with ID starting with b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de not found: ID does not exist" containerID="b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.303509 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de"} err="failed to get container status \"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de\": rpc error: code = NotFound desc = could not find container \"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de\": container with ID starting with b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.303580 4786 scope.go:117] "RemoveContainer" containerID="a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.304228 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc"} err="failed to get container status \"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc\": rpc error: code = NotFound desc = could not find container \"a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc\": container with ID starting with a2c9a9c471fb94a7443259660f781af6e0b4311ab8840aa432dbec92886402bc not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.304313 4786 scope.go:117] "RemoveContainer" containerID="54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c" Mar 13 15:25:21 crc 
kubenswrapper[4786]: I0313 15:25:21.304625 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c"} err="failed to get container status \"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c\": rpc error: code = NotFound desc = could not find container \"54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c\": container with ID starting with 54aee9eb64b3b6247c33367eb978c7170ab8f48c6b606f4aa0fcc62f538bc66c not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.304732 4786 scope.go:117] "RemoveContainer" containerID="709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.305262 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f"} err="failed to get container status \"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f\": rpc error: code = NotFound desc = could not find container \"709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f\": container with ID starting with 709bc9c7bfc0d506a1dde5e9ba6fe928277b6b1f9f47df0c372b82593a8d075f not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.305339 4786 scope.go:117] "RemoveContainer" containerID="b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.305778 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de"} err="failed to get container status \"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de\": rpc error: code = NotFound desc = could not find container \"b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de\": container 
with ID starting with b3fe6c3204ee18465e1e43fea0142d296449c449809445b35701e38c223927de not found: ID does not exist" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.332259 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwl5h\" (UniqueName: \"kubernetes.io/projected/1063e48c-fed7-49b9-89f2-186b4627caea-kube-api-access-bwl5h\") pod \"1063e48c-fed7-49b9-89f2-186b4627caea\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.332695 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data\") pod \"1063e48c-fed7-49b9-89f2-186b4627caea\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.333455 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data-custom\") pod \"1063e48c-fed7-49b9-89f2-186b4627caea\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.334027 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-combined-ca-bundle\") pod \"1063e48c-fed7-49b9-89f2-186b4627caea\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.334248 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1063e48c-fed7-49b9-89f2-186b4627caea-logs\") pod \"1063e48c-fed7-49b9-89f2-186b4627caea\" (UID: \"1063e48c-fed7-49b9-89f2-186b4627caea\") " Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.336011 4786 reconciler_common.go:293] "Volume detached for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.336121 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.336221 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea0a108-dc5a-4700-a956-563674797beb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.336283 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043063bf-c580-4ded-9e1f-8351b6f5f600-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.336373 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdx8f\" (UniqueName: \"kubernetes.io/projected/dea0a108-dc5a-4700-a956-563674797beb-kube-api-access-pdx8f\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.336714 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dea0a108-dc5a-4700-a956-563674797beb-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.337131 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1063e48c-fed7-49b9-89f2-186b4627caea-logs" (OuterVolumeSpecName: "logs") pod "1063e48c-fed7-49b9-89f2-186b4627caea" (UID: "1063e48c-fed7-49b9-89f2-186b4627caea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.339249 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1063e48c-fed7-49b9-89f2-186b4627caea-kube-api-access-bwl5h" (OuterVolumeSpecName: "kube-api-access-bwl5h") pod "1063e48c-fed7-49b9-89f2-186b4627caea" (UID: "1063e48c-fed7-49b9-89f2-186b4627caea"). InnerVolumeSpecName "kube-api-access-bwl5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.339768 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1063e48c-fed7-49b9-89f2-186b4627caea" (UID: "1063e48c-fed7-49b9-89f2-186b4627caea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.365667 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1063e48c-fed7-49b9-89f2-186b4627caea" (UID: "1063e48c-fed7-49b9-89f2-186b4627caea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.387326 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data" (OuterVolumeSpecName: "config-data") pod "1063e48c-fed7-49b9-89f2-186b4627caea" (UID: "1063e48c-fed7-49b9-89f2-186b4627caea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.438146 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1063e48c-fed7-49b9-89f2-186b4627caea-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.438176 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwl5h\" (UniqueName: \"kubernetes.io/projected/1063e48c-fed7-49b9-89f2-186b4627caea-kube-api-access-bwl5h\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.438189 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.438198 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.438206 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1063e48c-fed7-49b9-89f2-186b4627caea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.462742 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58644c46cc-wt6m2"] Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.707009 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.716969 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.736934 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 
15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737657 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker-log" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737680 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker-log" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737700 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737707 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737725 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="proxy-httpd" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737732 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="proxy-httpd" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737753 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener-log" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737762 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener-log" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737781 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-notification-agent" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737788 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-notification-agent" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737799 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-central-agent" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737806 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-central-agent" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737818 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="sg-core" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737825 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="sg-core" Mar 13 15:25:21 crc kubenswrapper[4786]: E0313 15:25:21.737837 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.737843 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738072 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738091 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" containerName="barbican-keystone-listener-log" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738105 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738112 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-central-agent" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738123 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="proxy-httpd" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738137 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea0a108-dc5a-4700-a956-563674797beb" containerName="barbican-worker-log" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738151 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="sg-core" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.738163 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" containerName="ceilometer-notification-agent" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.740159 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.745421 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.745682 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.759192 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.844516 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dlh7\" (UniqueName: \"kubernetes.io/projected/12c56e2a-2724-4bad-ad76-c152ab597606-kube-api-access-8dlh7\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.844589 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-config-data\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.844646 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.844773 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-scripts\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " 
pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.844955 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-run-httpd\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.844985 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-log-httpd\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.845021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.946787 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dlh7\" (UniqueName: \"kubernetes.io/projected/12c56e2a-2724-4bad-ad76-c152ab597606-kube-api-access-8dlh7\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.947282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-config-data\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.947331 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.947370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-scripts\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.947423 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-run-httpd\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.947441 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-log-httpd\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.947466 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.948838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-run-httpd\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: 
I0313 15:25:21.949235 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-log-httpd\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.951723 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.954142 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.954221 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-scripts\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.955755 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-config-data\") pod \"ceilometer-0\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:21 crc kubenswrapper[4786]: I0313 15:25:21.968675 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dlh7\" (UniqueName: \"kubernetes.io/projected/12c56e2a-2724-4bad-ad76-c152ab597606-kube-api-access-8dlh7\") pod \"ceilometer-0\" (UID: 
\"12c56e2a-2724-4bad-ad76-c152ab597606\") " pod="openstack/ceilometer-0" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.070826 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.158623 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58644c46cc-wt6m2" event={"ID":"3b415188-88f4-447e-a1e9-ca424047ee8e","Type":"ContainerStarted","Data":"04eabc29555d142e746eeaf8979b97c2eb9926d8687b2f8cbceebf6652f56b8c"} Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.158662 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58644c46cc-wt6m2" event={"ID":"3b415188-88f4-447e-a1e9-ca424047ee8e","Type":"ContainerStarted","Data":"53da64bef97437f332bd54aa2f803a2a48a781201e425bdc912c1c54d853dc83"} Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.158673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58644c46cc-wt6m2" event={"ID":"3b415188-88f4-447e-a1e9-ca424047ee8e","Type":"ContainerStarted","Data":"363cdc4ab6757aea8542ba65a07dc6c93649f3ff64028068f1a81e799bc90fc5"} Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.158702 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58644c46cc-wt6m2" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.158717 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58644c46cc-wt6m2" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.163034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" event={"ID":"1063e48c-fed7-49b9-89f2-186b4627caea","Type":"ContainerDied","Data":"6ebb37d204b7d57efc1a565452a94fc943aa32d22e889c150727ef28f081ad31"} Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.163076 4786 scope.go:117] "RemoveContainer" 
containerID="60a51f9d609a82f2f63e61bdebf44e301bc76385b779ef4d3c4e833b60346e07" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.163109 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5cd697874d-c2gtp" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.166014 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-679bcd6995-7xnv6" event={"ID":"dea0a108-dc5a-4700-a956-563674797beb","Type":"ContainerDied","Data":"576b4bea50e3b02bb64d7f78f21b6343aee734fed3bab91b49757abb0935b2bf"} Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.166080 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-679bcd6995-7xnv6" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.177109 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-58644c46cc-wt6m2" podStartSLOduration=2.177093171 podStartE2EDuration="2.177093171s" podCreationTimestamp="2026-03-13 15:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:22.175525802 +0000 UTC m=+1352.338737613" watchObservedRunningTime="2026-03-13 15:25:22.177093171 +0000 UTC m=+1352.340304982" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.200027 4786 scope.go:117] "RemoveContainer" containerID="a4f3f4569c5f0cc8dfdcbbd90eb9b6d5c153327bfd23daa7ca118edda5baf004" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.227081 4786 scope.go:117] "RemoveContainer" containerID="e1b6010fd0cac4c86214670d608b7906e9b44204e2d697e0a76ab25d512038bd" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.256972 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-679bcd6995-7xnv6"] Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.266437 4786 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-worker-679bcd6995-7xnv6"] Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.275188 4786 scope.go:117] "RemoveContainer" containerID="bdb0179cb2a6bc6617c488f820d28e7e3a40df645ba79cc6d779129a882afa34" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.276305 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5cd697874d-c2gtp"] Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.287404 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5cd697874d-c2gtp"] Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.564430 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043063bf-c580-4ded-9e1f-8351b6f5f600" path="/var/lib/kubelet/pods/043063bf-c580-4ded-9e1f-8351b6f5f600/volumes" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.565635 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1063e48c-fed7-49b9-89f2-186b4627caea" path="/var/lib/kubelet/pods/1063e48c-fed7-49b9-89f2-186b4627caea/volumes" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.566961 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea0a108-dc5a-4700-a956-563674797beb" path="/var/lib/kubelet/pods/dea0a108-dc5a-4700-a956-563674797beb/volumes" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.631894 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:22 crc kubenswrapper[4786]: W0313 15:25:22.637234 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12c56e2a_2724_4bad_ad76_c152ab597606.slice/crio-248a54ba5585514eb2cb0d2cce9253a7ad9797c015a2497d999c536662872371 WatchSource:0}: Error finding container 248a54ba5585514eb2cb0d2cce9253a7ad9797c015a2497d999c536662872371: Status 404 returned error can't find the container with id 
248a54ba5585514eb2cb0d2cce9253a7ad9797c015a2497d999c536662872371 Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.726539 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bd4f4c6c-zg28d" Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.801356 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6555f55d84-r89pv"] Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.801584 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6555f55d84-r89pv" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-api" containerID="cri-o://de3445a1fdb5e7bffe16b21eca305829708edd7a5fbf7c071f4f2f3e27ebbf3c" gracePeriod=30 Mar 13 15:25:22 crc kubenswrapper[4786]: I0313 15:25:22.801978 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6555f55d84-r89pv" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-httpd" containerID="cri-o://a1ebd20687d6a618027792e71e4c3c2956017515d7c8b4635255e2dd1bd90cea" gracePeriod=30 Mar 13 15:25:23 crc kubenswrapper[4786]: I0313 15:25:23.188256 4786 generic.go:334] "Generic (PLEG): container finished" podID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerID="a1ebd20687d6a618027792e71e4c3c2956017515d7c8b4635255e2dd1bd90cea" exitCode=0 Mar 13 15:25:23 crc kubenswrapper[4786]: I0313 15:25:23.188331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6555f55d84-r89pv" event={"ID":"20b9e933-6f29-489e-92df-ac8ed12ae33d","Type":"ContainerDied","Data":"a1ebd20687d6a618027792e71e4c3c2956017515d7c8b4635255e2dd1bd90cea"} Mar 13 15:25:23 crc kubenswrapper[4786]: I0313 15:25:23.190938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerStarted","Data":"248a54ba5585514eb2cb0d2cce9253a7ad9797c015a2497d999c536662872371"} Mar 13 15:25:24 crc 
kubenswrapper[4786]: I0313 15:25:24.202880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerStarted","Data":"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245"} Mar 13 15:25:24 crc kubenswrapper[4786]: I0313 15:25:24.203214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerStarted","Data":"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c"} Mar 13 15:25:25 crc kubenswrapper[4786]: I0313 15:25:25.212502 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerStarted","Data":"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357"} Mar 13 15:25:25 crc kubenswrapper[4786]: I0313 15:25:25.610174 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 15:25:27 crc kubenswrapper[4786]: I0313 15:25:27.245435 4786 generic.go:334] "Generic (PLEG): container finished" podID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerID="de3445a1fdb5e7bffe16b21eca305829708edd7a5fbf7c071f4f2f3e27ebbf3c" exitCode=0 Mar 13 15:25:27 crc kubenswrapper[4786]: I0313 15:25:27.245529 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6555f55d84-r89pv" event={"ID":"20b9e933-6f29-489e-92df-ac8ed12ae33d","Type":"ContainerDied","Data":"de3445a1fdb5e7bffe16b21eca305829708edd7a5fbf7c071f4f2f3e27ebbf3c"} Mar 13 15:25:27 crc kubenswrapper[4786]: I0313 15:25:27.437792 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:30 crc kubenswrapper[4786]: I0313 15:25:30.612140 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58644c46cc-wt6m2" Mar 13 15:25:30 crc kubenswrapper[4786]: I0313 
15:25:30.614256 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58644c46cc-wt6m2" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.559416 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6555f55d84-r89pv" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.645038 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b8jj\" (UniqueName: \"kubernetes.io/projected/20b9e933-6f29-489e-92df-ac8ed12ae33d-kube-api-access-8b8jj\") pod \"20b9e933-6f29-489e-92df-ac8ed12ae33d\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.645201 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-httpd-config\") pod \"20b9e933-6f29-489e-92df-ac8ed12ae33d\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.645260 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-combined-ca-bundle\") pod \"20b9e933-6f29-489e-92df-ac8ed12ae33d\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.645288 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-ovndb-tls-certs\") pod \"20b9e933-6f29-489e-92df-ac8ed12ae33d\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.645366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-config\") pod 
\"20b9e933-6f29-489e-92df-ac8ed12ae33d\" (UID: \"20b9e933-6f29-489e-92df-ac8ed12ae33d\") " Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.651576 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "20b9e933-6f29-489e-92df-ac8ed12ae33d" (UID: "20b9e933-6f29-489e-92df-ac8ed12ae33d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.654838 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b9e933-6f29-489e-92df-ac8ed12ae33d-kube-api-access-8b8jj" (OuterVolumeSpecName: "kube-api-access-8b8jj") pod "20b9e933-6f29-489e-92df-ac8ed12ae33d" (UID: "20b9e933-6f29-489e-92df-ac8ed12ae33d"). InnerVolumeSpecName "kube-api-access-8b8jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.704923 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20b9e933-6f29-489e-92df-ac8ed12ae33d" (UID: "20b9e933-6f29-489e-92df-ac8ed12ae33d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.715770 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-config" (OuterVolumeSpecName: "config") pod "20b9e933-6f29-489e-92df-ac8ed12ae33d" (UID: "20b9e933-6f29-489e-92df-ac8ed12ae33d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.731881 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "20b9e933-6f29-489e-92df-ac8ed12ae33d" (UID: "20b9e933-6f29-489e-92df-ac8ed12ae33d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.748040 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.748089 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.748106 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.748118 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/20b9e933-6f29-489e-92df-ac8ed12ae33d-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:31 crc kubenswrapper[4786]: I0313 15:25:31.748135 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b8jj\" (UniqueName: \"kubernetes.io/projected/20b9e933-6f29-489e-92df-ac8ed12ae33d-kube-api-access-8b8jj\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.292997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerStarted","Data":"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232"} Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.293156 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-central-agent" containerID="cri-o://f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" gracePeriod=30 Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.293182 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="proxy-httpd" containerID="cri-o://b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" gracePeriod=30 Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.293299 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-notification-agent" containerID="cri-o://fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" gracePeriod=30 Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.293429 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.293216 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="sg-core" containerID="cri-o://70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" gracePeriod=30 Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.294868 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"bc33fb6e-0b09-479a-9825-3f7dfb100f37","Type":"ContainerStarted","Data":"5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97"} Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.298971 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6555f55d84-r89pv" event={"ID":"20b9e933-6f29-489e-92df-ac8ed12ae33d","Type":"ContainerDied","Data":"ba78f3604cd999b75741463b390584086cde93602d009ffb6a7133fbe710ffe6"} Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.299037 4786 scope.go:117] "RemoveContainer" containerID="a1ebd20687d6a618027792e71e4c3c2956017515d7c8b4635255e2dd1bd90cea" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.299035 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6555f55d84-r89pv" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.327198 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.682547123 podStartE2EDuration="11.32718329s" podCreationTimestamp="2026-03-13 15:25:21 +0000 UTC" firstStartedPulling="2026-03-13 15:25:22.642081568 +0000 UTC m=+1352.805293379" lastFinishedPulling="2026-03-13 15:25:31.286717735 +0000 UTC m=+1361.449929546" observedRunningTime="2026-03-13 15:25:32.326072652 +0000 UTC m=+1362.489284483" watchObservedRunningTime="2026-03-13 15:25:32.32718329 +0000 UTC m=+1362.490395101" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.331483 4786 scope.go:117] "RemoveContainer" containerID="de3445a1fdb5e7bffe16b21eca305829708edd7a5fbf7c071f4f2f3e27ebbf3c" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.348300 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6555f55d84-r89pv"] Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.360971 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6555f55d84-r89pv"] Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.369469 
4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.418662164 podStartE2EDuration="13.369452238s" podCreationTimestamp="2026-03-13 15:25:19 +0000 UTC" firstStartedPulling="2026-03-13 15:25:20.350149624 +0000 UTC m=+1350.513361435" lastFinishedPulling="2026-03-13 15:25:31.300939708 +0000 UTC m=+1361.464151509" observedRunningTime="2026-03-13 15:25:32.363760686 +0000 UTC m=+1362.526972497" watchObservedRunningTime="2026-03-13 15:25:32.369452238 +0000 UTC m=+1362.532664049" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.562929 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" path="/var/lib/kubelet/pods/20b9e933-6f29-489e-92df-ac8ed12ae33d/volumes" Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.960030 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.960528 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-log" containerID="cri-o://7bfe1d57d354079630c40a517042fed84f22086574bff47d637020a997e365c0" gracePeriod=30 Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.960653 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-httpd" containerID="cri-o://0399f2db3f72283f5c6cc3ac379a8523fcbcc9d2bccb605cb969c082a8cacc8f" gracePeriod=30 Mar 13 15:25:32 crc kubenswrapper[4786]: I0313 15:25:32.989391 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dlh7\" (UniqueName: \"kubernetes.io/projected/12c56e2a-2724-4bad-ad76-c152ab597606-kube-api-access-8dlh7\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074490 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-run-httpd\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074600 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-config-data\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074621 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-scripts\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-combined-ca-bundle\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074725 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-sg-core-conf-yaml\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.074791 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-log-httpd\") pod \"12c56e2a-2724-4bad-ad76-c152ab597606\" (UID: \"12c56e2a-2724-4bad-ad76-c152ab597606\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.075674 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.077361 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.084724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c56e2a-2724-4bad-ad76-c152ab597606-kube-api-access-8dlh7" (OuterVolumeSpecName: "kube-api-access-8dlh7") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). InnerVolumeSpecName "kube-api-access-8dlh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.094756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-scripts" (OuterVolumeSpecName: "scripts") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.113268 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.161588 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.176962 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.176996 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.177005 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.177017 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.177026 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12c56e2a-2724-4bad-ad76-c152ab597606-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.177034 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dlh7\" (UniqueName: \"kubernetes.io/projected/12c56e2a-2724-4bad-ad76-c152ab597606-kube-api-access-8dlh7\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.246111 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-config-data" (OuterVolumeSpecName: "config-data") pod "12c56e2a-2724-4bad-ad76-c152ab597606" (UID: "12c56e2a-2724-4bad-ad76-c152ab597606"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.278464 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12c56e2a-2724-4bad-ad76-c152ab597606-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.311022 4786 generic.go:334] "Generic (PLEG): container finished" podID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerID="7bfe1d57d354079630c40a517042fed84f22086574bff47d637020a997e365c0" exitCode=143 Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.311101 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eeb9deff-99f7-4425-a84c-a520d67430ed","Type":"ContainerDied","Data":"7bfe1d57d354079630c40a517042fed84f22086574bff47d637020a997e365c0"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.313233 4786 generic.go:334] "Generic (PLEG): container finished" podID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerID="149ae29761fcea916eb11613dc8da76bbebea576e3bc59e445903d136869a7db" exitCode=137 Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.313281 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a6781ef-43aa-4084-9e32-3667cd7a7d18","Type":"ContainerDied","Data":"149ae29761fcea916eb11613dc8da76bbebea576e3bc59e445903d136869a7db"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.313300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1a6781ef-43aa-4084-9e32-3667cd7a7d18","Type":"ContainerDied","Data":"9c7aa0b5317bfe1c96e01132591e36aa4ee6accaf8bff15868ccea3e0e637eaf"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.313310 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c7aa0b5317bfe1c96e01132591e36aa4ee6accaf8bff15868ccea3e0e637eaf" Mar 13 
15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.317921 4786 generic.go:334] "Generic (PLEG): container finished" podID="12c56e2a-2724-4bad-ad76-c152ab597606" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" exitCode=0 Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.317976 4786 generic.go:334] "Generic (PLEG): container finished" podID="12c56e2a-2724-4bad-ad76-c152ab597606" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" exitCode=2 Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.317978 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerDied","Data":"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.317996 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.318051 4786 scope.go:117] "RemoveContainer" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.318037 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerDied","Data":"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.318161 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerDied","Data":"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.317989 4786 generic.go:334] "Generic (PLEG): container finished" podID="12c56e2a-2724-4bad-ad76-c152ab597606" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" 
exitCode=0 Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.318289 4786 generic.go:334] "Generic (PLEG): container finished" podID="12c56e2a-2724-4bad-ad76-c152ab597606" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" exitCode=0 Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.318343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerDied","Data":"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.318366 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12c56e2a-2724-4bad-ad76-c152ab597606","Type":"ContainerDied","Data":"248a54ba5585514eb2cb0d2cce9253a7ad9797c015a2497d999c536662872371"} Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.336918 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.359230 4786 scope.go:117] "RemoveContainer" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.359584 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.377989 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393073 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393390 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-httpd" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393401 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-httpd" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393413 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="proxy-httpd" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393421 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="proxy-httpd" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393433 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-notification-agent" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393440 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-notification-agent" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393448 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393454 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393466 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-central-agent" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393472 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-central-agent" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393479 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api-log" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393485 4786 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api-log" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393493 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="sg-core" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393498 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="sg-core" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.393514 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-api" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393519 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-api" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393663 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-central-agent" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393673 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393683 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="proxy-httpd" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393689 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-httpd" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393700 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" containerName="cinder-api-log" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393716 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="ceilometer-notification-agent" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393726 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" containerName="sg-core" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.393736 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b9e933-6f29-489e-92df-ac8ed12ae33d" containerName="neutron-api" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.395452 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.403293 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.403461 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.403544 4786 scope.go:117] "RemoveContainer" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.423994 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.446881 4786 scope.go:117] "RemoveContainer" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.481705 4786 scope.go:117] "RemoveContainer" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.486041 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": container with ID starting with 
b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232 not found: ID does not exist" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.486095 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232"} err="failed to get container status \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": rpc error: code = NotFound desc = could not find container \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": container with ID starting with b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.486128 4786 scope.go:117] "RemoveContainer" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.487324 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": container with ID starting with 70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357 not found: ID does not exist" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.487374 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357"} err="failed to get container status \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": rpc error: code = NotFound desc = could not find container \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": container with ID starting with 70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357 not found: ID does not 
exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.487407 4786 scope.go:117] "RemoveContainer" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.496507 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": container with ID starting with fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245 not found: ID does not exist" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.496560 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245"} err="failed to get container status \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": rpc error: code = NotFound desc = could not find container \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": container with ID starting with fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.496594 4786 scope.go:117] "RemoveContainer" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" Mar 13 15:25:33 crc kubenswrapper[4786]: E0313 15:25:33.497556 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": container with ID starting with f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c not found: ID does not exist" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.497584 4786 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c"} err="failed to get container status \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": rpc error: code = NotFound desc = could not find container \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": container with ID starting with f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.497605 4786 scope.go:117] "RemoveContainer" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.498980 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232"} err="failed to get container status \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": rpc error: code = NotFound desc = could not find container \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": container with ID starting with b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.499020 4786 scope.go:117] "RemoveContainer" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.499940 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357"} err="failed to get container status \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": rpc error: code = NotFound desc = could not find container \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": container with ID starting with 
70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.499977 4786 scope.go:117] "RemoveContainer" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.500265 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245"} err="failed to get container status \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": rpc error: code = NotFound desc = could not find container \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": container with ID starting with fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.500290 4786 scope.go:117] "RemoveContainer" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.500602 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c"} err="failed to get container status \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": rpc error: code = NotFound desc = could not find container \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": container with ID starting with f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.500663 4786 scope.go:117] "RemoveContainer" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.500977 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232"} err="failed to get container status \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": rpc error: code = NotFound desc = could not find container \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": container with ID starting with b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.501001 4786 scope.go:117] "RemoveContainer" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.502159 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357"} err="failed to get container status \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": rpc error: code = NotFound desc = could not find container \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": container with ID starting with 70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.502180 4786 scope.go:117] "RemoveContainer" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.502348 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245"} err="failed to get container status \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": rpc error: code = NotFound desc = could not find container \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": container with ID starting with fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245 not found: ID does not 
exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.502367 4786 scope.go:117] "RemoveContainer" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.502585 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c"} err="failed to get container status \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": rpc error: code = NotFound desc = could not find container \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": container with ID starting with f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.502624 4786 scope.go:117] "RemoveContainer" containerID="b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.503525 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232"} err="failed to get container status \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": rpc error: code = NotFound desc = could not find container \"b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232\": container with ID starting with b1c62f6a27fc3e1cfc254862108816bbe252921c1086c7ac255b7522d50be232 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.503549 4786 scope.go:117] "RemoveContainer" containerID="70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.503744 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357"} err="failed to get container status 
\"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": rpc error: code = NotFound desc = could not find container \"70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357\": container with ID starting with 70fc5375f284af5e53f870261d5562b644cd56be98ed3c05bb24050bc9fb0357 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.503766 4786 scope.go:117] "RemoveContainer" containerID="fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.504142 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245"} err="failed to get container status \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": rpc error: code = NotFound desc = could not find container \"fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245\": container with ID starting with fd725da5ea05ceca880e43c00ae1cca555873fb53009e6afd1f1ab156c972245 not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.504162 4786 scope.go:117] "RemoveContainer" containerID="f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.504339 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c"} err="failed to get container status \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": rpc error: code = NotFound desc = could not find container \"f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c\": container with ID starting with f51422c4812b5fba62a2846ee488741aee2b9d4988fd83bde9f9e8098a91738c not found: ID does not exist" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.505896 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a6781ef-43aa-4084-9e32-3667cd7a7d18-etc-machine-id\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.505969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.505989 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data-custom\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506112 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-scripts\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506176 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6781ef-43aa-4084-9e32-3667cd7a7d18-logs\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506258 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rllfj\" (UniqueName: \"kubernetes.io/projected/1a6781ef-43aa-4084-9e32-3667cd7a7d18-kube-api-access-rllfj\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc 
kubenswrapper[4786]: I0313 15:25:33.506283 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-combined-ca-bundle\") pod \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\" (UID: \"1a6781ef-43aa-4084-9e32-3667cd7a7d18\") " Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a6781ef-43aa-4084-9e32-3667cd7a7d18-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506511 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-log-httpd\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506590 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506626 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-config-data\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506691 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtfp\" (UniqueName: \"kubernetes.io/projected/85f784cf-ed1f-4734-bc01-ebdc55752f2c-kube-api-access-9qtfp\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506734 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-scripts\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506752 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506768 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-run-httpd\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.506827 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a6781ef-43aa-4084-9e32-3667cd7a7d18-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.507992 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6781ef-43aa-4084-9e32-3667cd7a7d18-logs" (OuterVolumeSpecName: "logs") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.510327 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.511143 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6781ef-43aa-4084-9e32-3667cd7a7d18-kube-api-access-rllfj" (OuterVolumeSpecName: "kube-api-access-rllfj") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). InnerVolumeSpecName "kube-api-access-rllfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.521003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-scripts" (OuterVolumeSpecName: "scripts") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.544686 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.560970 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data" (OuterVolumeSpecName: "config-data") pod "1a6781ef-43aa-4084-9e32-3667cd7a7d18" (UID: "1a6781ef-43aa-4084-9e32-3667cd7a7d18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.607784 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-log-httpd\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.607912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.607956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-config-data\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.607997 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtfp\" (UniqueName: \"kubernetes.io/projected/85f784cf-ed1f-4734-bc01-ebdc55752f2c-kube-api-access-9qtfp\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608043 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-scripts\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608064 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-run-httpd\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608136 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rllfj\" (UniqueName: \"kubernetes.io/projected/1a6781ef-43aa-4084-9e32-3667cd7a7d18-kube-api-access-rllfj\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608147 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608156 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608165 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608174 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a6781ef-43aa-4084-9e32-3667cd7a7d18-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608182 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6781ef-43aa-4084-9e32-3667cd7a7d18-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.608538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-run-httpd\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.609409 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-log-httpd\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.612135 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-scripts\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.612473 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc 
kubenswrapper[4786]: I0313 15:25:33.612973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-config-data\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.618836 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.625063 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtfp\" (UniqueName: \"kubernetes.io/projected/85f784cf-ed1f-4734-bc01-ebdc55752f2c-kube-api-access-9qtfp\") pod \"ceilometer-0\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " pod="openstack/ceilometer-0" Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.731494 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:33 crc kubenswrapper[4786]: I0313 15:25:33.732160 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.176322 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.328714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerStarted","Data":"60ab3f94c0d8b06b29ce97bea86c248521663adb4e0b55275842f8984530900a"} Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.328755 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.372414 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.383599 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.395437 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.397178 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.401730 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.401954 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.402097 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.427980 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.443219 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.443458 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-log" containerID="cri-o://df580d52b6d0ead148d878f686b085d8186767fa428863bb7af2712c22cea3ab" gracePeriod=30 Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.443506 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-httpd" containerID="cri-o://093d9f552c30bdec922a9a62149d9ced8460871a61712767fba2e030b08a33c1" gracePeriod=30 Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a96a07-f63f-47d9-9191-0548996f01a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541148 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a96a07-f63f-47d9-9191-0548996f01a7-logs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541218 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data\") pod 
\"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541356 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnnxb\" (UniqueName: \"kubernetes.io/projected/54a96a07-f63f-47d9-9191-0548996f01a7-kube-api-access-hnnxb\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541412 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-scripts\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541433 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.541657 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.563950 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c56e2a-2724-4bad-ad76-c152ab597606" path="/var/lib/kubelet/pods/12c56e2a-2724-4bad-ad76-c152ab597606/volumes" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.564734 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1a6781ef-43aa-4084-9e32-3667cd7a7d18" path="/var/lib/kubelet/pods/1a6781ef-43aa-4084-9e32-3667cd7a7d18/volumes" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.642768 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.642869 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.642899 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a96a07-f63f-47d9-9191-0548996f01a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.642926 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a96a07-f63f-47d9-9191-0548996f01a7-logs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.642977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.643001 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.643026 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnnxb\" (UniqueName: \"kubernetes.io/projected/54a96a07-f63f-47d9-9191-0548996f01a7-kube-api-access-hnnxb\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.643054 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-scripts\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.643073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.644062 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a96a07-f63f-47d9-9191-0548996f01a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.644327 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a96a07-f63f-47d9-9191-0548996f01a7-logs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " 
pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.647211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.647460 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-scripts\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.649148 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.649211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.650073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.653369 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.663925 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnnxb\" (UniqueName: \"kubernetes.io/projected/54a96a07-f63f-47d9-9191-0548996f01a7-kube-api-access-hnnxb\") pod \"cinder-api-0\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " pod="openstack/cinder-api-0" Mar 13 15:25:34 crc kubenswrapper[4786]: I0313 15:25:34.718565 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 15:25:35 crc kubenswrapper[4786]: I0313 15:25:35.183406 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:25:35 crc kubenswrapper[4786]: W0313 15:25:35.183952 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54a96a07_f63f_47d9_9191_0548996f01a7.slice/crio-713add33d33bda07f6a00ae2a6b7ef225bb7870c3cdb4671a24900eba46a92a6 WatchSource:0}: Error finding container 713add33d33bda07f6a00ae2a6b7ef225bb7870c3cdb4671a24900eba46a92a6: Status 404 returned error can't find the container with id 713add33d33bda07f6a00ae2a6b7ef225bb7870c3cdb4671a24900eba46a92a6 Mar 13 15:25:35 crc kubenswrapper[4786]: I0313 15:25:35.342497 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a96a07-f63f-47d9-9191-0548996f01a7","Type":"ContainerStarted","Data":"713add33d33bda07f6a00ae2a6b7ef225bb7870c3cdb4671a24900eba46a92a6"} Mar 13 15:25:35 crc kubenswrapper[4786]: I0313 15:25:35.357459 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerStarted","Data":"30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f"} 
Mar 13 15:25:35 crc kubenswrapper[4786]: I0313 15:25:35.359373 4786 generic.go:334] "Generic (PLEG): container finished" podID="24def402-fa10-4192-a42c-fb38e387247c" containerID="df580d52b6d0ead148d878f686b085d8186767fa428863bb7af2712c22cea3ab" exitCode=143 Mar 13 15:25:35 crc kubenswrapper[4786]: I0313 15:25:35.359416 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24def402-fa10-4192-a42c-fb38e387247c","Type":"ContainerDied","Data":"df580d52b6d0ead148d878f686b085d8186767fa428863bb7af2712c22cea3ab"} Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.377053 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a96a07-f63f-47d9-9191-0548996f01a7","Type":"ContainerStarted","Data":"4c92acc295fd57ca23b0e288711fb9a3af052bd9a754638cb9ef30ab57e5b073"} Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.380877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerStarted","Data":"86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b"} Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.383425 4786 generic.go:334] "Generic (PLEG): container finished" podID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerID="0399f2db3f72283f5c6cc3ac379a8523fcbcc9d2bccb605cb969c082a8cacc8f" exitCode=0 Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.383466 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eeb9deff-99f7-4425-a84c-a520d67430ed","Type":"ContainerDied","Data":"0399f2db3f72283f5c6cc3ac379a8523fcbcc9d2bccb605cb969c082a8cacc8f"} Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.667988 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.795887 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-combined-ca-bundle\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.795942 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-httpd-run\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.795986 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-scripts\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.796024 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-config-data\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.796042 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-logs\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.796188 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8zdc\" (UniqueName: 
\"kubernetes.io/projected/eeb9deff-99f7-4425-a84c-a520d67430ed-kube-api-access-r8zdc\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.796236 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.796260 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-public-tls-certs\") pod \"eeb9deff-99f7-4425-a84c-a520d67430ed\" (UID: \"eeb9deff-99f7-4425-a84c-a520d67430ed\") " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.797224 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-logs" (OuterVolumeSpecName: "logs") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.797244 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.801082 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb9deff-99f7-4425-a84c-a520d67430ed-kube-api-access-r8zdc" (OuterVolumeSpecName: "kube-api-access-r8zdc") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "kube-api-access-r8zdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.804004 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-scripts" (OuterVolumeSpecName: "scripts") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.804007 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.833141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.848539 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.857689 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-config-data" (OuterVolumeSpecName: "config-data") pod "eeb9deff-99f7-4425-a84c-a520d67430ed" (UID: "eeb9deff-99f7-4425-a84c-a520d67430ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.897658 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8zdc\" (UniqueName: \"kubernetes.io/projected/eeb9deff-99f7-4425-a84c-a520d67430ed-kube-api-access-r8zdc\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898021 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898034 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898043 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898052 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898061 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898069 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeb9deff-99f7-4425-a84c-a520d67430ed-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.898077 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeb9deff-99f7-4425-a84c-a520d67430ed-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.917635 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 13 15:25:36 crc kubenswrapper[4786]: I0313 15:25:36.999679 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.394694 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a96a07-f63f-47d9-9191-0548996f01a7","Type":"ContainerStarted","Data":"6fee30319ab254dc362b9cb5404359a46216a536ddf834f8c5f2549a88b37dcc"}
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.396036 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.399146 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerStarted","Data":"5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb"}
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.401542 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eeb9deff-99f7-4425-a84c-a520d67430ed","Type":"ContainerDied","Data":"01124ab6c992a23e424e39162e51555f9ad606ca75fce6a2ce1d8b36aa46df44"}
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.401626 4786 scope.go:117] "RemoveContainer" containerID="0399f2db3f72283f5c6cc3ac379a8523fcbcc9d2bccb605cb969c082a8cacc8f"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.401595 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.425308 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.425290507 podStartE2EDuration="3.425290507s" podCreationTimestamp="2026-03-13 15:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:37.42175036 +0000 UTC m=+1367.584962191" watchObservedRunningTime="2026-03-13 15:25:37.425290507 +0000 UTC m=+1367.588502318"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.453481 4786 scope.go:117] "RemoveContainer" containerID="7bfe1d57d354079630c40a517042fed84f22086574bff47d637020a997e365c0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.454351 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.462965 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.486195 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:25:37 crc kubenswrapper[4786]: E0313 15:25:37.486694 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-httpd"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.486720 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-httpd"
Mar 13 15:25:37 crc kubenswrapper[4786]: E0313 15:25:37.486740 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-log"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.486751 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-log"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.486997 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-log"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.487032 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" containerName="glance-httpd"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.488228 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.494078 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.494460 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.497240 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.596445 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:37148->10.217.0.153:9292: read: connection reset by peer"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.596517 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:37162->10.217.0.153:9292: read: connection reset by peer"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.621723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.621800 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.622045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.622183 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.622232 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.622415 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vb6r\" (UniqueName: \"kubernetes.io/projected/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-kube-api-access-7vb6r\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.622462 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.622552 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-logs\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.723986 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vb6r\" (UniqueName: \"kubernetes.io/projected/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-kube-api-access-7vb6r\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724050 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-logs\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724197 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724261 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724320 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724390 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724696 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.724949 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.728872 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-logs\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.733639 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.734387 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.741639 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.746671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.760064 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vb6r\" (UniqueName: \"kubernetes.io/projected/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-kube-api-access-7vb6r\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.769377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " pod="openstack/glance-default-external-api-0"
Mar 13 15:25:37 crc kubenswrapper[4786]: I0313 15:25:37.822779 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.419705 4786 generic.go:334] "Generic (PLEG): container finished" podID="24def402-fa10-4192-a42c-fb38e387247c" containerID="093d9f552c30bdec922a9a62149d9ced8460871a61712767fba2e030b08a33c1" exitCode=0
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.419775 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24def402-fa10-4192-a42c-fb38e387247c","Type":"ContainerDied","Data":"093d9f552c30bdec922a9a62149d9ced8460871a61712767fba2e030b08a33c1"}
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.420167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24def402-fa10-4192-a42c-fb38e387247c","Type":"ContainerDied","Data":"2f8d27bcaec4964450f42a81d712977e0da415a522514d51baee575f33adb275"}
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.420185 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8d27bcaec4964450f42a81d712977e0da415a522514d51baee575f33adb275"
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.536140 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.581345 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb9deff-99f7-4425-a84c-a520d67430ed" path="/var/lib/kubelet/pods/eeb9deff-99f7-4425-a84c-a520d67430ed/volumes"
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682378 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-scripts\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682473 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682550 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-config-data\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682612 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kl2l\" (UniqueName: \"kubernetes.io/projected/24def402-fa10-4192-a42c-fb38e387247c-kube-api-access-6kl2l\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682681 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-httpd-run\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-combined-ca-bundle\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682739 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-logs\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.682778 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-internal-tls-certs\") pod \"24def402-fa10-4192-a42c-fb38e387247c\" (UID: \"24def402-fa10-4192-a42c-fb38e387247c\") "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.686514 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-logs" (OuterVolumeSpecName: "logs") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.689978 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-scripts" (OuterVolumeSpecName: "scripts") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.694054 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.700246 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.700574 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.703503 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24def402-fa10-4192-a42c-fb38e387247c-kube-api-access-6kl2l" (OuterVolumeSpecName: "kube-api-access-6kl2l") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "kube-api-access-6kl2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: W0313 15:25:38.714377 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ee1d973_a40c_4db0_8cc7_1c64ece074ac.slice/crio-b842cb27aa7be04a1a215ec145dc704caa0529e9dc07c3f186f254fb1aaa6461 WatchSource:0}: Error finding container b842cb27aa7be04a1a215ec145dc704caa0529e9dc07c3f186f254fb1aaa6461: Status 404 returned error can't find the container with id b842cb27aa7be04a1a215ec145dc704caa0529e9dc07c3f186f254fb1aaa6461
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.732468 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.770324 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-config-data" (OuterVolumeSpecName: "config-data") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784625 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784654 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kl2l\" (UniqueName: \"kubernetes.io/projected/24def402-fa10-4192-a42c-fb38e387247c-kube-api-access-6kl2l\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784666 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784675 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784684 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24def402-fa10-4192-a42c-fb38e387247c-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784692 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.784711 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.786961 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "24def402-fa10-4192-a42c-fb38e387247c" (UID: "24def402-fa10-4192-a42c-fb38e387247c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.818792 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.886364 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24def402-fa10-4192-a42c-fb38e387247c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:38 crc kubenswrapper[4786]: I0313 15:25:38.886389 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.448496 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ee1d973-a40c-4db0-8cc7-1c64ece074ac","Type":"ContainerStarted","Data":"18ddb14c07d08e39a714e8241c91bc621e8b9460fed525911386dabe3a845484"}
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.448840 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ee1d973-a40c-4db0-8cc7-1c64ece074ac","Type":"ContainerStarted","Data":"b842cb27aa7be04a1a215ec145dc704caa0529e9dc07c3f186f254fb1aaa6461"}
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.465658 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.466139 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-central-agent" containerID="cri-o://30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f" gracePeriod=30
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.466404 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="proxy-httpd" containerID="cri-o://ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa" gracePeriod=30
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.466448 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="sg-core" containerID="cri-o://5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb" gracePeriod=30
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.466486 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-notification-agent" containerID="cri-o://86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b" gracePeriod=30
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.466552 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerStarted","Data":"ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa"}
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.466586 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.525694 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.315355 podStartE2EDuration="6.525673468s" podCreationTimestamp="2026-03-13 15:25:33 +0000 UTC" firstStartedPulling="2026-03-13 15:25:34.182455324 +0000 UTC m=+1364.345667135" lastFinishedPulling="2026-03-13 15:25:38.392773792 +0000 UTC m=+1368.555985603" observedRunningTime="2026-03-13 15:25:39.50438586 +0000 UTC m=+1369.667597681" watchObservedRunningTime="2026-03-13 15:25:39.525673468 +0000 UTC m=+1369.688885279"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.540281 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.555552 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.569762 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:25:39 crc kubenswrapper[4786]: E0313 15:25:39.570151 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-log"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.570168 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-log"
Mar 13 15:25:39 crc kubenswrapper[4786]: E0313 15:25:39.570188 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-httpd"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.570194 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-httpd"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.570349 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-httpd"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.570367 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="24def402-fa10-4192-a42c-fb38e387247c" containerName="glance-log"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.571356 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.574626 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.574845 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.583997 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.701815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-logs\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.701883 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.701902 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.701934 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.701960 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.701988 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5lg4\" (UniqueName: \"kubernetes.io/projected/8609052d-1ba2-4888-b973-05c8e4663632-kube-api-access-f5lg4\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.702005 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.702035 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803713 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-logs\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803778 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803884 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803953 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5lg4\" (UniqueName: \"kubernetes.io/projected/8609052d-1ba2-4888-b973-05c8e4663632-kube-api-access-f5lg4\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.803970 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.804003 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.804367 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.806219 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.809472 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-logs\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.811138 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.814486 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.816140 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.823942 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5lg4\" (UniqueName: \"kubernetes.io/projected/8609052d-1ba2-4888-b973-05c8e4663632-kube-api-access-f5lg4\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" 
Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.824955 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.856501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " pod="openstack/glance-default-internal-api-0" Mar 13 15:25:39 crc kubenswrapper[4786]: I0313 15:25:39.949496 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.331562 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.475386 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ee1d973-a40c-4db0-8cc7-1c64ece074ac","Type":"ContainerStarted","Data":"15848ce920e90b7648bab5e64f68ab325ba1bd7a143ec84d35fc3e9aa2a6e33f"} Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.478003 4786 generic.go:334] "Generic (PLEG): container finished" podID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerID="ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa" exitCode=0 Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.478030 4786 generic.go:334] "Generic (PLEG): container finished" podID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerID="5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb" exitCode=2 Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.478037 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerID="86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b" exitCode=0 Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.478069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerDied","Data":"ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa"} Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.478114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerDied","Data":"5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb"} Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.478132 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerDied","Data":"86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b"} Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.480993 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8609052d-1ba2-4888-b973-05c8e4663632","Type":"ContainerStarted","Data":"b38ed59273199c0dc37509362794a0bd3dd98b8d473e82a4d867134cd3608924"} Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 15:25:40.503689 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.5036687029999998 podStartE2EDuration="3.503668703s" podCreationTimestamp="2026-03-13 15:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:40.492078156 +0000 UTC m=+1370.655289977" watchObservedRunningTime="2026-03-13 15:25:40.503668703 +0000 UTC m=+1370.666880524" Mar 13 15:25:40 crc kubenswrapper[4786]: I0313 
15:25:40.576575 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24def402-fa10-4192-a42c-fb38e387247c" path="/var/lib/kubelet/pods/24def402-fa10-4192-a42c-fb38e387247c/volumes" Mar 13 15:25:41 crc kubenswrapper[4786]: I0313 15:25:41.493780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8609052d-1ba2-4888-b973-05c8e4663632","Type":"ContainerStarted","Data":"da85c0b8ba7100a5a7dd0b1919a32c21a917bfb0e41b3a25bb5709d1b826d78c"} Mar 13 15:25:42 crc kubenswrapper[4786]: I0313 15:25:42.503674 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8609052d-1ba2-4888-b973-05c8e4663632","Type":"ContainerStarted","Data":"f8fb520920d825dd25aaead12b8048c225a995e2f744b27b8f73261839e24997"} Mar 13 15:25:42 crc kubenswrapper[4786]: I0313 15:25:42.535715 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.535692419 podStartE2EDuration="3.535692419s" podCreationTimestamp="2026-03-13 15:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:42.526588853 +0000 UTC m=+1372.689800674" watchObservedRunningTime="2026-03-13 15:25:42.535692419 +0000 UTC m=+1372.698904240" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.119550 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.258772 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-config-data\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.259130 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-combined-ca-bundle\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.259183 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-run-httpd\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.259277 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qtfp\" (UniqueName: \"kubernetes.io/projected/85f784cf-ed1f-4734-bc01-ebdc55752f2c-kube-api-access-9qtfp\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.259645 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.259749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-scripts\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.259786 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-log-httpd\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.260098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-sg-core-conf-yaml\") pod \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\" (UID: \"85f784cf-ed1f-4734-bc01-ebdc55752f2c\") " Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.260491 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.261940 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.274122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f784cf-ed1f-4734-bc01-ebdc55752f2c-kube-api-access-9qtfp" (OuterVolumeSpecName: "kube-api-access-9qtfp") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "kube-api-access-9qtfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.278201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-scripts" (OuterVolumeSpecName: "scripts") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.302570 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.349069 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.359484 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-config-data" (OuterVolumeSpecName: "config-data") pod "85f784cf-ed1f-4734-bc01-ebdc55752f2c" (UID: "85f784cf-ed1f-4734-bc01-ebdc55752f2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.361630 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85f784cf-ed1f-4734-bc01-ebdc55752f2c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.361648 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.361659 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.361667 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.361678 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qtfp\" (UniqueName: \"kubernetes.io/projected/85f784cf-ed1f-4734-bc01-ebdc55752f2c-kube-api-access-9qtfp\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.361686 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/85f784cf-ed1f-4734-bc01-ebdc55752f2c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.516342 4786 generic.go:334] "Generic (PLEG): container finished" podID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerID="30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f" exitCode=0 Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.517079 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerDied","Data":"30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f"} Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.517143 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85f784cf-ed1f-4734-bc01-ebdc55752f2c","Type":"ContainerDied","Data":"60ab3f94c0d8b06b29ce97bea86c248521663adb4e0b55275842f8984530900a"} Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.517152 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.517172 4786 scope.go:117] "RemoveContainer" containerID="ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.542501 4786 scope.go:117] "RemoveContainer" containerID="5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.559550 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.566589 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.568697 4786 scope.go:117] "RemoveContainer" containerID="86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589147 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.589590 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-notification-agent" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589606 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-notification-agent" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.589621 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-central-agent" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589628 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-central-agent" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.589637 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="sg-core" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589644 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="sg-core" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.589655 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="proxy-httpd" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589661 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="proxy-httpd" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589843 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="proxy-httpd" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589867 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-central-agent" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589887 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="ceilometer-notification-agent" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.589902 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" containerName="sg-core" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.591388 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.597745 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.598608 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.612462 4786 scope.go:117] "RemoveContainer" containerID="30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.624058 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.650277 4786 scope.go:117] "RemoveContainer" containerID="ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.650816 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa\": container with ID starting with ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa not found: ID does not exist" containerID="ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.650864 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa"} err="failed to get container status \"ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa\": rpc error: code = NotFound desc = could not find container \"ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa\": container with ID starting with ac2b2e657fb7d0639bb24ed510b569e9563b2bd2b374fed763517f030b9748fa not found: ID does not exist" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 
15:25:43.650891 4786 scope.go:117] "RemoveContainer" containerID="5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.651179 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb\": container with ID starting with 5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb not found: ID does not exist" containerID="5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.651218 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb"} err="failed to get container status \"5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb\": rpc error: code = NotFound desc = could not find container \"5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb\": container with ID starting with 5bca2165e9910de1105667aece908b6d5358a8714deb682bc2ff8e8d2a7a90cb not found: ID does not exist" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.651236 4786 scope.go:117] "RemoveContainer" containerID="86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.651608 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b\": container with ID starting with 86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b not found: ID does not exist" containerID="86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.651638 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b"} err="failed to get container status \"86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b\": rpc error: code = NotFound desc = could not find container \"86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b\": container with ID starting with 86511035a03fc28ebb18f0a497613feb82829b4f3c20353df2b47f87e1b1b00b not found: ID does not exist" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.651657 4786 scope.go:117] "RemoveContainer" containerID="30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f" Mar 13 15:25:43 crc kubenswrapper[4786]: E0313 15:25:43.651942 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f\": container with ID starting with 30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f not found: ID does not exist" containerID="30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.651996 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f"} err="failed to get container status \"30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f\": rpc error: code = NotFound desc = could not find container \"30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f\": container with ID starting with 30ad7e5805c633c76bcc7200b3356d45783e9cf379c317147a72cca26255115f not found: ID does not exist" Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772179 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmvd\" (UniqueName: \"kubernetes.io/projected/ea9a687b-89fb-428c-8f54-60e65ddb100c-kube-api-access-gzmvd\") pod 
\"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772259 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-config-data\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-scripts\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772388 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-run-httpd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772677 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-log-httpd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.772728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-config-data\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874716 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-scripts\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874749 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-run-httpd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874878 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-log-httpd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.874977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzmvd\" (UniqueName: \"kubernetes.io/projected/ea9a687b-89fb-428c-8f54-60e65ddb100c-kube-api-access-gzmvd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.875781 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-run-httpd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.876109 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-log-httpd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.880717 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.882245 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-config-data\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.882379 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.903034 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-scripts\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:43 crc kubenswrapper[4786]: I0313 15:25:43.931685 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzmvd\" (UniqueName: \"kubernetes.io/projected/ea9a687b-89fb-428c-8f54-60e65ddb100c-kube-api-access-gzmvd\") pod \"ceilometer-0\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " pod="openstack/ceilometer-0"
Mar 13 15:25:44 crc kubenswrapper[4786]: I0313 15:25:44.222642 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 15:25:44 crc kubenswrapper[4786]: I0313 15:25:44.562611 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f784cf-ed1f-4734-bc01-ebdc55752f2c" path="/var/lib/kubelet/pods/85f784cf-ed1f-4734-bc01-ebdc55752f2c/volumes"
Mar 13 15:25:44 crc kubenswrapper[4786]: W0313 15:25:44.671268 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea9a687b_89fb_428c_8f54_60e65ddb100c.slice/crio-777ce7001b230097fba2ad2eda12146fcfbf4b5d344e9f1773c1857830552420 WatchSource:0}: Error finding container 777ce7001b230097fba2ad2eda12146fcfbf4b5d344e9f1773c1857830552420: Status 404 returned error can't find the container with id 777ce7001b230097fba2ad2eda12146fcfbf4b5d344e9f1773c1857830552420
Mar 13 15:25:44 crc kubenswrapper[4786]: I0313 15:25:44.673125 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 15:25:45 crc kubenswrapper[4786]: I0313 15:25:45.201153 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 15:25:45 crc kubenswrapper[4786]: I0313 15:25:45.537767 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerStarted","Data":"226b483e13ffb6b71db861d7d95a26caffd7cd6476ccac3d5eee1f10b3a190a4"}
Mar 13 15:25:45 crc kubenswrapper[4786]: I0313 15:25:45.538049 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerStarted","Data":"777ce7001b230097fba2ad2eda12146fcfbf4b5d344e9f1773c1857830552420"}
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.548488 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerStarted","Data":"9a1d8e7415e60b0f860bdb06b261a4561ea7e0c83dcb2211976c265bbd18cdd4"}
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.566933 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jw79s"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.568282 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.580136 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jw79s"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.656533 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2x4gs"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.657920 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.679355 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2x4gs"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.758418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzcf\" (UniqueName: \"kubernetes.io/projected/ed5fea62-85d0-4afd-a716-1a450c1baafb-kube-api-access-jwzcf\") pod \"nova-api-db-create-jw79s\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.758581 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed5fea62-85d0-4afd-a716-1a450c1baafb-operator-scripts\") pod \"nova-api-db-create-jw79s\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.764502 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8b6k9"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.765960 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.773489 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-df75-account-create-update-g659m"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.774906 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.776599 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.783510 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8b6k9"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.808171 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-df75-account-create-update-g659m"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.862740 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7tk\" (UniqueName: \"kubernetes.io/projected/56def392-368a-4ab2-8958-559494d013cb-kube-api-access-4r7tk\") pod \"nova-cell1-db-create-8b6k9\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.862800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed5fea62-85d0-4afd-a716-1a450c1baafb-operator-scripts\") pod \"nova-api-db-create-jw79s\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.862831 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b11a69-a25d-4d50-aaa6-b19905e48741-operator-scripts\") pod \"nova-cell0-db-create-2x4gs\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.862899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsx2q\" (UniqueName: \"kubernetes.io/projected/15cffa2d-4a68-4125-aaca-c8e972b36d53-kube-api-access-nsx2q\") pod \"nova-api-df75-account-create-update-g659m\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.862985 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj5t6\" (UniqueName: \"kubernetes.io/projected/55b11a69-a25d-4d50-aaa6-b19905e48741-kube-api-access-rj5t6\") pod \"nova-cell0-db-create-2x4gs\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.863057 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15cffa2d-4a68-4125-aaca-c8e972b36d53-operator-scripts\") pod \"nova-api-df75-account-create-update-g659m\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.863087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56def392-368a-4ab2-8958-559494d013cb-operator-scripts\") pod \"nova-cell1-db-create-8b6k9\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.863113 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzcf\" (UniqueName: \"kubernetes.io/projected/ed5fea62-85d0-4afd-a716-1a450c1baafb-kube-api-access-jwzcf\") pod \"nova-api-db-create-jw79s\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.864353 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed5fea62-85d0-4afd-a716-1a450c1baafb-operator-scripts\") pod \"nova-api-db-create-jw79s\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.906305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzcf\" (UniqueName: \"kubernetes.io/projected/ed5fea62-85d0-4afd-a716-1a450c1baafb-kube-api-access-jwzcf\") pod \"nova-api-db-create-jw79s\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.964019 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15cffa2d-4a68-4125-aaca-c8e972b36d53-operator-scripts\") pod \"nova-api-df75-account-create-update-g659m\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.964071 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56def392-368a-4ab2-8958-559494d013cb-operator-scripts\") pod \"nova-cell1-db-create-8b6k9\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.964105 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7tk\" (UniqueName: \"kubernetes.io/projected/56def392-368a-4ab2-8958-559494d013cb-kube-api-access-4r7tk\") pod \"nova-cell1-db-create-8b6k9\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.964126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b11a69-a25d-4d50-aaa6-b19905e48741-operator-scripts\") pod \"nova-cell0-db-create-2x4gs\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.964155 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsx2q\" (UniqueName: \"kubernetes.io/projected/15cffa2d-4a68-4125-aaca-c8e972b36d53-kube-api-access-nsx2q\") pod \"nova-api-df75-account-create-update-g659m\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.964222 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj5t6\" (UniqueName: \"kubernetes.io/projected/55b11a69-a25d-4d50-aaa6-b19905e48741-kube-api-access-rj5t6\") pod \"nova-cell0-db-create-2x4gs\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.965219 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15cffa2d-4a68-4125-aaca-c8e972b36d53-operator-scripts\") pod \"nova-api-df75-account-create-update-g659m\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.965607 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56def392-368a-4ab2-8958-559494d013cb-operator-scripts\") pod \"nova-cell1-db-create-8b6k9\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.965895 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b11a69-a25d-4d50-aaa6-b19905e48741-operator-scripts\") pod \"nova-cell0-db-create-2x4gs\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.980958 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-744d-account-create-update-29wj6"]
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.982514 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:46 crc kubenswrapper[4786]: I0313 15:25:46.985632 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.004027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7tk\" (UniqueName: \"kubernetes.io/projected/56def392-368a-4ab2-8958-559494d013cb-kube-api-access-4r7tk\") pod \"nova-cell1-db-create-8b6k9\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.006547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj5t6\" (UniqueName: \"kubernetes.io/projected/55b11a69-a25d-4d50-aaa6-b19905e48741-kube-api-access-rj5t6\") pod \"nova-cell0-db-create-2x4gs\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.011238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsx2q\" (UniqueName: \"kubernetes.io/projected/15cffa2d-4a68-4125-aaca-c8e972b36d53-kube-api-access-nsx2q\") pod \"nova-api-df75-account-create-update-g659m\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.015968 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-29wj6"]
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.041257 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.065503 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnnw\" (UniqueName: \"kubernetes.io/projected/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-kube-api-access-hpnnw\") pod \"nova-cell0-744d-account-create-update-29wj6\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.065592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-operator-scripts\") pod \"nova-cell0-744d-account-create-update-29wj6\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.094825 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8b6k9"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.112278 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-g659m"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.171127 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnnw\" (UniqueName: \"kubernetes.io/projected/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-kube-api-access-hpnnw\") pod \"nova-cell0-744d-account-create-update-29wj6\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.171218 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-operator-scripts\") pod \"nova-cell0-744d-account-create-update-29wj6\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.172346 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-operator-scripts\") pod \"nova-cell0-744d-account-create-update-29wj6\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.189723 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jw79s"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.195672 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-46qvp"]
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.196751 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.199584 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.219612 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnnw\" (UniqueName: \"kubernetes.io/projected/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-kube-api-access-hpnnw\") pod \"nova-cell0-744d-account-create-update-29wj6\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.226914 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-46qvp"]
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.275214 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cdd0010-08d5-4c55-98b8-c08ad54c7514-operator-scripts\") pod \"nova-cell1-a23c-account-create-update-46qvp\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.275398 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6k5q\" (UniqueName: \"kubernetes.io/projected/8cdd0010-08d5-4c55-98b8-c08ad54c7514-kube-api-access-s6k5q\") pod \"nova-cell1-a23c-account-create-update-46qvp\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.281275 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2x4gs"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.378038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cdd0010-08d5-4c55-98b8-c08ad54c7514-operator-scripts\") pod \"nova-cell1-a23c-account-create-update-46qvp\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.379134 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cdd0010-08d5-4c55-98b8-c08ad54c7514-operator-scripts\") pod \"nova-cell1-a23c-account-create-update-46qvp\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.379199 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6k5q\" (UniqueName: \"kubernetes.io/projected/8cdd0010-08d5-4c55-98b8-c08ad54c7514-kube-api-access-s6k5q\") pod \"nova-cell1-a23c-account-create-update-46qvp\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.394999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-29wj6"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.400017 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6k5q\" (UniqueName: \"kubernetes.io/projected/8cdd0010-08d5-4c55-98b8-c08ad54c7514-kube-api-access-s6k5q\") pod \"nova-cell1-a23c-account-create-update-46qvp\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.561599 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-46qvp"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.622285 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerStarted","Data":"f598a148b1d900df926c1041261d965894b3be752766eefe782a3720360c5b1b"}
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.823265 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.823306 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.852881 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8b6k9"]
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.858115 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-df75-account-create-update-g659m"]
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.874339 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:47 crc kubenswrapper[4786]: I0313 15:25:47.900311 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.040323 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2x4gs"]
Mar 13 15:25:48 crc kubenswrapper[4786]: W0313 15:25:48.053605 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b11a69_a25d_4d50_aaa6_b19905e48741.slice/crio-4d83c4b47ee4c8155c2c886f924a74460e98bf96fe1171abb4517c97253857d8 WatchSource:0}: Error finding container 4d83c4b47ee4c8155c2c886f924a74460e98bf96fe1171abb4517c97253857d8: Status 404 returned error can't find the container with id 4d83c4b47ee4c8155c2c886f924a74460e98bf96fe1171abb4517c97253857d8
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.075955 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jw79s"]
Mar 13 15:25:48 crc kubenswrapper[4786]: W0313 15:25:48.080931 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5fea62_85d0_4afd_a716_1a450c1baafb.slice/crio-fb8c3d68ccce1d82ac10ff34456243056f04b10e5cc17fecbb6062b8ccc12eb2 WatchSource:0}: Error finding container fb8c3d68ccce1d82ac10ff34456243056f04b10e5cc17fecbb6062b8ccc12eb2: Status 404 returned error can't find the container with id fb8c3d68ccce1d82ac10ff34456243056f04b10e5cc17fecbb6062b8ccc12eb2
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.233511 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-29wj6"]
Mar 13 15:25:48 crc kubenswrapper[4786]: W0313 15:25:48.239652 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6de6c03_d7b5_49f7_b4cc_9c07c6593273.slice/crio-d288219a10b65df02e00d0e870ea5bf1402d46858e543acdd878bc0cbc38f543 WatchSource:0}: Error finding container d288219a10b65df02e00d0e870ea5bf1402d46858e543acdd878bc0cbc38f543: Status 404 returned error can't find the container with id d288219a10b65df02e00d0e870ea5bf1402d46858e543acdd878bc0cbc38f543
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.345665 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-46qvp"]
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.631027 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-df75-account-create-update-g659m" event={"ID":"15cffa2d-4a68-4125-aaca-c8e972b36d53","Type":"ContainerStarted","Data":"13d8ceed72e49b7cd2a9089e37e1daad7c70d0e13d73a4feb0a79e5b2664f7d2"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.631065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-df75-account-create-update-g659m" event={"ID":"15cffa2d-4a68-4125-aaca-c8e972b36d53","Type":"ContainerStarted","Data":"2d17f871d30ae0689859e2d07c68e1e22d62364ac7d4c7e93fa306c289e50eee"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.633968 4786 generic.go:334] "Generic (PLEG): container finished" podID="55b11a69-a25d-4d50-aaa6-b19905e48741" containerID="56fd56f4a759002c22dddb32f7cdac80f70005c5ac86cfce40fbb6bb3ffc6e45" exitCode=0
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.634031 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2x4gs" event={"ID":"55b11a69-a25d-4d50-aaa6-b19905e48741","Type":"ContainerDied","Data":"56fd56f4a759002c22dddb32f7cdac80f70005c5ac86cfce40fbb6bb3ffc6e45"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.634100 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2x4gs" event={"ID":"55b11a69-a25d-4d50-aaa6-b19905e48741","Type":"ContainerStarted","Data":"4d83c4b47ee4c8155c2c886f924a74460e98bf96fe1171abb4517c97253857d8"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.635316 4786 generic.go:334] "Generic (PLEG): container finished" podID="56def392-368a-4ab2-8958-559494d013cb" containerID="7391bf2bfd5a11f51ed0fde91d5f31bb0b8d96280948cba5e98ffa0e3356bece" exitCode=0
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.635403 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8b6k9" event={"ID":"56def392-368a-4ab2-8958-559494d013cb","Type":"ContainerDied","Data":"7391bf2bfd5a11f51ed0fde91d5f31bb0b8d96280948cba5e98ffa0e3356bece"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.635422 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8b6k9" event={"ID":"56def392-368a-4ab2-8958-559494d013cb","Type":"ContainerStarted","Data":"5a6cfdc11fad957ab6d401706738aff9bdf5af1d0559e6315100ee9c7cac77fd"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.636531 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-744d-account-create-update-29wj6" event={"ID":"f6de6c03-d7b5-49f7-b4cc-9c07c6593273","Type":"ContainerStarted","Data":"bc46a0f61c16d0854539483ecafa210f744c7a82ed6f721a3c996b3f7c16d0ee"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.636563 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-744d-account-create-update-29wj6" event={"ID":"f6de6c03-d7b5-49f7-b4cc-9c07c6593273","Type":"ContainerStarted","Data":"d288219a10b65df02e00d0e870ea5bf1402d46858e543acdd878bc0cbc38f543"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.638385 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jw79s" event={"ID":"ed5fea62-85d0-4afd-a716-1a450c1baafb","Type":"ContainerStarted","Data":"1f32639a5f98b5405972431389f642ad6d670e08baa6cbd9a62adbd7a1df9754"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.638420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jw79s" event={"ID":"ed5fea62-85d0-4afd-a716-1a450c1baafb","Type":"ContainerStarted","Data":"fb8c3d68ccce1d82ac10ff34456243056f04b10e5cc17fecbb6062b8ccc12eb2"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.642313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" event={"ID":"8cdd0010-08d5-4c55-98b8-c08ad54c7514","Type":"ContainerStarted","Data":"0dab3fd432281d71bbfe474a789a6904e1521fb26d710ab2e196030b217b7c51"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.642351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" event={"ID":"8cdd0010-08d5-4c55-98b8-c08ad54c7514","Type":"ContainerStarted","Data":"835feb54832e1c4497bedd4ab79c6ca94bb805da95c9a9d512d7464e7c76b26a"}
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.642688 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.642724 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.649960 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-df75-account-create-update-g659m" podStartSLOduration=2.649942808 podStartE2EDuration="2.649942808s" podCreationTimestamp="2026-03-13 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:48.645407335 +0000 UTC m=+1378.808619146" watchObservedRunningTime="2026-03-13 15:25:48.649942808 +0000 UTC m=+1378.813154619"
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.688624 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-744d-account-create-update-29wj6" podStartSLOduration=2.688598236 podStartE2EDuration="2.688598236s" podCreationTimestamp="2026-03-13 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:48.677542812 +0000 UTC m=+1378.840754623" watchObservedRunningTime="2026-03-13 15:25:48.688598236 +0000 UTC m=+1378.851810067"
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.700660 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-jw79s" podStartSLOduration=2.700636764 podStartE2EDuration="2.700636764s" podCreationTimestamp="2026-03-13 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:48.691907448 +0000 UTC m=+1378.855119259" watchObservedRunningTime="2026-03-13 15:25:48.700636764 +0000 UTC m=+1378.863848575"
Mar 13 15:25:48 crc kubenswrapper[4786]: I0313 15:25:48.725024 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" podStartSLOduration=1.725007129 podStartE2EDuration="1.725007129s" podCreationTimestamp="2026-03-13 15:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:25:48.723366248 +0000 UTC m=+1378.886578069" watchObservedRunningTime="2026-03-13 15:25:48.725007129 +0000 UTC m=+1378.888218940"
Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.662262 4786 generic.go:334] "Generic (PLEG): container finished" podID="ed5fea62-85d0-4afd-a716-1a450c1baafb"
containerID="1f32639a5f98b5405972431389f642ad6d670e08baa6cbd9a62adbd7a1df9754" exitCode=0 Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.662513 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jw79s" event={"ID":"ed5fea62-85d0-4afd-a716-1a450c1baafb","Type":"ContainerDied","Data":"1f32639a5f98b5405972431389f642ad6d670e08baa6cbd9a62adbd7a1df9754"} Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.665153 4786 generic.go:334] "Generic (PLEG): container finished" podID="8cdd0010-08d5-4c55-98b8-c08ad54c7514" containerID="0dab3fd432281d71bbfe474a789a6904e1521fb26d710ab2e196030b217b7c51" exitCode=0 Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.665210 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" event={"ID":"8cdd0010-08d5-4c55-98b8-c08ad54c7514","Type":"ContainerDied","Data":"0dab3fd432281d71bbfe474a789a6904e1521fb26d710ab2e196030b217b7c51"} Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.666821 4786 generic.go:334] "Generic (PLEG): container finished" podID="15cffa2d-4a68-4125-aaca-c8e972b36d53" containerID="13d8ceed72e49b7cd2a9089e37e1daad7c70d0e13d73a4feb0a79e5b2664f7d2" exitCode=0 Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.666998 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-df75-account-create-update-g659m" event={"ID":"15cffa2d-4a68-4125-aaca-c8e972b36d53","Type":"ContainerDied","Data":"13d8ceed72e49b7cd2a9089e37e1daad7c70d0e13d73a4feb0a79e5b2664f7d2"} Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.668285 4786 generic.go:334] "Generic (PLEG): container finished" podID="f6de6c03-d7b5-49f7-b4cc-9c07c6593273" containerID="bc46a0f61c16d0854539483ecafa210f744c7a82ed6f721a3c996b3f7c16d0ee" exitCode=0 Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.668360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-744d-account-create-update-29wj6" 
event={"ID":"f6de6c03-d7b5-49f7-b4cc-9c07c6593273","Type":"ContainerDied","Data":"bc46a0f61c16d0854539483ecafa210f744c7a82ed6f721a3c996b3f7c16d0ee"} Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.950112 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.950444 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:49 crc kubenswrapper[4786]: I0313 15:25:49.998430 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.027676 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.223029 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8b6k9" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.228996 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2x4gs" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.357446 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56def392-368a-4ab2-8958-559494d013cb-operator-scripts\") pod \"56def392-368a-4ab2-8958-559494d013cb\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.357510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj5t6\" (UniqueName: \"kubernetes.io/projected/55b11a69-a25d-4d50-aaa6-b19905e48741-kube-api-access-rj5t6\") pod \"55b11a69-a25d-4d50-aaa6-b19905e48741\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.357633 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b11a69-a25d-4d50-aaa6-b19905e48741-operator-scripts\") pod \"55b11a69-a25d-4d50-aaa6-b19905e48741\" (UID: \"55b11a69-a25d-4d50-aaa6-b19905e48741\") " Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.357776 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r7tk\" (UniqueName: \"kubernetes.io/projected/56def392-368a-4ab2-8958-559494d013cb-kube-api-access-4r7tk\") pod \"56def392-368a-4ab2-8958-559494d013cb\" (UID: \"56def392-368a-4ab2-8958-559494d013cb\") " Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.358207 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56def392-368a-4ab2-8958-559494d013cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56def392-368a-4ab2-8958-559494d013cb" (UID: "56def392-368a-4ab2-8958-559494d013cb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.358439 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56def392-368a-4ab2-8958-559494d013cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.358972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b11a69-a25d-4d50-aaa6-b19905e48741-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55b11a69-a25d-4d50-aaa6-b19905e48741" (UID: "55b11a69-a25d-4d50-aaa6-b19905e48741"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.369320 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56def392-368a-4ab2-8958-559494d013cb-kube-api-access-4r7tk" (OuterVolumeSpecName: "kube-api-access-4r7tk") pod "56def392-368a-4ab2-8958-559494d013cb" (UID: "56def392-368a-4ab2-8958-559494d013cb"). InnerVolumeSpecName "kube-api-access-4r7tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.375214 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b11a69-a25d-4d50-aaa6-b19905e48741-kube-api-access-rj5t6" (OuterVolumeSpecName: "kube-api-access-rj5t6") pod "55b11a69-a25d-4d50-aaa6-b19905e48741" (UID: "55b11a69-a25d-4d50-aaa6-b19905e48741"). InnerVolumeSpecName "kube-api-access-rj5t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.459585 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55b11a69-a25d-4d50-aaa6-b19905e48741-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.459623 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r7tk\" (UniqueName: \"kubernetes.io/projected/56def392-368a-4ab2-8958-559494d013cb-kube-api-access-4r7tk\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.459635 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj5t6\" (UniqueName: \"kubernetes.io/projected/55b11a69-a25d-4d50-aaa6-b19905e48741-kube-api-access-rj5t6\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.678559 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerStarted","Data":"0b3a3223ad73a30c3bd91e2a1568b26ddb3fba5017e8830a3b0130bde5e58af5"} Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.678898 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.678713 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-central-agent" containerID="cri-o://226b483e13ffb6b71db861d7d95a26caffd7cd6476ccac3d5eee1f10b3a190a4" gracePeriod=30 Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.678990 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-notification-agent" 
containerID="cri-o://9a1d8e7415e60b0f860bdb06b261a4561ea7e0c83dcb2211976c265bbd18cdd4" gracePeriod=30 Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.679009 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="proxy-httpd" containerID="cri-o://0b3a3223ad73a30c3bd91e2a1568b26ddb3fba5017e8830a3b0130bde5e58af5" gracePeriod=30 Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.678995 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="sg-core" containerID="cri-o://f598a148b1d900df926c1041261d965894b3be752766eefe782a3720360c5b1b" gracePeriod=30 Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.684170 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2x4gs" event={"ID":"55b11a69-a25d-4d50-aaa6-b19905e48741","Type":"ContainerDied","Data":"4d83c4b47ee4c8155c2c886f924a74460e98bf96fe1171abb4517c97253857d8"} Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.684222 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d83c4b47ee4c8155c2c886f924a74460e98bf96fe1171abb4517c97253857d8" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.684305 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2x4gs" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.701414 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8b6k9" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.702188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8b6k9" event={"ID":"56def392-368a-4ab2-8958-559494d013cb","Type":"ContainerDied","Data":"5a6cfdc11fad957ab6d401706738aff9bdf5af1d0559e6315100ee9c7cac77fd"} Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.702235 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6cfdc11fad957ab6d401706738aff9bdf5af1d0559e6315100ee9c7cac77fd" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.705895 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.706265 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.939791 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.939914 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 15:25:50 crc kubenswrapper[4786]: I0313 15:25:50.987279 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.235972024 podStartE2EDuration="7.987259252s" podCreationTimestamp="2026-03-13 15:25:43 +0000 UTC" firstStartedPulling="2026-03-13 15:25:44.674063762 +0000 UTC m=+1374.837275573" lastFinishedPulling="2026-03-13 15:25:49.42535099 +0000 UTC m=+1379.588562801" observedRunningTime="2026-03-13 15:25:50.717072974 +0000 UTC m=+1380.880284785" watchObservedRunningTime="2026-03-13 15:25:50.987259252 +0000 UTC m=+1381.150471063" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.361721 4786 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-29wj6" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.479588 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-operator-scripts\") pod \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.479688 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpnnw\" (UniqueName: \"kubernetes.io/projected/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-kube-api-access-hpnnw\") pod \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\" (UID: \"f6de6c03-d7b5-49f7-b4cc-9c07c6593273\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.481103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6de6c03-d7b5-49f7-b4cc-9c07c6593273" (UID: "f6de6c03-d7b5-49f7-b4cc-9c07c6593273"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.489070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-kube-api-access-hpnnw" (OuterVolumeSpecName: "kube-api-access-hpnnw") pod "f6de6c03-d7b5-49f7-b4cc-9c07c6593273" (UID: "f6de6c03-d7b5-49f7-b4cc-9c07c6593273"). InnerVolumeSpecName "kube-api-access-hpnnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.564463 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.568347 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-g659m" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.579547 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jw79s" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.581891 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.581917 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpnnw\" (UniqueName: \"kubernetes.io/projected/f6de6c03-d7b5-49f7-b4cc-9c07c6593273-kube-api-access-hpnnw\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.616052 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.683290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed5fea62-85d0-4afd-a716-1a450c1baafb-operator-scripts\") pod \"ed5fea62-85d0-4afd-a716-1a450c1baafb\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.683351 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15cffa2d-4a68-4125-aaca-c8e972b36d53-operator-scripts\") pod \"15cffa2d-4a68-4125-aaca-c8e972b36d53\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.683423 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzcf\" (UniqueName: \"kubernetes.io/projected/ed5fea62-85d0-4afd-a716-1a450c1baafb-kube-api-access-jwzcf\") pod \"ed5fea62-85d0-4afd-a716-1a450c1baafb\" (UID: \"ed5fea62-85d0-4afd-a716-1a450c1baafb\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.683449 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsx2q\" (UniqueName: \"kubernetes.io/projected/15cffa2d-4a68-4125-aaca-c8e972b36d53-kube-api-access-nsx2q\") pod \"15cffa2d-4a68-4125-aaca-c8e972b36d53\" (UID: \"15cffa2d-4a68-4125-aaca-c8e972b36d53\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.683577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6k5q\" (UniqueName: \"kubernetes.io/projected/8cdd0010-08d5-4c55-98b8-c08ad54c7514-kube-api-access-s6k5q\") pod \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.683923 4786 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cffa2d-4a68-4125-aaca-c8e972b36d53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15cffa2d-4a68-4125-aaca-c8e972b36d53" (UID: "15cffa2d-4a68-4125-aaca-c8e972b36d53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.684040 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15cffa2d-4a68-4125-aaca-c8e972b36d53-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.684232 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5fea62-85d0-4afd-a716-1a450c1baafb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed5fea62-85d0-4afd-a716-1a450c1baafb" (UID: "ed5fea62-85d0-4afd-a716-1a450c1baafb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.688561 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cffa2d-4a68-4125-aaca-c8e972b36d53-kube-api-access-nsx2q" (OuterVolumeSpecName: "kube-api-access-nsx2q") pod "15cffa2d-4a68-4125-aaca-c8e972b36d53" (UID: "15cffa2d-4a68-4125-aaca-c8e972b36d53"). InnerVolumeSpecName "kube-api-access-nsx2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.689829 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdd0010-08d5-4c55-98b8-c08ad54c7514-kube-api-access-s6k5q" (OuterVolumeSpecName: "kube-api-access-s6k5q") pod "8cdd0010-08d5-4c55-98b8-c08ad54c7514" (UID: "8cdd0010-08d5-4c55-98b8-c08ad54c7514"). InnerVolumeSpecName "kube-api-access-s6k5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.693432 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5fea62-85d0-4afd-a716-1a450c1baafb-kube-api-access-jwzcf" (OuterVolumeSpecName: "kube-api-access-jwzcf") pod "ed5fea62-85d0-4afd-a716-1a450c1baafb" (UID: "ed5fea62-85d0-4afd-a716-1a450c1baafb"). InnerVolumeSpecName "kube-api-access-jwzcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.716613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-df75-account-create-update-g659m" event={"ID":"15cffa2d-4a68-4125-aaca-c8e972b36d53","Type":"ContainerDied","Data":"2d17f871d30ae0689859e2d07c68e1e22d62364ac7d4c7e93fa306c289e50eee"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.716633 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-g659m" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.716652 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d17f871d30ae0689859e2d07c68e1e22d62364ac7d4c7e93fa306c289e50eee" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.717796 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-744d-account-create-update-29wj6" event={"ID":"f6de6c03-d7b5-49f7-b4cc-9c07c6593273","Type":"ContainerDied","Data":"d288219a10b65df02e00d0e870ea5bf1402d46858e543acdd878bc0cbc38f543"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.717813 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d288219a10b65df02e00d0e870ea5bf1402d46858e543acdd878bc0cbc38f543" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.717878 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-29wj6" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.725127 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jw79s" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.725662 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jw79s" event={"ID":"ed5fea62-85d0-4afd-a716-1a450c1baafb","Type":"ContainerDied","Data":"fb8c3d68ccce1d82ac10ff34456243056f04b10e5cc17fecbb6062b8ccc12eb2"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.725702 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb8c3d68ccce1d82ac10ff34456243056f04b10e5cc17fecbb6062b8ccc12eb2" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.733417 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" event={"ID":"8cdd0010-08d5-4c55-98b8-c08ad54c7514","Type":"ContainerDied","Data":"835feb54832e1c4497bedd4ab79c6ca94bb805da95c9a9d512d7464e7c76b26a"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.733460 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="835feb54832e1c4497bedd4ab79c6ca94bb805da95c9a9d512d7464e7c76b26a" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.733533 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-46qvp" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.767337 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerID="0b3a3223ad73a30c3bd91e2a1568b26ddb3fba5017e8830a3b0130bde5e58af5" exitCode=0 Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.767447 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerID="f598a148b1d900df926c1041261d965894b3be752766eefe782a3720360c5b1b" exitCode=2 Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.767510 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerID="9a1d8e7415e60b0f860bdb06b261a4561ea7e0c83dcb2211976c265bbd18cdd4" exitCode=0 Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.767379 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerDied","Data":"0b3a3223ad73a30c3bd91e2a1568b26ddb3fba5017e8830a3b0130bde5e58af5"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.768517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerDied","Data":"f598a148b1d900df926c1041261d965894b3be752766eefe782a3720360c5b1b"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.768581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerDied","Data":"9a1d8e7415e60b0f860bdb06b261a4561ea7e0c83dcb2211976c265bbd18cdd4"} Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.787407 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8cdd0010-08d5-4c55-98b8-c08ad54c7514-operator-scripts\") pod \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\" (UID: \"8cdd0010-08d5-4c55-98b8-c08ad54c7514\") " Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.787999 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwzcf\" (UniqueName: \"kubernetes.io/projected/ed5fea62-85d0-4afd-a716-1a450c1baafb-kube-api-access-jwzcf\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.788024 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsx2q\" (UniqueName: \"kubernetes.io/projected/15cffa2d-4a68-4125-aaca-c8e972b36d53-kube-api-access-nsx2q\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.788036 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6k5q\" (UniqueName: \"kubernetes.io/projected/8cdd0010-08d5-4c55-98b8-c08ad54c7514-kube-api-access-s6k5q\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.788048 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed5fea62-85d0-4afd-a716-1a450c1baafb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.790444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdd0010-08d5-4c55-98b8-c08ad54c7514-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cdd0010-08d5-4c55-98b8-c08ad54c7514" (UID: "8cdd0010-08d5-4c55-98b8-c08ad54c7514"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:25:51 crc kubenswrapper[4786]: I0313 15:25:51.889822 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cdd0010-08d5-4c55-98b8-c08ad54c7514-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:52 crc kubenswrapper[4786]: I0313 15:25:52.793702 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 15:25:52 crc kubenswrapper[4786]: I0313 15:25:52.794033 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 15:25:53 crc kubenswrapper[4786]: I0313 15:25:53.102453 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:53 crc kubenswrapper[4786]: I0313 15:25:53.103198 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.302268 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s28qj"] Mar 13 15:25:57 crc kubenswrapper[4786]: E0313 15:25:57.303302 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56def392-368a-4ab2-8958-559494d013cb" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303320 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="56def392-368a-4ab2-8958-559494d013cb" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: E0313 15:25:57.303332 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cffa2d-4a68-4125-aaca-c8e972b36d53" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303340 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cffa2d-4a68-4125-aaca-c8e972b36d53" containerName="mariadb-account-create-update" Mar 13 15:25:57 
crc kubenswrapper[4786]: E0313 15:25:57.303359 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5fea62-85d0-4afd-a716-1a450c1baafb" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303370 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5fea62-85d0-4afd-a716-1a450c1baafb" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: E0313 15:25:57.303386 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b11a69-a25d-4d50-aaa6-b19905e48741" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303395 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b11a69-a25d-4d50-aaa6-b19905e48741" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: E0313 15:25:57.303413 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdd0010-08d5-4c55-98b8-c08ad54c7514" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303420 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdd0010-08d5-4c55-98b8-c08ad54c7514" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: E0313 15:25:57.303430 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6de6c03-d7b5-49f7-b4cc-9c07c6593273" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303438 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6de6c03-d7b5-49f7-b4cc-9c07c6593273" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303653 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdd0010-08d5-4c55-98b8-c08ad54c7514" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303667 4786 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f6de6c03-d7b5-49f7-b4cc-9c07c6593273" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303678 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="56def392-368a-4ab2-8958-559494d013cb" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303691 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5fea62-85d0-4afd-a716-1a450c1baafb" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303705 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b11a69-a25d-4d50-aaa6-b19905e48741" containerName="mariadb-database-create" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.303720 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cffa2d-4a68-4125-aaca-c8e972b36d53" containerName="mariadb-account-create-update" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.304548 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.307681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.309472 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.310097 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kx4pn" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.322445 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s28qj"] Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.387005 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.387405 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-config-data\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.387522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swsjx\" (UniqueName: \"kubernetes.io/projected/5d32db15-e92e-4497-82ed-94cfe47acde6-kube-api-access-swsjx\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " 
pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.387606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-scripts\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.492967 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-config-data\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.494607 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swsjx\" (UniqueName: \"kubernetes.io/projected/5d32db15-e92e-4497-82ed-94cfe47acde6-kube-api-access-swsjx\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.494757 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-scripts\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.494911 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " 
pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.499601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-config-data\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.500510 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.519672 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-scripts\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.524158 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swsjx\" (UniqueName: \"kubernetes.io/projected/5d32db15-e92e-4497-82ed-94cfe47acde6-kube-api-access-swsjx\") pod \"nova-cell0-conductor-db-sync-s28qj\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.646623 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.864588 4786 generic.go:334] "Generic (PLEG): container finished" podID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerID="226b483e13ffb6b71db861d7d95a26caffd7cd6476ccac3d5eee1f10b3a190a4" exitCode=0 Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.864810 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerDied","Data":"226b483e13ffb6b71db861d7d95a26caffd7cd6476ccac3d5eee1f10b3a190a4"} Mar 13 15:25:57 crc kubenswrapper[4786]: I0313 15:25:57.939618 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.116959 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-config-data\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.117018 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzmvd\" (UniqueName: \"kubernetes.io/projected/ea9a687b-89fb-428c-8f54-60e65ddb100c-kube-api-access-gzmvd\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.117048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-run-httpd\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.117100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-scripts\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.117229 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-sg-core-conf-yaml\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.117286 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-log-httpd\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.117321 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-combined-ca-bundle\") pod \"ea9a687b-89fb-428c-8f54-60e65ddb100c\" (UID: \"ea9a687b-89fb-428c-8f54-60e65ddb100c\") " Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.118327 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.120041 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.123346 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-scripts" (OuterVolumeSpecName: "scripts") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.123554 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9a687b-89fb-428c-8f54-60e65ddb100c-kube-api-access-gzmvd" (OuterVolumeSpecName: "kube-api-access-gzmvd") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "kube-api-access-gzmvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.151204 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.155405 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s28qj"] Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.191566 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219715 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219749 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219762 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzmvd\" (UniqueName: \"kubernetes.io/projected/ea9a687b-89fb-428c-8f54-60e65ddb100c-kube-api-access-gzmvd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219774 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea9a687b-89fb-428c-8f54-60e65ddb100c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219794 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 
15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219804 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.219893 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-config-data" (OuterVolumeSpecName: "config-data") pod "ea9a687b-89fb-428c-8f54-60e65ddb100c" (UID: "ea9a687b-89fb-428c-8f54-60e65ddb100c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.321567 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea9a687b-89fb-428c-8f54-60e65ddb100c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.881064 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea9a687b-89fb-428c-8f54-60e65ddb100c","Type":"ContainerDied","Data":"777ce7001b230097fba2ad2eda12146fcfbf4b5d344e9f1773c1857830552420"} Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.881504 4786 scope.go:117] "RemoveContainer" containerID="0b3a3223ad73a30c3bd91e2a1568b26ddb3fba5017e8830a3b0130bde5e58af5" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.881093 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.883409 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s28qj" event={"ID":"5d32db15-e92e-4497-82ed-94cfe47acde6","Type":"ContainerStarted","Data":"5d0fb48ef00c1244f8df01ed82de5eecf6350803200cf10c2c65753aa08c5c59"} Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.922538 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.924441 4786 scope.go:117] "RemoveContainer" containerID="f598a148b1d900df926c1041261d965894b3be752766eefe782a3720360c5b1b" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.932709 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.950427 4786 scope.go:117] "RemoveContainer" containerID="9a1d8e7415e60b0f860bdb06b261a4561ea7e0c83dcb2211976c265bbd18cdd4" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.958639 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:58 crc kubenswrapper[4786]: E0313 15:25:58.959010 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-central-agent" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959029 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-central-agent" Mar 13 15:25:58 crc kubenswrapper[4786]: E0313 15:25:58.959045 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="sg-core" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959051 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="sg-core" Mar 13 15:25:58 crc 
kubenswrapper[4786]: E0313 15:25:58.959064 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="proxy-httpd" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959070 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="proxy-httpd" Mar 13 15:25:58 crc kubenswrapper[4786]: E0313 15:25:58.959083 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-notification-agent" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959089 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-notification-agent" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959253 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="proxy-httpd" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959268 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="sg-core" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959284 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-notification-agent" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.959297 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" containerName="ceilometer-central-agent" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.960815 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.962923 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.963170 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.980168 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:58 crc kubenswrapper[4786]: I0313 15:25:58.990411 4786 scope.go:117] "RemoveContainer" containerID="226b483e13ffb6b71db861d7d95a26caffd7cd6476ccac3d5eee1f10b3a190a4" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.142819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-scripts\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.142908 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-log-httpd\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.142944 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.142971 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-run-httpd\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.143064 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.143090 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b45q\" (UniqueName: \"kubernetes.io/projected/6b29184d-5d3e-45d6-add8-50082c1d56b6-kube-api-access-2b45q\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.143118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-config-data\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.245483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.245574 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b45q\" (UniqueName: \"kubernetes.io/projected/6b29184d-5d3e-45d6-add8-50082c1d56b6-kube-api-access-2b45q\") pod \"ceilometer-0\" (UID: 
\"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.245640 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-config-data\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.245810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-scripts\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.245935 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-log-httpd\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.245994 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.246060 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-run-httpd\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.247248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-log-httpd\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.247815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-run-httpd\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.258534 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-config-data\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.260330 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.264671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.265664 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b45q\" (UniqueName: \"kubernetes.io/projected/6b29184d-5d3e-45d6-add8-50082c1d56b6-kube-api-access-2b45q\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.271917 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-scripts\") pod \"ceilometer-0\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.285614 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.773180 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:25:59 crc kubenswrapper[4786]: I0313 15:25:59.893171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerStarted","Data":"bd28880149922e6d72fa1f9ce976f21171c7b72a8d7d1480d67889dcddce399e"} Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.128830 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556926-wmpbn"] Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.130231 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.135674 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.135779 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.135935 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.138283 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-wmpbn"] Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.270776 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7gk\" (UniqueName: \"kubernetes.io/projected/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7-kube-api-access-bf7gk\") pod \"auto-csr-approver-29556926-wmpbn\" (UID: \"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7\") " pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.372299 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7gk\" (UniqueName: \"kubernetes.io/projected/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7-kube-api-access-bf7gk\") pod \"auto-csr-approver-29556926-wmpbn\" (UID: \"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7\") " pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.392580 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7gk\" (UniqueName: \"kubernetes.io/projected/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7-kube-api-access-bf7gk\") pod \"auto-csr-approver-29556926-wmpbn\" (UID: \"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7\") " 
pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.455450 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.609464 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9a687b-89fb-428c-8f54-60e65ddb100c" path="/var/lib/kubelet/pods/ea9a687b-89fb-428c-8f54-60e65ddb100c/volumes" Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.910695 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerStarted","Data":"790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484"} Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.946916 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-wmpbn"] Mar 13 15:26:00 crc kubenswrapper[4786]: W0313 15:26:00.966428 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe3ea1f3_d0e3_43d8_a99b_8ef3d473bee7.slice/crio-3f4d9d3711d2d2371eb91ac859572ea91c136db6ffebdca4241704db7b5d295c WatchSource:0}: Error finding container 3f4d9d3711d2d2371eb91ac859572ea91c136db6ffebdca4241704db7b5d295c: Status 404 returned error can't find the container with id 3f4d9d3711d2d2371eb91ac859572ea91c136db6ffebdca4241704db7b5d295c Mar 13 15:26:00 crc kubenswrapper[4786]: I0313 15:26:00.997172 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:26:01 crc kubenswrapper[4786]: I0313 15:26:01.930435 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" event={"ID":"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7","Type":"ContainerStarted","Data":"3f4d9d3711d2d2371eb91ac859572ea91c136db6ffebdca4241704db7b5d295c"} Mar 13 
15:26:01 crc kubenswrapper[4786]: I0313 15:26:01.935835 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerStarted","Data":"ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b"} Mar 13 15:26:06 crc kubenswrapper[4786]: I0313 15:26:06.983569 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerStarted","Data":"99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480"} Mar 13 15:26:06 crc kubenswrapper[4786]: I0313 15:26:06.986992 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s28qj" event={"ID":"5d32db15-e92e-4497-82ed-94cfe47acde6","Type":"ContainerStarted","Data":"9a6f3ca38fa64af3b952f62a4de858e3db59344ce667e5e2776aba24da0b0bf9"} Mar 13 15:26:06 crc kubenswrapper[4786]: I0313 15:26:06.990698 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" event={"ID":"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7","Type":"ContainerStarted","Data":"ef2bbe6ec8f3f59479b1f5d89c5a9963a5edef3c6766386ec321d8bd3e16cf0d"} Mar 13 15:26:07 crc kubenswrapper[4786]: I0313 15:26:07.013275 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-s28qj" podStartSLOduration=1.524898047 podStartE2EDuration="10.013250913s" podCreationTimestamp="2026-03-13 15:25:57 +0000 UTC" firstStartedPulling="2026-03-13 15:25:58.15712137 +0000 UTC m=+1388.320333181" lastFinishedPulling="2026-03-13 15:26:06.645474226 +0000 UTC m=+1396.808686047" observedRunningTime="2026-03-13 15:26:07.010219058 +0000 UTC m=+1397.173430869" watchObservedRunningTime="2026-03-13 15:26:07.013250913 +0000 UTC m=+1397.176462724" Mar 13 15:26:07 crc kubenswrapper[4786]: I0313 15:26:07.031350 4786 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" podStartSLOduration=1.359710645 podStartE2EDuration="7.031332101s" podCreationTimestamp="2026-03-13 15:26:00 +0000 UTC" firstStartedPulling="2026-03-13 15:26:00.9730782 +0000 UTC m=+1391.136290011" lastFinishedPulling="2026-03-13 15:26:06.644699656 +0000 UTC m=+1396.807911467" observedRunningTime="2026-03-13 15:26:07.024961794 +0000 UTC m=+1397.188173605" watchObservedRunningTime="2026-03-13 15:26:07.031332101 +0000 UTC m=+1397.194543912" Mar 13 15:26:08 crc kubenswrapper[4786]: I0313 15:26:08.002561 4786 generic.go:334] "Generic (PLEG): container finished" podID="be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7" containerID="ef2bbe6ec8f3f59479b1f5d89c5a9963a5edef3c6766386ec321d8bd3e16cf0d" exitCode=0 Mar 13 15:26:08 crc kubenswrapper[4786]: I0313 15:26:08.003036 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" event={"ID":"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7","Type":"ContainerDied","Data":"ef2bbe6ec8f3f59479b1f5d89c5a9963a5edef3c6766386ec321d8bd3e16cf0d"} Mar 13 15:26:09 crc kubenswrapper[4786]: I0313 15:26:09.377801 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:09 crc kubenswrapper[4786]: I0313 15:26:09.560325 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf7gk\" (UniqueName: \"kubernetes.io/projected/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7-kube-api-access-bf7gk\") pod \"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7\" (UID: \"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7\") " Mar 13 15:26:09 crc kubenswrapper[4786]: I0313 15:26:09.575187 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7-kube-api-access-bf7gk" (OuterVolumeSpecName: "kube-api-access-bf7gk") pod "be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7" (UID: "be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7"). InnerVolumeSpecName "kube-api-access-bf7gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:09 crc kubenswrapper[4786]: I0313 15:26:09.662974 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf7gk\" (UniqueName: \"kubernetes.io/projected/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7-kube-api-access-bf7gk\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.021355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerStarted","Data":"b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1"} Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.021693 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.021499 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="proxy-httpd" containerID="cri-o://b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1" gracePeriod=30 Mar 
13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.021466 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-central-agent" containerID="cri-o://790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484" gracePeriod=30 Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.021552 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="sg-core" containerID="cri-o://99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480" gracePeriod=30 Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.021577 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-notification-agent" containerID="cri-o://ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b" gracePeriod=30 Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.025565 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" event={"ID":"be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7","Type":"ContainerDied","Data":"3f4d9d3711d2d2371eb91ac859572ea91c136db6ffebdca4241704db7b5d295c"} Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.025610 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4d9d3711d2d2371eb91ac859572ea91c136db6ffebdca4241704db7b5d295c" Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.025630 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-wmpbn" Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.045699 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.089209709 podStartE2EDuration="12.04567965s" podCreationTimestamp="2026-03-13 15:25:58 +0000 UTC" firstStartedPulling="2026-03-13 15:25:59.794014961 +0000 UTC m=+1389.957226782" lastFinishedPulling="2026-03-13 15:26:09.750484912 +0000 UTC m=+1399.913696723" observedRunningTime="2026-03-13 15:26:10.044404639 +0000 UTC m=+1400.207616460" watchObservedRunningTime="2026-03-13 15:26:10.04567965 +0000 UTC m=+1400.208891461" Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.449111 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-t5l58"] Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.458614 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-t5l58"] Mar 13 15:26:10 crc kubenswrapper[4786]: I0313 15:26:10.566938 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4b8359-292c-4c01-85b0-6969cb325afa" path="/var/lib/kubelet/pods/0b4b8359-292c-4c01-85b0-6969cb325afa/volumes" Mar 13 15:26:11 crc kubenswrapper[4786]: I0313 15:26:11.050830 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerID="99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480" exitCode=2 Mar 13 15:26:11 crc kubenswrapper[4786]: I0313 15:26:11.050872 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerID="ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b" exitCode=0 Mar 13 15:26:11 crc kubenswrapper[4786]: I0313 15:26:11.050880 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b29184d-5d3e-45d6-add8-50082c1d56b6" 
containerID="790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484" exitCode=0 Mar 13 15:26:11 crc kubenswrapper[4786]: I0313 15:26:11.050884 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerDied","Data":"99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480"} Mar 13 15:26:11 crc kubenswrapper[4786]: I0313 15:26:11.050939 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerDied","Data":"ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b"} Mar 13 15:26:11 crc kubenswrapper[4786]: I0313 15:26:11.050955 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerDied","Data":"790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484"} Mar 13 15:26:19 crc kubenswrapper[4786]: I0313 15:26:19.146462 4786 generic.go:334] "Generic (PLEG): container finished" podID="5d32db15-e92e-4497-82ed-94cfe47acde6" containerID="9a6f3ca38fa64af3b952f62a4de858e3db59344ce667e5e2776aba24da0b0bf9" exitCode=0 Mar 13 15:26:19 crc kubenswrapper[4786]: I0313 15:26:19.146602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s28qj" event={"ID":"5d32db15-e92e-4497-82ed-94cfe47acde6","Type":"ContainerDied","Data":"9a6f3ca38fa64af3b952f62a4de858e3db59344ce667e5e2776aba24da0b0bf9"} Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.473389 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.563366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-config-data\") pod \"5d32db15-e92e-4497-82ed-94cfe47acde6\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.563806 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swsjx\" (UniqueName: \"kubernetes.io/projected/5d32db15-e92e-4497-82ed-94cfe47acde6-kube-api-access-swsjx\") pod \"5d32db15-e92e-4497-82ed-94cfe47acde6\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.564180 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-scripts\") pod \"5d32db15-e92e-4497-82ed-94cfe47acde6\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.564319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-combined-ca-bundle\") pod \"5d32db15-e92e-4497-82ed-94cfe47acde6\" (UID: \"5d32db15-e92e-4497-82ed-94cfe47acde6\") " Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.569459 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-scripts" (OuterVolumeSpecName: "scripts") pod "5d32db15-e92e-4497-82ed-94cfe47acde6" (UID: "5d32db15-e92e-4497-82ed-94cfe47acde6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.572166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d32db15-e92e-4497-82ed-94cfe47acde6-kube-api-access-swsjx" (OuterVolumeSpecName: "kube-api-access-swsjx") pod "5d32db15-e92e-4497-82ed-94cfe47acde6" (UID: "5d32db15-e92e-4497-82ed-94cfe47acde6"). InnerVolumeSpecName "kube-api-access-swsjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.589507 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-config-data" (OuterVolumeSpecName: "config-data") pod "5d32db15-e92e-4497-82ed-94cfe47acde6" (UID: "5d32db15-e92e-4497-82ed-94cfe47acde6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.632094 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d32db15-e92e-4497-82ed-94cfe47acde6" (UID: "5d32db15-e92e-4497-82ed-94cfe47acde6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.667690 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swsjx\" (UniqueName: \"kubernetes.io/projected/5d32db15-e92e-4497-82ed-94cfe47acde6-kube-api-access-swsjx\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.667739 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.667758 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:20 crc kubenswrapper[4786]: I0313 15:26:20.667773 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32db15-e92e-4497-82ed-94cfe47acde6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.169510 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s28qj" event={"ID":"5d32db15-e92e-4497-82ed-94cfe47acde6","Type":"ContainerDied","Data":"5d0fb48ef00c1244f8df01ed82de5eecf6350803200cf10c2c65753aa08c5c59"} Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.169573 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d0fb48ef00c1244f8df01ed82de5eecf6350803200cf10c2c65753aa08c5c59" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.169601 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s28qj" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.308365 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 15:26:21 crc kubenswrapper[4786]: E0313 15:26:21.308920 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7" containerName="oc" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.308942 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7" containerName="oc" Mar 13 15:26:21 crc kubenswrapper[4786]: E0313 15:26:21.308970 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d32db15-e92e-4497-82ed-94cfe47acde6" containerName="nova-cell0-conductor-db-sync" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.308979 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d32db15-e92e-4497-82ed-94cfe47acde6" containerName="nova-cell0-conductor-db-sync" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.309183 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7" containerName="oc" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.309219 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d32db15-e92e-4497-82ed-94cfe47acde6" containerName="nova-cell0-conductor-db-sync" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.309940 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.312584 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kx4pn" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.313426 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.318991 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.379729 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.380101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.380282 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qskg7\" (UniqueName: \"kubernetes.io/projected/2c66255a-19d5-4417-bf43-f7f5bfff892a-kube-api-access-qskg7\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.481613 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.481716 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qskg7\" (UniqueName: \"kubernetes.io/projected/2c66255a-19d5-4417-bf43-f7f5bfff892a-kube-api-access-qskg7\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.481799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.487208 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.494192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.504501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qskg7\" (UniqueName: \"kubernetes.io/projected/2c66255a-19d5-4417-bf43-f7f5bfff892a-kube-api-access-qskg7\") pod \"nova-cell0-conductor-0\" (UID: 
\"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:21 crc kubenswrapper[4786]: I0313 15:26:21.638282 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:22 crc kubenswrapper[4786]: I0313 15:26:22.221960 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 15:26:23 crc kubenswrapper[4786]: I0313 15:26:23.194648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c66255a-19d5-4417-bf43-f7f5bfff892a","Type":"ContainerStarted","Data":"8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0"} Mar 13 15:26:23 crc kubenswrapper[4786]: I0313 15:26:23.195148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c66255a-19d5-4417-bf43-f7f5bfff892a","Type":"ContainerStarted","Data":"3a5b4bf2950734fade486ffb993657244ff2ff0cbddd1dfdee6d161458e13991"} Mar 13 15:26:23 crc kubenswrapper[4786]: I0313 15:26:23.197168 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:23 crc kubenswrapper[4786]: I0313 15:26:23.231943 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.23191674 podStartE2EDuration="2.23191674s" podCreationTimestamp="2026-03-13 15:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:23.219360869 +0000 UTC m=+1413.382572720" watchObservedRunningTime="2026-03-13 15:26:23.23191674 +0000 UTC m=+1413.395128591" Mar 13 15:26:29 crc kubenswrapper[4786]: I0313 15:26:29.296182 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="proxy-httpd" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 15:26:31 crc kubenswrapper[4786]: I0313 15:26:31.669096 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.114806 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cvs2v"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.115940 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.120780 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.120985 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.131424 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cvs2v"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.212359 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-config-data\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.212442 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.212467 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msqtp\" (UniqueName: \"kubernetes.io/projected/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-kube-api-access-msqtp\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.212573 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-scripts\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.321958 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.325242 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.335435 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.369460 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-config-data\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.369587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.369616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msqtp\" (UniqueName: \"kubernetes.io/projected/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-kube-api-access-msqtp\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.369838 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-scripts\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.377638 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.384297 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-config-data\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.391264 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.396434 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.407563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.413951 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.419704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-scripts\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.434559 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msqtp\" (UniqueName: \"kubernetes.io/projected/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-kube-api-access-msqtp\") pod \"nova-cell0-cell-mapping-cvs2v\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 
15:26:32.450621 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.477946 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489054 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1c56d-85df-423c-8151-25ad2e681203-logs\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489198 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5glvw\" (UniqueName: \"kubernetes.io/projected/bce7950c-6bdd-4565-bbea-4020273d4230-kube-api-access-5glvw\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489218 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489255 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-config-data\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489325 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-config-data\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvm4q\" (UniqueName: \"kubernetes.io/projected/7ee1c56d-85df-423c-8151-25ad2e681203-kube-api-access-wvm4q\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.489415 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.546906 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.548707 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.553670 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599776 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599832 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e68ea03-384f-49ba-9ff6-871c7162797a-logs\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599886 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5glvw\" (UniqueName: \"kubernetes.io/projected/bce7950c-6bdd-4565-bbea-4020273d4230-kube-api-access-5glvw\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599902 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599931 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-config-data\") pod \"nova-scheduler-0\" 
(UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599969 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-config-data\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.599989 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvm4q\" (UniqueName: \"kubernetes.io/projected/7ee1c56d-85df-423c-8151-25ad2e681203-kube-api-access-wvm4q\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.600024 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-config-data\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.600041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.600132 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvcrc\" (UniqueName: \"kubernetes.io/projected/8e68ea03-384f-49ba-9ff6-871c7162797a-kube-api-access-zvcrc\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.600168 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1c56d-85df-423c-8151-25ad2e681203-logs\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.600647 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1c56d-85df-423c-8151-25ad2e681203-logs\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.607072 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.612196 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-config-data\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.618981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.619273 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-config-data\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 
15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.619954 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.627518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5glvw\" (UniqueName: \"kubernetes.io/projected/bce7950c-6bdd-4565-bbea-4020273d4230-kube-api-access-5glvw\") pod \"nova-scheduler-0\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") " pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.631165 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.633308 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvm4q\" (UniqueName: \"kubernetes.io/projected/7ee1c56d-85df-423c-8151-25ad2e681203-kube-api-access-wvm4q\") pod \"nova-api-0\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") " pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.640624 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.641911 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.645295 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.665127 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69b4446475-9spf5"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.666803 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.679075 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.701804 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vnn6\" (UniqueName: \"kubernetes.io/projected/a6e0b566-6e65-4b10-965f-ffd19a54feaf-kube-api-access-2vnn6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.701884 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvcrc\" (UniqueName: \"kubernetes.io/projected/8e68ea03-384f-49ba-9ff6-871c7162797a-kube-api-access-zvcrc\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.701923 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.701942 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e68ea03-384f-49ba-9ff6-871c7162797a-logs\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.701978 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.701993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702048 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702120 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-config\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702147 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-config-data\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-svc\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702198 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjd6b\" (UniqueName: \"kubernetes.io/projected/1c673124-31a1-48ec-a799-19ab9be5469a-kube-api-access-xjd6b\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.702219 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.703336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e68ea03-384f-49ba-9ff6-871c7162797a-logs\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.717319 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-9spf5"] Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.719164 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.721563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-config-data\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.730690 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvcrc\" (UniqueName: \"kubernetes.io/projected/8e68ea03-384f-49ba-9ff6-871c7162797a-kube-api-access-zvcrc\") pod \"nova-metadata-0\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803443 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803559 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-config\") pod 
\"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-svc\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803744 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjd6b\" (UniqueName: \"kubernetes.io/projected/1c673124-31a1-48ec-a799-19ab9be5469a-kube-api-access-xjd6b\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803839 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.803991 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vnn6\" (UniqueName: \"kubernetes.io/projected/a6e0b566-6e65-4b10-965f-ffd19a54feaf-kube-api-access-2vnn6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.804121 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: 
\"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.804220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.805191 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-nb\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.805377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-sb\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.805657 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-swift-storage-0\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.805952 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-svc\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" 
Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.806000 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-config\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.818668 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.819309 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.828623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vnn6\" (UniqueName: \"kubernetes.io/projected/a6e0b566-6e65-4b10-965f-ffd19a54feaf-kube-api-access-2vnn6\") pod \"nova-cell1-novncproxy-0\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.828915 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjd6b\" (UniqueName: \"kubernetes.io/projected/1c673124-31a1-48ec-a799-19ab9be5469a-kube-api-access-xjd6b\") pod \"dnsmasq-dns-69b4446475-9spf5\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.922872 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.959334 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.973767 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:32 crc kubenswrapper[4786]: I0313 15:26:32.994067 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.247251 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cvs2v"] Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.351613 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.395428 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cvs2v" event={"ID":"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4","Type":"ContainerStarted","Data":"8a9b0e9abf13c238108b2055bbb1b13da34cf936573e149fea18f7ded9897c44"} Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.433781 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.667207 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:26:33 crc kubenswrapper[4786]: W0313 15:26:33.672717 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e68ea03_384f_49ba_9ff6_871c7162797a.slice/crio-e91efc9bde0c64554f0e783378450b5ea436dd69fd2dfcc5797f57c5cedd732c WatchSource:0}: Error finding container e91efc9bde0c64554f0e783378450b5ea436dd69fd2dfcc5797f57c5cedd732c: Status 404 returned error can't find the container with id 
e91efc9bde0c64554f0e783378450b5ea436dd69fd2dfcc5797f57c5cedd732c Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.717398 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmld"] Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.718593 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.720230 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.720848 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.737429 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmld"] Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.819185 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:26:33 crc kubenswrapper[4786]: W0313 15:26:33.823355 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6e0b566_6e65_4b10_965f_ffd19a54feaf.slice/crio-2f93f6cd1e7b47323a735b1d59e6e9d9acc7fe770deef1cd8734532a89373ec8 WatchSource:0}: Error finding container 2f93f6cd1e7b47323a735b1d59e6e9d9acc7fe770deef1cd8734532a89373ec8: Status 404 returned error can't find the container with id 2f93f6cd1e7b47323a735b1d59e6e9d9acc7fe770deef1cd8734532a89373ec8 Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.825269 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: 
\"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.825355 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-scripts\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.825466 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-config-data\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.825539 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsh5b\" (UniqueName: \"kubernetes.io/projected/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-kube-api-access-dsh5b\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.838681 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-9spf5"] Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.927114 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.927192 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-scripts\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.927313 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-config-data\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.927398 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsh5b\" (UniqueName: \"kubernetes.io/projected/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-kube-api-access-dsh5b\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.934340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-scripts\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.934571 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-config-data\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.934683 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:33 crc kubenswrapper[4786]: I0313 15:26:33.948060 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsh5b\" (UniqueName: \"kubernetes.io/projected/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-kube-api-access-dsh5b\") pod \"nova-cell1-conductor-db-sync-lnmld\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.043527 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.428818 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cvs2v" event={"ID":"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4","Type":"ContainerStarted","Data":"b88c276bd3280d84dd7d091ac5a691295599342583e25d8e6753ca7442b48a94"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.432079 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6e0b566-6e65-4b10-965f-ffd19a54feaf","Type":"ContainerStarted","Data":"2f93f6cd1e7b47323a735b1d59e6e9d9acc7fe770deef1cd8734532a89373ec8"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.433622 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bce7950c-6bdd-4565-bbea-4020273d4230","Type":"ContainerStarted","Data":"60d95a3700ef59e8e6f0cc54ec4f416eddb9136c3024622d3bac7ddf4cc4c889"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.446494 4786 generic.go:334] "Generic (PLEG): container finished" podID="1c673124-31a1-48ec-a799-19ab9be5469a" 
containerID="f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7" exitCode=0 Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.446670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-9spf5" event={"ID":"1c673124-31a1-48ec-a799-19ab9be5469a","Type":"ContainerDied","Data":"f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.446798 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-9spf5" event={"ID":"1c673124-31a1-48ec-a799-19ab9be5469a","Type":"ContainerStarted","Data":"374fc183891be52576d21deac40615a8c5586547a84ff86b58022e9b60951d07"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.452045 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e68ea03-384f-49ba-9ff6-871c7162797a","Type":"ContainerStarted","Data":"e91efc9bde0c64554f0e783378450b5ea436dd69fd2dfcc5797f57c5cedd732c"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.459117 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ee1c56d-85df-423c-8151-25ad2e681203","Type":"ContainerStarted","Data":"44434c2895aa19b144e29c5114036b06f343c91b08678957cc7ffe4ab0fd6bdb"} Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.464515 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cvs2v" podStartSLOduration=2.464488837 podStartE2EDuration="2.464488837s" podCreationTimestamp="2026-03-13 15:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:34.450164332 +0000 UTC m=+1424.613376153" watchObservedRunningTime="2026-03-13 15:26:34.464488837 +0000 UTC m=+1424.627700668" Mar 13 15:26:34 crc kubenswrapper[4786]: I0313 15:26:34.576755 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmld"] Mar 13 15:26:35 crc kubenswrapper[4786]: I0313 15:26:35.894167 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:26:35 crc kubenswrapper[4786]: I0313 15:26:35.907235 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:26:36 crc kubenswrapper[4786]: W0313 15:26:36.738340 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfc0bcf0_d9d4_4cdb_bdbb_a6bd0b008556.slice/crio-003e355fe78260d0725e500592e98a87ef850f86ad3acd445283a0fad70eb08f WatchSource:0}: Error finding container 003e355fe78260d0725e500592e98a87ef850f86ad3acd445283a0fad70eb08f: Status 404 returned error can't find the container with id 003e355fe78260d0725e500592e98a87ef850f86ad3acd445283a0fad70eb08f Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.493466 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmld" event={"ID":"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556","Type":"ContainerStarted","Data":"dc20faaf8cc18fcc57c6cd26a3dd4b742eebc5e5fa114e21cff65868c6520d96"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.494278 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmld" event={"ID":"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556","Type":"ContainerStarted","Data":"003e355fe78260d0725e500592e98a87ef850f86ad3acd445283a0fad70eb08f"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.496064 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6e0b566-6e65-4b10-965f-ffd19a54feaf","Type":"ContainerStarted","Data":"9310b7cce2e4fe802b213a6880009adcc4fa06391c412ca10270cea916666961"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.496217 4786 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-cell1-novncproxy-0" podUID="a6e0b566-6e65-4b10-965f-ffd19a54feaf" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9310b7cce2e4fe802b213a6880009adcc4fa06391c412ca10270cea916666961" gracePeriod=30 Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.498645 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bce7950c-6bdd-4565-bbea-4020273d4230","Type":"ContainerStarted","Data":"a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.503422 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-9spf5" event={"ID":"1c673124-31a1-48ec-a799-19ab9be5469a","Type":"ContainerStarted","Data":"a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.503894 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.507196 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e68ea03-384f-49ba-9ff6-871c7162797a","Type":"ContainerStarted","Data":"97169bbe009e1a8cd5540385dcb980d4b982bb33d4a67d5a28a9e3329c1751a0"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.507247 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e68ea03-384f-49ba-9ff6-871c7162797a","Type":"ContainerStarted","Data":"e6f71e613f469c2bac0f3799e031a811cb3db82f35c00b0f461a98ffb7b7c85e"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.507331 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-log" containerID="cri-o://e6f71e613f469c2bac0f3799e031a811cb3db82f35c00b0f461a98ffb7b7c85e" gracePeriod=30 Mar 
13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.507486 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-metadata" containerID="cri-o://97169bbe009e1a8cd5540385dcb980d4b982bb33d4a67d5a28a9e3329c1751a0" gracePeriod=30 Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.516277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ee1c56d-85df-423c-8151-25ad2e681203","Type":"ContainerStarted","Data":"aa58163b1811bf4e08e92bda5164b74d3c5c5e36ddf0d80825b4497a55b219de"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.516329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ee1c56d-85df-423c-8151-25ad2e681203","Type":"ContainerStarted","Data":"06c5f43e5b49a883415f1dcb2802322135aa1d4ac4bf33e0a364acafc89c04b4"} Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.517450 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lnmld" podStartSLOduration=4.517438583 podStartE2EDuration="4.517438583s" podCreationTimestamp="2026-03-13 15:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:37.507398444 +0000 UTC m=+1427.670610255" watchObservedRunningTime="2026-03-13 15:26:37.517438583 +0000 UTC m=+1427.680650394" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.543718 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.057279232 podStartE2EDuration="5.543695614s" podCreationTimestamp="2026-03-13 15:26:32 +0000 UTC" firstStartedPulling="2026-03-13 15:26:33.392261595 +0000 UTC m=+1423.555473406" lastFinishedPulling="2026-03-13 15:26:36.878677977 +0000 UTC m=+1427.041889788" 
observedRunningTime="2026-03-13 15:26:37.529501392 +0000 UTC m=+1427.692713203" watchObservedRunningTime="2026-03-13 15:26:37.543695614 +0000 UTC m=+1427.706907425" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.561230 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5055403849999998 podStartE2EDuration="5.561209648s" podCreationTimestamp="2026-03-13 15:26:32 +0000 UTC" firstStartedPulling="2026-03-13 15:26:33.825751332 +0000 UTC m=+1423.988963153" lastFinishedPulling="2026-03-13 15:26:36.881420605 +0000 UTC m=+1427.044632416" observedRunningTime="2026-03-13 15:26:37.5528203 +0000 UTC m=+1427.716032111" watchObservedRunningTime="2026-03-13 15:26:37.561209648 +0000 UTC m=+1427.724421459" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.584204 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.369504822 podStartE2EDuration="5.584186398s" podCreationTimestamp="2026-03-13 15:26:32 +0000 UTC" firstStartedPulling="2026-03-13 15:26:33.684022298 +0000 UTC m=+1423.847234109" lastFinishedPulling="2026-03-13 15:26:36.898703874 +0000 UTC m=+1427.061915685" observedRunningTime="2026-03-13 15:26:37.580221639 +0000 UTC m=+1427.743433450" watchObservedRunningTime="2026-03-13 15:26:37.584186398 +0000 UTC m=+1427.747398209" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.606578 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69b4446475-9spf5" podStartSLOduration=5.606561062 podStartE2EDuration="5.606561062s" podCreationTimestamp="2026-03-13 15:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:37.598164134 +0000 UTC m=+1427.761375945" watchObservedRunningTime="2026-03-13 15:26:37.606561062 +0000 UTC m=+1427.769772873" Mar 13 15:26:37 crc 
kubenswrapper[4786]: I0313 15:26:37.630402 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.224128269 podStartE2EDuration="5.630376023s" podCreationTimestamp="2026-03-13 15:26:32 +0000 UTC" firstStartedPulling="2026-03-13 15:26:33.472782362 +0000 UTC m=+1423.635994173" lastFinishedPulling="2026-03-13 15:26:36.879030116 +0000 UTC m=+1427.042241927" observedRunningTime="2026-03-13 15:26:37.620216981 +0000 UTC m=+1427.783428802" watchObservedRunningTime="2026-03-13 15:26:37.630376023 +0000 UTC m=+1427.793587834" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.632016 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 15:26:37 crc kubenswrapper[4786]: I0313 15:26:37.974278 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:26:38 crc kubenswrapper[4786]: I0313 15:26:38.531596 4786 generic.go:334] "Generic (PLEG): container finished" podID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerID="e6f71e613f469c2bac0f3799e031a811cb3db82f35c00b0f461a98ffb7b7c85e" exitCode=143 Mar 13 15:26:38 crc kubenswrapper[4786]: I0313 15:26:38.531721 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e68ea03-384f-49ba-9ff6-871c7162797a","Type":"ContainerDied","Data":"e6f71e613f469c2bac0f3799e031a811cb3db82f35c00b0f461a98ffb7b7c85e"} Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.487189 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.555523 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerID="b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1" exitCode=137 Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.561833 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.571816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerDied","Data":"b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1"} Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.571889 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b29184d-5d3e-45d6-add8-50082c1d56b6","Type":"ContainerDied","Data":"bd28880149922e6d72fa1f9ce976f21171c7b72a8d7d1480d67889dcddce399e"} Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.571916 4786 scope.go:117] "RemoveContainer" containerID="b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.596181 4786 scope.go:117] "RemoveContainer" containerID="99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.616617 4786 scope.go:117] "RemoveContainer" containerID="ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.635912 4786 scope.go:117] "RemoveContainer" containerID="790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657071 4786 scope.go:117] "RemoveContainer" containerID="b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1" Mar 13 15:26:40 crc 
kubenswrapper[4786]: I0313 15:26:40.657235 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b45q\" (UniqueName: \"kubernetes.io/projected/6b29184d-5d3e-45d6-add8-50082c1d56b6-kube-api-access-2b45q\") pod \"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657396 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-combined-ca-bundle\") pod \"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657418 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-scripts\") pod \"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657454 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-config-data\") pod \"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657471 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-log-httpd\") pod \"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657553 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-sg-core-conf-yaml\") pod 
\"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.657598 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-run-httpd\") pod \"6b29184d-5d3e-45d6-add8-50082c1d56b6\" (UID: \"6b29184d-5d3e-45d6-add8-50082c1d56b6\") " Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.658529 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.658686 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.659495 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1\": container with ID starting with b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1 not found: ID does not exist" containerID="b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.659541 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1"} err="failed to get container status \"b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1\": rpc error: code = NotFound desc = could not find container \"b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1\": container with ID starting with b558839298f39b0a48401da3596bb309af786c616b5ceebb391db470c56d4cb1 not found: ID does not exist" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.659569 4786 scope.go:117] "RemoveContainer" containerID="99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.660276 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480\": container with ID starting with 99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480 not found: ID does not exist" containerID="99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.660323 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480"} 
err="failed to get container status \"99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480\": rpc error: code = NotFound desc = could not find container \"99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480\": container with ID starting with 99bbc962f7e9163b23ba9cb7e18a3b0cb8ad70adb48c332b8bd3501a783b0480 not found: ID does not exist" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.660359 4786 scope.go:117] "RemoveContainer" containerID="ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.661449 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b\": container with ID starting with ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b not found: ID does not exist" containerID="ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.661491 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b"} err="failed to get container status \"ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b\": rpc error: code = NotFound desc = could not find container \"ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b\": container with ID starting with ccf0a3af17281a6d26839add0f8ed4d8e2b34aa53584a496feeea423701dee1b not found: ID does not exist" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.661558 4786 scope.go:117] "RemoveContainer" containerID="790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.662455 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484\": container with ID starting with 790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484 not found: ID does not exist" containerID="790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.662498 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484"} err="failed to get container status \"790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484\": rpc error: code = NotFound desc = could not find container \"790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484\": container with ID starting with 790665c2fd47067b9ca6c2261e50138c527e5f615873a0f1577352c41337b484 not found: ID does not exist" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.664427 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-scripts" (OuterVolumeSpecName: "scripts") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.664553 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b29184d-5d3e-45d6-add8-50082c1d56b6-kube-api-access-2b45q" (OuterVolumeSpecName: "kube-api-access-2b45q") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "kube-api-access-2b45q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.694969 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.759548 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.759581 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.759590 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.759600 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b29184d-5d3e-45d6-add8-50082c1d56b6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.759609 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b45q\" (UniqueName: \"kubernetes.io/projected/6b29184d-5d3e-45d6-add8-50082c1d56b6-kube-api-access-2b45q\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.775122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.790477 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-config-data" (OuterVolumeSpecName: "config-data") pod "6b29184d-5d3e-45d6-add8-50082c1d56b6" (UID: "6b29184d-5d3e-45d6-add8-50082c1d56b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.862182 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.862246 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b29184d-5d3e-45d6-add8-50082c1d56b6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.918971 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.931487 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.944957 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.945464 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="proxy-httpd" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945492 4786 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="proxy-httpd" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.945528 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-notification-agent" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945537 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-notification-agent" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.945554 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-central-agent" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945562 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-central-agent" Mar 13 15:26:40 crc kubenswrapper[4786]: E0313 15:26:40.945583 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="sg-core" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945591 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="sg-core" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945820 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-central-agent" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945837 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="sg-core" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945874 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="proxy-httpd" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.945891 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" containerName="ceilometer-notification-agent" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.951692 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.957334 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.959417 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:26:40 crc kubenswrapper[4786]: I0313 15:26:40.968089 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065270 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065308 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-log-httpd\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-config-data\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065369 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-scripts\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065386 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jcqp\" (UniqueName: \"kubernetes.io/projected/7462754d-ff0e-45ba-962b-d69596ba9d1d-kube-api-access-6jcqp\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.065458 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-run-httpd\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166699 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-run-httpd\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166801 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166823 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-log-httpd\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-config-data\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166893 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-scripts\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jcqp\" (UniqueName: \"kubernetes.io/projected/7462754d-ff0e-45ba-962b-d69596ba9d1d-kube-api-access-6jcqp\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.166940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 
15:26:41.168168 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-run-httpd\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.168212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-log-httpd\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.171550 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-config-data\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.171987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.172238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.173576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-scripts\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " 
pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.187837 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jcqp\" (UniqueName: \"kubernetes.io/projected/7462754d-ff0e-45ba-962b-d69596ba9d1d-kube-api-access-6jcqp\") pod \"ceilometer-0\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.278219 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.565397 4786 generic.go:334] "Generic (PLEG): container finished" podID="9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" containerID="b88c276bd3280d84dd7d091ac5a691295599342583e25d8e6753ca7442b48a94" exitCode=0 Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.565590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cvs2v" event={"ID":"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4","Type":"ContainerDied","Data":"b88c276bd3280d84dd7d091ac5a691295599342583e25d8e6753ca7442b48a94"} Mar 13 15:26:41 crc kubenswrapper[4786]: I0313 15:26:41.757278 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:26:41 crc kubenswrapper[4786]: W0313 15:26:41.765310 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7462754d_ff0e_45ba_962b_d69596ba9d1d.slice/crio-de9f8c58ecb5e05d06c9bd22b9bb08b28e89a28e4e48bec507a11017914fedb7 WatchSource:0}: Error finding container de9f8c58ecb5e05d06c9bd22b9bb08b28e89a28e4e48bec507a11017914fedb7: Status 404 returned error can't find the container with id de9f8c58ecb5e05d06c9bd22b9bb08b28e89a28e4e48bec507a11017914fedb7 Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.566150 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b29184d-5d3e-45d6-add8-50082c1d56b6" 
path="/var/lib/kubelet/pods/6b29184d-5d3e-45d6-add8-50082c1d56b6/volumes" Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.583792 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerStarted","Data":"61b2ace046d99002dc159d84e1b5024fabf2d7a51f930bebda31369942937842"} Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.583842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerStarted","Data":"de9f8c58ecb5e05d06c9bd22b9bb08b28e89a28e4e48bec507a11017914fedb7"} Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.631985 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.695593 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.923690 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.924025 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.982473 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:42 crc kubenswrapper[4786]: I0313 15:26:42.996420 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.097839 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-4jq58"] Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.100118 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerName="dnsmasq-dns" containerID="cri-o://1a635506d88c5049e300f4b6bd956300aa9a7758cc6f9b567b6c714e5f08cbd1" gracePeriod=10 Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.101499 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-combined-ca-bundle\") pod \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.101649 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msqtp\" (UniqueName: \"kubernetes.io/projected/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-kube-api-access-msqtp\") pod \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.101831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-config-data\") pod \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.101907 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-scripts\") pod \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\" (UID: \"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.132485 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-scripts" (OuterVolumeSpecName: "scripts") pod "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" (UID: "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.132609 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-kube-api-access-msqtp" (OuterVolumeSpecName: "kube-api-access-msqtp") pod "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" (UID: "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4"). InnerVolumeSpecName "kube-api-access-msqtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.153022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-config-data" (OuterVolumeSpecName: "config-data") pod "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" (UID: "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.168126 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" (UID: "9f4b51a9-f1dd-4b95-85a6-fb2098b786e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.204059 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.204097 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.204106 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.204116 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msqtp\" (UniqueName: \"kubernetes.io/projected/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4-kube-api-access-msqtp\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.599660 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cvs2v" event={"ID":"9f4b51a9-f1dd-4b95-85a6-fb2098b786e4","Type":"ContainerDied","Data":"8a9b0e9abf13c238108b2055bbb1b13da34cf936573e149fea18f7ded9897c44"} Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.599923 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a9b0e9abf13c238108b2055bbb1b13da34cf936573e149fea18f7ded9897c44" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.599722 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cvs2v" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.603599 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerStarted","Data":"7089b3a8dc1c06bc37f73a36e9c24ac8bdea575b2b2101c2ecb2eac66b2c4231"} Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.626182 4786 generic.go:334] "Generic (PLEG): container finished" podID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerID="1a635506d88c5049e300f4b6bd956300aa9a7758cc6f9b567b6c714e5f08cbd1" exitCode=0 Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.626266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" event={"ID":"ab472ede-43a0-40ac-8e23-81798838d0dc","Type":"ContainerDied","Data":"1a635506d88c5049e300f4b6bd956300aa9a7758cc6f9b567b6c714e5f08cbd1"} Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.626312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" event={"ID":"ab472ede-43a0-40ac-8e23-81798838d0dc","Type":"ContainerDied","Data":"9378f6bfc7849ff1a7c5f0242a0eec3f150a7b7938c428264fe5aaf9feaa980f"} Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.626329 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9378f6bfc7849ff1a7c5f0242a0eec3f150a7b7938c428264fe5aaf9feaa980f" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.658737 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.660211 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.711902 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-config\") pod \"ab472ede-43a0-40ac-8e23-81798838d0dc\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.712162 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcr5f\" (UniqueName: \"kubernetes.io/projected/ab472ede-43a0-40ac-8e23-81798838d0dc-kube-api-access-fcr5f\") pod \"ab472ede-43a0-40ac-8e23-81798838d0dc\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.712226 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-swift-storage-0\") pod \"ab472ede-43a0-40ac-8e23-81798838d0dc\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.712257 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-nb\") pod \"ab472ede-43a0-40ac-8e23-81798838d0dc\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.712273 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-sb\") pod \"ab472ede-43a0-40ac-8e23-81798838d0dc\" (UID: 
\"ab472ede-43a0-40ac-8e23-81798838d0dc\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.712391 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-svc\") pod \"ab472ede-43a0-40ac-8e23-81798838d0dc\" (UID: \"ab472ede-43a0-40ac-8e23-81798838d0dc\") " Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.735936 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab472ede-43a0-40ac-8e23-81798838d0dc-kube-api-access-fcr5f" (OuterVolumeSpecName: "kube-api-access-fcr5f") pod "ab472ede-43a0-40ac-8e23-81798838d0dc" (UID: "ab472ede-43a0-40ac-8e23-81798838d0dc"). InnerVolumeSpecName "kube-api-access-fcr5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.736904 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.737117 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-log" containerID="cri-o://06c5f43e5b49a883415f1dcb2802322135aa1d4ac4bf33e0a364acafc89c04b4" gracePeriod=30 Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.737570 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-api" containerID="cri-o://aa58163b1811bf4e08e92bda5164b74d3c5c5e36ddf0d80825b4497a55b219de" gracePeriod=30 Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.748102 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Mar 13 15:26:43 crc 
kubenswrapper[4786]: I0313 15:26:43.752214 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.759731 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.821081 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcr5f\" (UniqueName: \"kubernetes.io/projected/ab472ede-43a0-40ac-8e23-81798838d0dc-kube-api-access-fcr5f\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.946106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ab472ede-43a0-40ac-8e23-81798838d0dc" (UID: "ab472ede-43a0-40ac-8e23-81798838d0dc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.951030 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab472ede-43a0-40ac-8e23-81798838d0dc" (UID: "ab472ede-43a0-40ac-8e23-81798838d0dc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.954585 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-config" (OuterVolumeSpecName: "config") pod "ab472ede-43a0-40ac-8e23-81798838d0dc" (UID: "ab472ede-43a0-40ac-8e23-81798838d0dc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.955272 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab472ede-43a0-40ac-8e23-81798838d0dc" (UID: "ab472ede-43a0-40ac-8e23-81798838d0dc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:26:43 crc kubenswrapper[4786]: I0313 15:26:43.962076 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab472ede-43a0-40ac-8e23-81798838d0dc" (UID: "ab472ede-43a0-40ac-8e23-81798838d0dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.024756 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.024808 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.024823 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.024838 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:44 crc 
kubenswrapper[4786]: I0313 15:26:44.024873 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab472ede-43a0-40ac-8e23-81798838d0dc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.637501 4786 generic.go:334] "Generic (PLEG): container finished" podID="dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" containerID="dc20faaf8cc18fcc57c6cd26a3dd4b742eebc5e5fa114e21cff65868c6520d96" exitCode=0 Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.637568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmld" event={"ID":"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556","Type":"ContainerDied","Data":"dc20faaf8cc18fcc57c6cd26a3dd4b742eebc5e5fa114e21cff65868c6520d96"} Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.641322 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerStarted","Data":"113a12c3a31ae7b16aa2285b35f4e9de9e7133bde0f6fed4e46c89912b32b68f"} Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.643407 4786 generic.go:334] "Generic (PLEG): container finished" podID="7ee1c56d-85df-423c-8151-25ad2e681203" containerID="06c5f43e5b49a883415f1dcb2802322135aa1d4ac4bf33e0a364acafc89c04b4" exitCode=143 Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.643497 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b85ccffc-4jq58" Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.643492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ee1c56d-85df-423c-8151-25ad2e681203","Type":"ContainerDied","Data":"06c5f43e5b49a883415f1dcb2802322135aa1d4ac4bf33e0a364acafc89c04b4"} Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.712446 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-4jq58"] Mar 13 15:26:44 crc kubenswrapper[4786]: I0313 15:26:44.720951 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b85ccffc-4jq58"] Mar 13 15:26:45 crc kubenswrapper[4786]: I0313 15:26:45.654313 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bce7950c-6bdd-4565-bbea-4020273d4230" containerName="nova-scheduler-scheduler" containerID="cri-o://a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b" gracePeriod=30 Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.073483 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmld" Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.167732 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-config-data\") pod \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.167792 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-scripts\") pod \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.167813 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsh5b\" (UniqueName: \"kubernetes.io/projected/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-kube-api-access-dsh5b\") pod \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.167831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-combined-ca-bundle\") pod \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\" (UID: \"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556\") " Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.173495 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-scripts" (OuterVolumeSpecName: "scripts") pod "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" (UID: "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.174773 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-kube-api-access-dsh5b" (OuterVolumeSpecName: "kube-api-access-dsh5b") pod "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" (UID: "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556"). InnerVolumeSpecName "kube-api-access-dsh5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.217718 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" (UID: "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.219963 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-config-data" (OuterVolumeSpecName: "config-data") pod "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" (UID: "dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.270150 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.270187 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.270199 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsh5b\" (UniqueName: \"kubernetes.io/projected/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-kube-api-access-dsh5b\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.270213 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.581823 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" path="/var/lib/kubelet/pods/ab472ede-43a0-40ac-8e23-81798838d0dc/volumes"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.664895 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmld" event={"ID":"dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556","Type":"ContainerDied","Data":"003e355fe78260d0725e500592e98a87ef850f86ad3acd445283a0fad70eb08f"}
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.664936 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003e355fe78260d0725e500592e98a87ef850f86ad3acd445283a0fad70eb08f"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.664947 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmld"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.679623 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerStarted","Data":"5ffcd6c46edff893480dd233acb3914e0943e1b48da5c31485c925e8a9c7efec"}
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.680530 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.728396 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.07336314 podStartE2EDuration="6.728365671s" podCreationTimestamp="2026-03-13 15:26:40 +0000 UTC" firstStartedPulling="2026-03-13 15:26:41.768736437 +0000 UTC m=+1431.931948248" lastFinishedPulling="2026-03-13 15:26:45.423738968 +0000 UTC m=+1435.586950779" observedRunningTime="2026-03-13 15:26:46.703975606 +0000 UTC m=+1436.867187457" watchObservedRunningTime="2026-03-13 15:26:46.728365671 +0000 UTC m=+1436.891577522"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.755270 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 13 15:26:46 crc kubenswrapper[4786]: E0313 15:26:46.755961 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" containerName="nova-cell1-conductor-db-sync"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.756078 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" containerName="nova-cell1-conductor-db-sync"
Mar 13 15:26:46 crc kubenswrapper[4786]: E0313 15:26:46.756185 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerName="init"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.756257 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerName="init"
Mar 13 15:26:46 crc kubenswrapper[4786]: E0313 15:26:46.756348 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" containerName="nova-manage"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.756425 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" containerName="nova-manage"
Mar 13 15:26:46 crc kubenswrapper[4786]: E0313 15:26:46.756507 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerName="dnsmasq-dns"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.756808 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerName="dnsmasq-dns"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.757137 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab472ede-43a0-40ac-8e23-81798838d0dc" containerName="dnsmasq-dns"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.757244 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" containerName="nova-manage"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.757327 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" containerName="nova-cell1-conductor-db-sync"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.758212 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.760619 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.768899 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.880945 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2djlj\" (UniqueName: \"kubernetes.io/projected/f6b8537d-23ab-4c8d-9ca7-b307562baad8-kube-api-access-2djlj\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.881361 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.881659 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.982891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.983263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.983485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2djlj\" (UniqueName: \"kubernetes.io/projected/f6b8537d-23ab-4c8d-9ca7-b307562baad8-kube-api-access-2djlj\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.993427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:46 crc kubenswrapper[4786]: I0313 15:26:46.993608 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:47 crc kubenswrapper[4786]: I0313 15:26:47.003309 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2djlj\" (UniqueName: \"kubernetes.io/projected/f6b8537d-23ab-4c8d-9ca7-b307562baad8-kube-api-access-2djlj\") pod \"nova-cell1-conductor-0\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") " pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:47 crc kubenswrapper[4786]: I0313 15:26:47.086635 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:47 crc kubenswrapper[4786]: W0313 15:26:47.564067 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6b8537d_23ab_4c8d_9ca7_b307562baad8.slice/crio-7cc9c14bbb0fea5ff50b76a0016e4862b7b9658f981a4a53d341159aa0d5ccb3 WatchSource:0}: Error finding container 7cc9c14bbb0fea5ff50b76a0016e4862b7b9658f981a4a53d341159aa0d5ccb3: Status 404 returned error can't find the container with id 7cc9c14bbb0fea5ff50b76a0016e4862b7b9658f981a4a53d341159aa0d5ccb3
Mar 13 15:26:47 crc kubenswrapper[4786]: I0313 15:26:47.564459 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 13 15:26:47 crc kubenswrapper[4786]: E0313 15:26:47.634068 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 13 15:26:47 crc kubenswrapper[4786]: E0313 15:26:47.636512 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 13 15:26:47 crc kubenswrapper[4786]: E0313 15:26:47.637534 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 13 15:26:47 crc kubenswrapper[4786]: E0313 15:26:47.637571 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bce7950c-6bdd-4565-bbea-4020273d4230" containerName="nova-scheduler-scheduler"
Mar 13 15:26:47 crc kubenswrapper[4786]: I0313 15:26:47.688652 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6b8537d-23ab-4c8d-9ca7-b307562baad8","Type":"ContainerStarted","Data":"7cc9c14bbb0fea5ff50b76a0016e4862b7b9658f981a4a53d341159aa0d5ccb3"}
Mar 13 15:26:48 crc kubenswrapper[4786]: I0313 15:26:48.701154 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6b8537d-23ab-4c8d-9ca7-b307562baad8","Type":"ContainerStarted","Data":"ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32"}
Mar 13 15:26:48 crc kubenswrapper[4786]: I0313 15:26:48.702883 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:48 crc kubenswrapper[4786]: I0313 15:26:48.704553 4786 generic.go:334] "Generic (PLEG): container finished" podID="bce7950c-6bdd-4565-bbea-4020273d4230" containerID="a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b" exitCode=0
Mar 13 15:26:48 crc kubenswrapper[4786]: I0313 15:26:48.704582 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bce7950c-6bdd-4565-bbea-4020273d4230","Type":"ContainerDied","Data":"a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b"}
Mar 13 15:26:48 crc kubenswrapper[4786]: I0313 15:26:48.722218 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.722203661 podStartE2EDuration="2.722203661s" podCreationTimestamp="2026-03-13 15:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:48.722004926 +0000 UTC m=+1438.885216747" watchObservedRunningTime="2026-03-13 15:26:48.722203661 +0000 UTC m=+1438.885415462"
Mar 13 15:26:48 crc kubenswrapper[4786]: I0313 15:26:48.954539 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.023751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5glvw\" (UniqueName: \"kubernetes.io/projected/bce7950c-6bdd-4565-bbea-4020273d4230-kube-api-access-5glvw\") pod \"bce7950c-6bdd-4565-bbea-4020273d4230\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.023939 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-combined-ca-bundle\") pod \"bce7950c-6bdd-4565-bbea-4020273d4230\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.024119 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-config-data\") pod \"bce7950c-6bdd-4565-bbea-4020273d4230\" (UID: \"bce7950c-6bdd-4565-bbea-4020273d4230\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.032245 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce7950c-6bdd-4565-bbea-4020273d4230-kube-api-access-5glvw" (OuterVolumeSpecName: "kube-api-access-5glvw") pod "bce7950c-6bdd-4565-bbea-4020273d4230" (UID: "bce7950c-6bdd-4565-bbea-4020273d4230"). InnerVolumeSpecName "kube-api-access-5glvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.053103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bce7950c-6bdd-4565-bbea-4020273d4230" (UID: "bce7950c-6bdd-4565-bbea-4020273d4230"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.066439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-config-data" (OuterVolumeSpecName: "config-data") pod "bce7950c-6bdd-4565-bbea-4020273d4230" (UID: "bce7950c-6bdd-4565-bbea-4020273d4230"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.127305 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5glvw\" (UniqueName: \"kubernetes.io/projected/bce7950c-6bdd-4565-bbea-4020273d4230-kube-api-access-5glvw\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.127340 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.127349 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bce7950c-6bdd-4565-bbea-4020273d4230-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.725793 4786 generic.go:334] "Generic (PLEG): container finished" podID="7ee1c56d-85df-423c-8151-25ad2e681203" containerID="aa58163b1811bf4e08e92bda5164b74d3c5c5e36ddf0d80825b4497a55b219de" exitCode=0
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.725889 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ee1c56d-85df-423c-8151-25ad2e681203","Type":"ContainerDied","Data":"aa58163b1811bf4e08e92bda5164b74d3c5c5e36ddf0d80825b4497a55b219de"}
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.725923 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ee1c56d-85df-423c-8151-25ad2e681203","Type":"ContainerDied","Data":"44434c2895aa19b144e29c5114036b06f343c91b08678957cc7ffe4ab0fd6bdb"}
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.725938 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44434c2895aa19b144e29c5114036b06f343c91b08678957cc7ffe4ab0fd6bdb"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.735753 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.736250 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bce7950c-6bdd-4565-bbea-4020273d4230","Type":"ContainerDied","Data":"60d95a3700ef59e8e6f0cc54ec4f416eddb9136c3024622d3bac7ddf4cc4c889"}
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.736308 4786 scope.go:117] "RemoveContainer" containerID="a5ebf5d47c3bda5fed80041797374895ffd3e246047c47e900c5f4b65007033b"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.794071 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.826296 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.840688 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvm4q\" (UniqueName: \"kubernetes.io/projected/7ee1c56d-85df-423c-8151-25ad2e681203-kube-api-access-wvm4q\") pod \"7ee1c56d-85df-423c-8151-25ad2e681203\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.840742 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1c56d-85df-423c-8151-25ad2e681203-logs\") pod \"7ee1c56d-85df-423c-8151-25ad2e681203\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.840922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-combined-ca-bundle\") pod \"7ee1c56d-85df-423c-8151-25ad2e681203\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.840966 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-config-data\") pod \"7ee1c56d-85df-423c-8151-25ad2e681203\" (UID: \"7ee1c56d-85df-423c-8151-25ad2e681203\") "
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.841662 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee1c56d-85df-423c-8151-25ad2e681203-logs" (OuterVolumeSpecName: "logs") pod "7ee1c56d-85df-423c-8151-25ad2e681203" (UID: "7ee1c56d-85df-423c-8151-25ad2e681203"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.852094 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.861227 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee1c56d-85df-423c-8151-25ad2e681203-kube-api-access-wvm4q" (OuterVolumeSpecName: "kube-api-access-wvm4q") pod "7ee1c56d-85df-423c-8151-25ad2e681203" (UID: "7ee1c56d-85df-423c-8151-25ad2e681203"). InnerVolumeSpecName "kube-api-access-wvm4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.868753 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:26:49 crc kubenswrapper[4786]: E0313 15:26:49.869396 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-api"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.869523 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-api"
Mar 13 15:26:49 crc kubenswrapper[4786]: E0313 15:26:49.869599 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce7950c-6bdd-4565-bbea-4020273d4230" containerName="nova-scheduler-scheduler"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.869653 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce7950c-6bdd-4565-bbea-4020273d4230" containerName="nova-scheduler-scheduler"
Mar 13 15:26:49 crc kubenswrapper[4786]: E0313 15:26:49.869732 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-log"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.869799 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-log"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.870053 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce7950c-6bdd-4565-bbea-4020273d4230" containerName="nova-scheduler-scheduler"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.870599 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-api"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.870708 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" containerName="nova-api-log"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.871424 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.872269 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-config-data" (OuterVolumeSpecName: "config-data") pod "7ee1c56d-85df-423c-8151-25ad2e681203" (UID: "7ee1c56d-85df-423c-8151-25ad2e681203"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.874912 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.877197 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ee1c56d-85df-423c-8151-25ad2e681203" (UID: "7ee1c56d-85df-423c-8151-25ad2e681203"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.883953 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943016 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpkp\" (UniqueName: \"kubernetes.io/projected/e63a2420-9298-4a45-b7c3-89a58226dddd-kube-api-access-lhpkp\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-config-data\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943190 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943292 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvm4q\" (UniqueName: \"kubernetes.io/projected/7ee1c56d-85df-423c-8151-25ad2e681203-kube-api-access-wvm4q\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943304 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1c56d-85df-423c-8151-25ad2e681203-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943315 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:49 crc kubenswrapper[4786]: I0313 15:26:49.943324 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1c56d-85df-423c-8151-25ad2e681203-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.044830 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpkp\" (UniqueName: \"kubernetes.io/projected/e63a2420-9298-4a45-b7c3-89a58226dddd-kube-api-access-lhpkp\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.044934 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-config-data\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.045006 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.049454 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.052340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-config-data\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.061493 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpkp\" (UniqueName: \"kubernetes.io/projected/e63a2420-9298-4a45-b7c3-89a58226dddd-kube-api-access-lhpkp\") pod \"nova-scheduler-0\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") " pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.252787 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.563798 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce7950c-6bdd-4565-bbea-4020273d4230" path="/var/lib/kubelet/pods/bce7950c-6bdd-4565-bbea-4020273d4230/volumes"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.708824 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.746696 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.747455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e63a2420-9298-4a45-b7c3-89a58226dddd","Type":"ContainerStarted","Data":"e8e0fd87900005c54e3cf410934f67403a6731ab0cafeede7233f6d93e0a5342"}
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.863277 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.877587 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.889975 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.892433 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.895060 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.901104 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.959350 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.959402 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.960641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-config-data\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.960828 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.960990 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5fvf\" (UniqueName: \"kubernetes.io/projected/e660c9b2-94b9-4e3b-a92e-c965bacccaef-kube-api-access-l5fvf\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:50 crc kubenswrapper[4786]: I0313 15:26:50.961249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e660c9b2-94b9-4e3b-a92e-c965bacccaef-logs\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.062585 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e660c9b2-94b9-4e3b-a92e-c965bacccaef-logs\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.063264 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-config-data\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.063355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.063422 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5fvf\" (UniqueName: \"kubernetes.io/projected/e660c9b2-94b9-4e3b-a92e-c965bacccaef-kube-api-access-l5fvf\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.064347 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e660c9b2-94b9-4e3b-a92e-c965bacccaef-logs\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.068277 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-config-data\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.075965 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.095324 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5fvf\" (UniqueName: \"kubernetes.io/projected/e660c9b2-94b9-4e3b-a92e-c965bacccaef-kube-api-access-l5fvf\") pod \"nova-api-0\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.211505 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.680473 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.758809 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e660c9b2-94b9-4e3b-a92e-c965bacccaef","Type":"ContainerStarted","Data":"f89ff398ec37d46e656058d85515f02362d35b6aca1d564d3434f4083ea34006"}
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.760583 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e63a2420-9298-4a45-b7c3-89a58226dddd","Type":"ContainerStarted","Data":"74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb"}
Mar 13 15:26:51 crc kubenswrapper[4786]: I0313 15:26:51.787800 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.787781569 podStartE2EDuration="2.787781569s" podCreationTimestamp="2026-03-13 15:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:51.778068819 +0000 UTC m=+1441.941280640" watchObservedRunningTime="2026-03-13 15:26:51.787781569 +0000 UTC m=+1441.950993380"
Mar 13 15:26:52 crc kubenswrapper[4786]: I0313 15:26:52.135886 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 13 15:26:52 crc kubenswrapper[4786]: I0313 15:26:52.564787 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee1c56d-85df-423c-8151-25ad2e681203" path="/var/lib/kubelet/pods/7ee1c56d-85df-423c-8151-25ad2e681203/volumes"
Mar 13 15:26:52 crc kubenswrapper[4786]: I0313 15:26:52.773376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e660c9b2-94b9-4e3b-a92e-c965bacccaef","Type":"ContainerStarted","Data":"d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6"}
Mar 13 15:26:52 crc kubenswrapper[4786]: I0313 15:26:52.773741 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e660c9b2-94b9-4e3b-a92e-c965bacccaef","Type":"ContainerStarted","Data":"8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe"}
Mar 13 15:26:52 crc kubenswrapper[4786]: I0313 15:26:52.805140 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.805118701 podStartE2EDuration="2.805118701s" podCreationTimestamp="2026-03-13 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:26:52.792316953 +0000 UTC m=+1442.955528774" watchObservedRunningTime="2026-03-13 15:26:52.805118701 +0000 UTC m=+1442.968330512"
Mar 13 15:26:55 crc kubenswrapper[4786]: I0313 15:26:55.253919 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 13 15:27:00 crc kubenswrapper[4786]: I0313 15:27:00.253031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 13 15:27:00 crc kubenswrapper[4786]: I0313 15:27:00.285115 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 13 15:27:00 crc kubenswrapper[4786]: I0313 15:27:00.863448 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 13 15:27:01 crc kubenswrapper[4786]: I0313 15:27:01.212923 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 15:27:01 crc kubenswrapper[4786]: I0313 15:27:01.212972 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openstack/nova-api-0" Mar 13 15:27:02 crc kubenswrapper[4786]: I0313 15:27:02.295021 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:02 crc kubenswrapper[4786]: I0313 15:27:02.295407 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:06 crc kubenswrapper[4786]: I0313 15:27:06.914764 4786 scope.go:117] "RemoveContainer" containerID="b75d9bc1c31b885ef36a2bda2dbd11ce09de40ca79c24eca1fb59504c54df56e" Mar 13 15:27:07 crc kubenswrapper[4786]: E0313 15:27:07.834174 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e68ea03_384f_49ba_9ff6_871c7162797a.slice/crio-conmon-97169bbe009e1a8cd5540385dcb980d4b982bb33d4a67d5a28a9e3329c1751a0.scope\": RecentStats: unable to find data in memory cache]" Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.909958 4786 generic.go:334] "Generic (PLEG): container finished" podID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerID="97169bbe009e1a8cd5540385dcb980d4b982bb33d4a67d5a28a9e3329c1751a0" exitCode=137 Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.910038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e68ea03-384f-49ba-9ff6-871c7162797a","Type":"ContainerDied","Data":"97169bbe009e1a8cd5540385dcb980d4b982bb33d4a67d5a28a9e3329c1751a0"} Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.910102 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e68ea03-384f-49ba-9ff6-871c7162797a","Type":"ContainerDied","Data":"e91efc9bde0c64554f0e783378450b5ea436dd69fd2dfcc5797f57c5cedd732c"} Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.910116 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e91efc9bde0c64554f0e783378450b5ea436dd69fd2dfcc5797f57c5cedd732c" Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.911215 4786 generic.go:334] "Generic (PLEG): container finished" podID="a6e0b566-6e65-4b10-965f-ffd19a54feaf" containerID="9310b7cce2e4fe802b213a6880009adcc4fa06391c412ca10270cea916666961" exitCode=137 Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.911254 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6e0b566-6e65-4b10-965f-ffd19a54feaf","Type":"ContainerDied","Data":"9310b7cce2e4fe802b213a6880009adcc4fa06391c412ca10270cea916666961"} Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.991331 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:07 crc kubenswrapper[4786]: I0313 15:27:07.998124 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.116614 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-config-data\") pod \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.117088 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-combined-ca-bundle\") pod \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.117287 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-config-data\") pod \"8e68ea03-384f-49ba-9ff6-871c7162797a\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.117449 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vnn6\" (UniqueName: \"kubernetes.io/projected/a6e0b566-6e65-4b10-965f-ffd19a54feaf-kube-api-access-2vnn6\") pod \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\" (UID: \"a6e0b566-6e65-4b10-965f-ffd19a54feaf\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.117615 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvcrc\" (UniqueName: \"kubernetes.io/projected/8e68ea03-384f-49ba-9ff6-871c7162797a-kube-api-access-zvcrc\") pod \"8e68ea03-384f-49ba-9ff6-871c7162797a\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.117710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e68ea03-384f-49ba-9ff6-871c7162797a-logs\") pod \"8e68ea03-384f-49ba-9ff6-871c7162797a\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.117829 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-combined-ca-bundle\") pod \"8e68ea03-384f-49ba-9ff6-871c7162797a\" (UID: \"8e68ea03-384f-49ba-9ff6-871c7162797a\") " Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.118136 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e68ea03-384f-49ba-9ff6-871c7162797a-logs" (OuterVolumeSpecName: "logs") pod "8e68ea03-384f-49ba-9ff6-871c7162797a" (UID: "8e68ea03-384f-49ba-9ff6-871c7162797a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.118697 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e68ea03-384f-49ba-9ff6-871c7162797a-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.122601 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e0b566-6e65-4b10-965f-ffd19a54feaf-kube-api-access-2vnn6" (OuterVolumeSpecName: "kube-api-access-2vnn6") pod "a6e0b566-6e65-4b10-965f-ffd19a54feaf" (UID: "a6e0b566-6e65-4b10-965f-ffd19a54feaf"). InnerVolumeSpecName "kube-api-access-2vnn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.122742 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e68ea03-384f-49ba-9ff6-871c7162797a-kube-api-access-zvcrc" (OuterVolumeSpecName: "kube-api-access-zvcrc") pod "8e68ea03-384f-49ba-9ff6-871c7162797a" (UID: "8e68ea03-384f-49ba-9ff6-871c7162797a"). InnerVolumeSpecName "kube-api-access-zvcrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.143709 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-config-data" (OuterVolumeSpecName: "config-data") pod "8e68ea03-384f-49ba-9ff6-871c7162797a" (UID: "8e68ea03-384f-49ba-9ff6-871c7162797a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.144680 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6e0b566-6e65-4b10-965f-ffd19a54feaf" (UID: "a6e0b566-6e65-4b10-965f-ffd19a54feaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.149616 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e68ea03-384f-49ba-9ff6-871c7162797a" (UID: "8e68ea03-384f-49ba-9ff6-871c7162797a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.158456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-config-data" (OuterVolumeSpecName: "config-data") pod "a6e0b566-6e65-4b10-965f-ffd19a54feaf" (UID: "a6e0b566-6e65-4b10-965f-ffd19a54feaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.233388 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.233436 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.233448 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vnn6\" (UniqueName: \"kubernetes.io/projected/a6e0b566-6e65-4b10-965f-ffd19a54feaf-kube-api-access-2vnn6\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.233466 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvcrc\" (UniqueName: \"kubernetes.io/projected/8e68ea03-384f-49ba-9ff6-871c7162797a-kube-api-access-zvcrc\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.233478 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e68ea03-384f-49ba-9ff6-871c7162797a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.233489 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a6e0b566-6e65-4b10-965f-ffd19a54feaf-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.923896 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.925008 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.925355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a6e0b566-6e65-4b10-965f-ffd19a54feaf","Type":"ContainerDied","Data":"2f93f6cd1e7b47323a735b1d59e6e9d9acc7fe770deef1cd8734532a89373ec8"} Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.925394 4786 scope.go:117] "RemoveContainer" containerID="9310b7cce2e4fe802b213a6880009adcc4fa06391c412ca10270cea916666961" Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.973602 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:08 crc kubenswrapper[4786]: I0313 15:27:08.988641 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.009176 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.020134 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.038829 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: E0313 15:27:09.039499 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-metadata" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.039562 
4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-metadata" Mar 13 15:27:09 crc kubenswrapper[4786]: E0313 15:27:09.039580 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e0b566-6e65-4b10-965f-ffd19a54feaf" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.039589 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e0b566-6e65-4b10-965f-ffd19a54feaf" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 15:27:09 crc kubenswrapper[4786]: E0313 15:27:09.039610 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-log" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.039621 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-log" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.039911 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-log" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.039986 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" containerName="nova-metadata-metadata" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.040003 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e0b566-6e65-4b10-965f-ffd19a54feaf" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.041678 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.046719 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.049935 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.051886 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.054797 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.054822 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.055068 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.055321 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.065215 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.075664 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.151828 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.151944 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.151985 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152016 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6k9d\" (UniqueName: \"kubernetes.io/projected/3e362102-0c50-415e-8108-82eb18632381-kube-api-access-l6k9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152096 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152465 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-config-data\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152657 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg5jv\" (UniqueName: \"kubernetes.io/projected/2322d84c-3acc-433b-a70a-21b88d0f2aa1-kube-api-access-wg5jv\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.152723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2322d84c-3acc-433b-a70a-21b88d0f2aa1-logs\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.212368 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.212433 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254551 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254598 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg5jv\" (UniqueName: \"kubernetes.io/projected/2322d84c-3acc-433b-a70a-21b88d0f2aa1-kube-api-access-wg5jv\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2322d84c-3acc-433b-a70a-21b88d0f2aa1-logs\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254682 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254723 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254768 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6k9d\" (UniqueName: \"kubernetes.io/projected/3e362102-0c50-415e-8108-82eb18632381-kube-api-access-l6k9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254822 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254847 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.254905 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-config-data\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.255663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2322d84c-3acc-433b-a70a-21b88d0f2aa1-logs\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " 
pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.260991 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.261069 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.261313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-config-data\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.261313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.262417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.262752 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.264815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.274035 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6k9d\" (UniqueName: \"kubernetes.io/projected/3e362102-0c50-415e-8108-82eb18632381-kube-api-access-l6k9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.277254 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg5jv\" (UniqueName: \"kubernetes.io/projected/2322d84c-3acc-433b-a70a-21b88d0f2aa1-kube-api-access-wg5jv\") pod \"nova-metadata-0\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") " pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.383704 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.395061 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.829364 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: W0313 15:27:09.835260 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2322d84c_3acc_433b_a70a_21b88d0f2aa1.slice/crio-013bf940a40626258858b2d34b9ed1ffde3c6b1bc42a322439f61256577dc5f8 WatchSource:0}: Error finding container 013bf940a40626258858b2d34b9ed1ffde3c6b1bc42a322439f61256577dc5f8: Status 404 returned error can't find the container with id 013bf940a40626258858b2d34b9ed1ffde3c6b1bc42a322439f61256577dc5f8 Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.936592 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:27:09 crc kubenswrapper[4786]: I0313 15:27:09.952009 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2322d84c-3acc-433b-a70a-21b88d0f2aa1","Type":"ContainerStarted","Data":"013bf940a40626258858b2d34b9ed1ffde3c6b1bc42a322439f61256577dc5f8"} Mar 13 15:27:10 crc kubenswrapper[4786]: I0313 15:27:10.563999 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e68ea03-384f-49ba-9ff6-871c7162797a" path="/var/lib/kubelet/pods/8e68ea03-384f-49ba-9ff6-871c7162797a/volumes" Mar 13 15:27:10 crc kubenswrapper[4786]: I0313 15:27:10.565214 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e0b566-6e65-4b10-965f-ffd19a54feaf" path="/var/lib/kubelet/pods/a6e0b566-6e65-4b10-965f-ffd19a54feaf/volumes" Mar 13 15:27:10 crc kubenswrapper[4786]: I0313 15:27:10.970843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"2322d84c-3acc-433b-a70a-21b88d0f2aa1","Type":"ContainerStarted","Data":"b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2"} Mar 13 15:27:10 crc kubenswrapper[4786]: I0313 15:27:10.970927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2322d84c-3acc-433b-a70a-21b88d0f2aa1","Type":"ContainerStarted","Data":"5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22"} Mar 13 15:27:10 crc kubenswrapper[4786]: I0313 15:27:10.973156 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e362102-0c50-415e-8108-82eb18632381","Type":"ContainerStarted","Data":"2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150"} Mar 13 15:27:10 crc kubenswrapper[4786]: I0313 15:27:10.973189 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e362102-0c50-415e-8108-82eb18632381","Type":"ContainerStarted","Data":"3cdc000bb5d0f0c5280ba172145d2076ac73bf9e48cabd2a89dc4ed2dd4eecb9"} Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.008718 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.008689505 podStartE2EDuration="3.008689505s" podCreationTimestamp="2026-03-13 15:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:10.997837986 +0000 UTC m=+1461.161049797" watchObservedRunningTime="2026-03-13 15:27:11.008689505 +0000 UTC m=+1461.171901316" Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.023496 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.023478772 podStartE2EDuration="3.023478772s" podCreationTimestamp="2026-03-13 15:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:11.018274263 +0000 UTC m=+1461.181486074" watchObservedRunningTime="2026-03-13 15:27:11.023478772 +0000 UTC m=+1461.186690583" Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.217052 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.218652 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.231112 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.293814 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 15:27:11 crc kubenswrapper[4786]: I0313 15:27:11.989066 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.175937 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-2k7lw"] Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.177525 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.196922 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-2k7lw"] Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.345740 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.345830 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfv6x\" (UniqueName: \"kubernetes.io/projected/4a092179-7f71-47ce-9764-df909331a819-kube-api-access-hfv6x\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.345872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-config\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.345928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.345946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.345975 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.448009 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.448129 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.448213 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfv6x\" (UniqueName: \"kubernetes.io/projected/4a092179-7f71-47ce-9764-df909331a819-kube-api-access-hfv6x\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.448243 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-config\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.448306 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.448321 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.449295 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-svc\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.449799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-swift-storage-0\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.450454 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb\") pod 
\"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.450657 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-config\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.451003 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-nb\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.474161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfv6x\" (UniqueName: \"kubernetes.io/projected/4a092179-7f71-47ce-9764-df909331a819-kube-api-access-hfv6x\") pod \"dnsmasq-dns-fdb8f6449-2k7lw\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:12 crc kubenswrapper[4786]: I0313 15:27:12.509339 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:13 crc kubenswrapper[4786]: I0313 15:27:13.036677 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-2k7lw"] Mar 13 15:27:13 crc kubenswrapper[4786]: W0313 15:27:13.039592 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a092179_7f71_47ce_9764_df909331a819.slice/crio-ca9e2922a461a1501211db7fa9e74b9e832dcc502667cdbe57945185d0cd0787 WatchSource:0}: Error finding container ca9e2922a461a1501211db7fa9e74b9e832dcc502667cdbe57945185d0cd0787: Status 404 returned error can't find the container with id ca9e2922a461a1501211db7fa9e74b9e832dcc502667cdbe57945185d0cd0787 Mar 13 15:27:14 crc kubenswrapper[4786]: I0313 15:27:14.001895 4786 generic.go:334] "Generic (PLEG): container finished" podID="4a092179-7f71-47ce-9764-df909331a819" containerID="3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca" exitCode=0 Mar 13 15:27:14 crc kubenswrapper[4786]: I0313 15:27:14.002139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" event={"ID":"4a092179-7f71-47ce-9764-df909331a819","Type":"ContainerDied","Data":"3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca"} Mar 13 15:27:14 crc kubenswrapper[4786]: I0313 15:27:14.002679 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" event={"ID":"4a092179-7f71-47ce-9764-df909331a819","Type":"ContainerStarted","Data":"ca9e2922a461a1501211db7fa9e74b9e832dcc502667cdbe57945185d0cd0787"} Mar 13 15:27:14 crc kubenswrapper[4786]: I0313 15:27:14.395184 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.015166 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" 
event={"ID":"4a092179-7f71-47ce-9764-df909331a819","Type":"ContainerStarted","Data":"d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f"} Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.015478 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.050339 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" podStartSLOduration=3.050318121 podStartE2EDuration="3.050318121s" podCreationTimestamp="2026-03-13 15:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:15.037199545 +0000 UTC m=+1465.200411366" watchObservedRunningTime="2026-03-13 15:27:15.050318121 +0000 UTC m=+1465.213529932" Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.229178 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.229436 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-central-agent" containerID="cri-o://61b2ace046d99002dc159d84e1b5024fabf2d7a51f930bebda31369942937842" gracePeriod=30 Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.229566 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="proxy-httpd" containerID="cri-o://5ffcd6c46edff893480dd233acb3914e0943e1b48da5c31485c925e8a9c7efec" gracePeriod=30 Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.229611 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="sg-core" 
containerID="cri-o://113a12c3a31ae7b16aa2285b35f4e9de9e7133bde0f6fed4e46c89912b32b68f" gracePeriod=30 Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.229640 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-notification-agent" containerID="cri-o://7089b3a8dc1c06bc37f73a36e9c24ac8bdea575b2b2101c2ecb2eac66b2c4231" gracePeriod=30 Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.622566 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.624168 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-log" containerID="cri-o://8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe" gracePeriod=30 Mar 13 15:27:15 crc kubenswrapper[4786]: I0313 15:27:15.624202 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-api" containerID="cri-o://d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6" gracePeriod=30 Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.035630 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e660c9b2-94b9-4e3b-a92e-c965bacccaef","Type":"ContainerDied","Data":"8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe"} Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.035718 4786 generic.go:334] "Generic (PLEG): container finished" podID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerID="8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe" exitCode=143 Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.046197 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerID="5ffcd6c46edff893480dd233acb3914e0943e1b48da5c31485c925e8a9c7efec" exitCode=0 Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.046249 4786 generic.go:334] "Generic (PLEG): container finished" podID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerID="113a12c3a31ae7b16aa2285b35f4e9de9e7133bde0f6fed4e46c89912b32b68f" exitCode=2 Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.046258 4786 generic.go:334] "Generic (PLEG): container finished" podID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerID="7089b3a8dc1c06bc37f73a36e9c24ac8bdea575b2b2101c2ecb2eac66b2c4231" exitCode=0 Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.046266 4786 generic.go:334] "Generic (PLEG): container finished" podID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerID="61b2ace046d99002dc159d84e1b5024fabf2d7a51f930bebda31369942937842" exitCode=0 Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.047657 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerDied","Data":"5ffcd6c46edff893480dd233acb3914e0943e1b48da5c31485c925e8a9c7efec"} Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.047712 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerDied","Data":"113a12c3a31ae7b16aa2285b35f4e9de9e7133bde0f6fed4e46c89912b32b68f"} Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.047726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerDied","Data":"7089b3a8dc1c06bc37f73a36e9c24ac8bdea575b2b2101c2ecb2eac66b2c4231"} Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.047737 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerDied","Data":"61b2ace046d99002dc159d84e1b5024fabf2d7a51f930bebda31369942937842"} Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.117510 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.224747 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-run-httpd\") pod \"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.224790 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-scripts\") pod \"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.224884 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-config-data\") pod \"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.224922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-sg-core-conf-yaml\") pod \"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.224943 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jcqp\" (UniqueName: \"kubernetes.io/projected/7462754d-ff0e-45ba-962b-d69596ba9d1d-kube-api-access-6jcqp\") pod 
\"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.225061 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-combined-ca-bundle\") pod \"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.225077 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-log-httpd\") pod \"7462754d-ff0e-45ba-962b-d69596ba9d1d\" (UID: \"7462754d-ff0e-45ba-962b-d69596ba9d1d\") " Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.226349 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.226591 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.233983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7462754d-ff0e-45ba-962b-d69596ba9d1d-kube-api-access-6jcqp" (OuterVolumeSpecName: "kube-api-access-6jcqp") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). InnerVolumeSpecName "kube-api-access-6jcqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.234098 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-scripts" (OuterVolumeSpecName: "scripts") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.250905 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.320143 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.327470 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jcqp\" (UniqueName: \"kubernetes.io/projected/7462754d-ff0e-45ba-962b-d69596ba9d1d-kube-api-access-6jcqp\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.327505 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.327518 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.327530 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.327541 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.327552 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7462754d-ff0e-45ba-962b-d69596ba9d1d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.359524 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-config-data" (OuterVolumeSpecName: "config-data") pod "7462754d-ff0e-45ba-962b-d69596ba9d1d" (UID: "7462754d-ff0e-45ba-962b-d69596ba9d1d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:16 crc kubenswrapper[4786]: I0313 15:27:16.429722 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462754d-ff0e-45ba-962b-d69596ba9d1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.057379 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7462754d-ff0e-45ba-962b-d69596ba9d1d","Type":"ContainerDied","Data":"de9f8c58ecb5e05d06c9bd22b9bb08b28e89a28e4e48bec507a11017914fedb7"} Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.057704 4786 scope.go:117] "RemoveContainer" containerID="5ffcd6c46edff893480dd233acb3914e0943e1b48da5c31485c925e8a9c7efec" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.057836 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.080936 4786 scope.go:117] "RemoveContainer" containerID="113a12c3a31ae7b16aa2285b35f4e9de9e7133bde0f6fed4e46c89912b32b68f" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.087508 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.098988 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.110027 4786 scope.go:117] "RemoveContainer" containerID="7089b3a8dc1c06bc37f73a36e9c24ac8bdea575b2b2101c2ecb2eac66b2c4231" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.113652 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: E0313 15:27:17.116551 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="sg-core" Mar 13 
15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.116572 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="sg-core" Mar 13 15:27:17 crc kubenswrapper[4786]: E0313 15:27:17.116601 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="proxy-httpd" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.116607 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="proxy-httpd" Mar 13 15:27:17 crc kubenswrapper[4786]: E0313 15:27:17.116631 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-central-agent" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.116644 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-central-agent" Mar 13 15:27:17 crc kubenswrapper[4786]: E0313 15:27:17.116681 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-notification-agent" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.116691 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-notification-agent" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.117269 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="proxy-httpd" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.117305 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-central-agent" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.117319 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" 
containerName="sg-core" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.117326 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" containerName="ceilometer-notification-agent" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.122451 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.126277 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.126815 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.160680 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.176425 4786 scope.go:117] "RemoveContainer" containerID="61b2ace046d99002dc159d84e1b5024fabf2d7a51f930bebda31369942937842" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245199 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrwc\" (UniqueName: \"kubernetes.io/projected/f5089f8e-a024-4142-8bd3-dfe2b7004efb-kube-api-access-2qrwc\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245250 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-log-httpd\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245297 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245332 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-scripts\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-config-data\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245615 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-run-httpd\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.245674 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.346998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-scripts\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " 
pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347081 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-config-data\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347120 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-run-httpd\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347153 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347246 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrwc\" (UniqueName: \"kubernetes.io/projected/f5089f8e-a024-4142-8bd3-dfe2b7004efb-kube-api-access-2qrwc\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347276 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-log-httpd\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347329 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-run-httpd\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.347996 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-log-httpd\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.352555 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.353386 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.353531 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-config-data\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.354540 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-scripts\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.369566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrwc\" (UniqueName: \"kubernetes.io/projected/f5089f8e-a024-4142-8bd3-dfe2b7004efb-kube-api-access-2qrwc\") pod \"ceilometer-0\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.398222 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.399045 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.441774 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.442034 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="13eda4d3-ef97-4ed1-889d-bd7b60b91179" containerName="kube-state-metrics" containerID="cri-o://d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5" gracePeriod=30 Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.883189 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:17 crc kubenswrapper[4786]: I0313 15:27:17.961514 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.062088 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68m96\" (UniqueName: \"kubernetes.io/projected/13eda4d3-ef97-4ed1-889d-bd7b60b91179-kube-api-access-68m96\") pod \"13eda4d3-ef97-4ed1-889d-bd7b60b91179\" (UID: \"13eda4d3-ef97-4ed1-889d-bd7b60b91179\") " Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.067148 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13eda4d3-ef97-4ed1-889d-bd7b60b91179-kube-api-access-68m96" (OuterVolumeSpecName: "kube-api-access-68m96") pod "13eda4d3-ef97-4ed1-889d-bd7b60b91179" (UID: "13eda4d3-ef97-4ed1-889d-bd7b60b91179"). InnerVolumeSpecName "kube-api-access-68m96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.070730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerStarted","Data":"1150e7eb0d695fc6922539ca97c39452355231d532758027c0499e9de5894a50"} Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.073037 4786 generic.go:334] "Generic (PLEG): container finished" podID="13eda4d3-ef97-4ed1-889d-bd7b60b91179" containerID="d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5" exitCode=2 Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.073095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"13eda4d3-ef97-4ed1-889d-bd7b60b91179","Type":"ContainerDied","Data":"d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5"} Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.073148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"13eda4d3-ef97-4ed1-889d-bd7b60b91179","Type":"ContainerDied","Data":"f6457d5fad4655eff23797e21535c4e7bb7f64f53eb7cc8595d869689366ddf4"} Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.073106 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.073172 4786 scope.go:117] "RemoveContainer" containerID="d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.115343 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.120365 4786 scope.go:117] "RemoveContainer" containerID="d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5" Mar 13 15:27:18 crc kubenswrapper[4786]: E0313 15:27:18.123548 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5\": container with ID starting with d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5 not found: ID does not exist" containerID="d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.123594 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5"} err="failed to get container status \"d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5\": rpc error: code = NotFound desc = could not find container \"d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5\": container with ID starting with d4d064b37b35ba2fdc0aacf3b7b32e097b959010359e35096382a80c1b0cd6c5 not found: ID does not exist" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.133304 4786 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.141413 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:27:18 crc kubenswrapper[4786]: E0313 15:27:18.141814 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13eda4d3-ef97-4ed1-889d-bd7b60b91179" containerName="kube-state-metrics" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.141830 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="13eda4d3-ef97-4ed1-889d-bd7b60b91179" containerName="kube-state-metrics" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.142090 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="13eda4d3-ef97-4ed1-889d-bd7b60b91179" containerName="kube-state-metrics" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.142629 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.144939 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.145929 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.150115 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.164772 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68m96\" (UniqueName: \"kubernetes.io/projected/13eda4d3-ef97-4ed1-889d-bd7b60b91179-kube-api-access-68m96\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.266148 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.266197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.266222 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxgb\" (UniqueName: \"kubernetes.io/projected/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-api-access-rzxgb\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.266322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.368220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.368282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.368312 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxgb\" (UniqueName: \"kubernetes.io/projected/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-api-access-rzxgb\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.368416 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.373082 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.374013 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.374414 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.389517 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxgb\" (UniqueName: \"kubernetes.io/projected/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-api-access-rzxgb\") pod \"kube-state-metrics-0\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") " pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.461836 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.566054 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13eda4d3-ef97-4ed1-889d-bd7b60b91179" path="/var/lib/kubelet/pods/13eda4d3-ef97-4ed1-889d-bd7b60b91179/volumes" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.567541 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7462754d-ff0e-45ba-962b-d69596ba9d1d" path="/var/lib/kubelet/pods/7462754d-ff0e-45ba-962b-d69596ba9d1d/volumes" Mar 13 15:27:18 crc kubenswrapper[4786]: I0313 15:27:18.918358 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:27:19 crc kubenswrapper[4786]: I0313 15:27:19.084633 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14fdb1f3-fd7f-4b31-ac34-42438c44720a","Type":"ContainerStarted","Data":"e51693fd6cfbfdedf1aea0b3db8284ad8b36ad6d7676caae6af0a1fcd84ee1d2"} Mar 13 15:27:19 crc kubenswrapper[4786]: I0313 15:27:19.086042 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerStarted","Data":"54c19969ec82ba8eab1d86f51f63001d0f6437dcaa731188a732e397cb7fa037"} Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.384513 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.384869 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.398382 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.438664 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.588811 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.704151 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e660c9b2-94b9-4e3b-a92e-c965bacccaef-logs\") pod \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.704221 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-combined-ca-bundle\") pod \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.704330 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5fvf\" (UniqueName: \"kubernetes.io/projected/e660c9b2-94b9-4e3b-a92e-c965bacccaef-kube-api-access-l5fvf\") pod 
\"e660c9b2-94b9-4e3b-a92e-c965bacccaef\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.704510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-config-data\") pod \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\" (UID: \"e660c9b2-94b9-4e3b-a92e-c965bacccaef\") " Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.707029 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e660c9b2-94b9-4e3b-a92e-c965bacccaef-logs" (OuterVolumeSpecName: "logs") pod "e660c9b2-94b9-4e3b-a92e-c965bacccaef" (UID: "e660c9b2-94b9-4e3b-a92e-c965bacccaef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.735395 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e660c9b2-94b9-4e3b-a92e-c965bacccaef-kube-api-access-l5fvf" (OuterVolumeSpecName: "kube-api-access-l5fvf") pod "e660c9b2-94b9-4e3b-a92e-c965bacccaef" (UID: "e660c9b2-94b9-4e3b-a92e-c965bacccaef"). InnerVolumeSpecName "kube-api-access-l5fvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.738955 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-config-data" (OuterVolumeSpecName: "config-data") pod "e660c9b2-94b9-4e3b-a92e-c965bacccaef" (UID: "e660c9b2-94b9-4e3b-a92e-c965bacccaef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.765047 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e660c9b2-94b9-4e3b-a92e-c965bacccaef" (UID: "e660c9b2-94b9-4e3b-a92e-c965bacccaef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.806162 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.806183 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e660c9b2-94b9-4e3b-a92e-c965bacccaef-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.806191 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e660c9b2-94b9-4e3b-a92e-c965bacccaef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:19.806203 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5fvf\" (UniqueName: \"kubernetes.io/projected/e660c9b2-94b9-4e3b-a92e-c965bacccaef-kube-api-access-l5fvf\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.098683 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerStarted","Data":"47be803f556e7898a193dcd67b8d03192aeb9b47c6863e42ba9fa07ef398ad95"} Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.100618 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerID="d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6" exitCode=0 Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.100658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e660c9b2-94b9-4e3b-a92e-c965bacccaef","Type":"ContainerDied","Data":"d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6"} Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.100678 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e660c9b2-94b9-4e3b-a92e-c965bacccaef","Type":"ContainerDied","Data":"f89ff398ec37d46e656058d85515f02362d35b6aca1d564d3434f4083ea34006"} Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.100699 4786 scope.go:117] "RemoveContainer" containerID="d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.100837 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.106901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14fdb1f3-fd7f-4b31-ac34-42438c44720a","Type":"ContainerStarted","Data":"6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b"} Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.107136 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.128271 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.138176 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.295617086 podStartE2EDuration="2.138158774s" podCreationTimestamp="2026-03-13 15:27:18 +0000 UTC" firstStartedPulling="2026-03-13 15:27:18.923804788 +0000 UTC m=+1469.087016599" lastFinishedPulling="2026-03-13 15:27:19.766346476 +0000 UTC m=+1469.929558287" observedRunningTime="2026-03-13 15:27:20.124818033 +0000 UTC m=+1470.288029854" watchObservedRunningTime="2026-03-13 15:27:20.138158774 +0000 UTC m=+1470.301370585" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.144582 4786 scope.go:117] "RemoveContainer" containerID="8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.156740 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.184924 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.193918 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:20 crc kubenswrapper[4786]: E0313 15:27:20.194328 
4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-log" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.194342 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-log" Mar 13 15:27:20 crc kubenswrapper[4786]: E0313 15:27:20.194385 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-api" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.194391 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-api" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.194559 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-api" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.194579 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" containerName="nova-api-log" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.195779 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.199329 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.199586 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.200679 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.230615 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.230781 4786 scope.go:117] "RemoveContainer" containerID="d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6" Mar 13 15:27:20 crc kubenswrapper[4786]: E0313 15:27:20.234115 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6\": container with ID starting with d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6 not found: ID does not exist" containerID="d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.234161 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6"} err="failed to get container status \"d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6\": rpc error: code = NotFound desc = could not find container \"d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6\": container with ID starting with d0884abef458fec0e4cc7f5b23fda55c46da3d239070078ae48d8ae2db041de6 not found: ID does not exist" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.234194 4786 
scope.go:117] "RemoveContainer" containerID="8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe" Mar 13 15:27:20 crc kubenswrapper[4786]: E0313 15:27:20.234523 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe\": container with ID starting with 8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe not found: ID does not exist" containerID="8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.234541 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe"} err="failed to get container status \"8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe\": rpc error: code = NotFound desc = could not find container \"8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe\": container with ID starting with 8869bf015e9ae20ba156378a41ff827f9c3cb175c56d0c91c8a7706878dd57fe not found: ID does not exist" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.316488 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqwb7\" (UniqueName: \"kubernetes.io/projected/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-kube-api-access-qqwb7\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.316553 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-public-tls-certs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.316584 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-logs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.316937 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.316968 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-config-data\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.316983 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.383743 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-57hdr"] Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.384909 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.387104 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.387532 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.393648 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-57hdr"] Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.412999 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.413264 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421005 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-config-data\") pod \"nova-api-0\" (UID: 
\"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421080 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421191 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqwb7\" (UniqueName: \"kubernetes.io/projected/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-kube-api-access-qqwb7\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421234 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-public-tls-certs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421260 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-logs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.421815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-logs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.428055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-config-data\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.431326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.434471 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.434942 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-public-tls-certs\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.438166 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqwb7\" (UniqueName: \"kubernetes.io/projected/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-kube-api-access-qqwb7\") pod \"nova-api-0\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") " pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.522912 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-config-data\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc 
kubenswrapper[4786]: I0313 15:27:20.523028 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-scripts\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.523069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265fr\" (UniqueName: \"kubernetes.io/projected/3c887202-25c6-42a6-975d-fa96e5e8673a-kube-api-access-265fr\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.523126 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.558158 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.564577 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e660c9b2-94b9-4e3b-a92e-c965bacccaef" path="/var/lib/kubelet/pods/e660c9b2-94b9-4e3b-a92e-c965bacccaef/volumes" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.625416 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-config-data\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.625463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-scripts\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.625493 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265fr\" (UniqueName: \"kubernetes.io/projected/3c887202-25c6-42a6-975d-fa96e5e8673a-kube-api-access-265fr\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.625510 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.630669 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.630719 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-config-data\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.633160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-scripts\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.644432 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265fr\" (UniqueName: \"kubernetes.io/projected/3c887202-25c6-42a6-975d-fa96e5e8673a-kube-api-access-265fr\") pod \"nova-cell1-cell-mapping-57hdr\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:20 crc kubenswrapper[4786]: I0313 15:27:20.853162 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:21 crc kubenswrapper[4786]: I0313 15:27:21.132201 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerStarted","Data":"61e893e71f5d32e3d116e00eb47db632836c3a9a1ab8a8825ae704c5493d8c9b"} Mar 13 15:27:21 crc kubenswrapper[4786]: I0313 15:27:21.216849 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:21 crc kubenswrapper[4786]: W0313 15:27:21.379896 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c887202_25c6_42a6_975d_fa96e5e8673a.slice/crio-84fc09b246a542a9d093260b7b87691fc707d3c1f72c63df11f720f441f13afa WatchSource:0}: Error finding container 84fc09b246a542a9d093260b7b87691fc707d3c1f72c63df11f720f441f13afa: Status 404 returned error can't find the container with id 84fc09b246a542a9d093260b7b87691fc707d3c1f72c63df11f720f441f13afa Mar 13 15:27:21 crc kubenswrapper[4786]: I0313 15:27:21.384239 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-57hdr"] Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.144557 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-57hdr" event={"ID":"3c887202-25c6-42a6-975d-fa96e5e8673a","Type":"ContainerStarted","Data":"ab2535b8e1de5140eefc200ad3169b4b8cccfd02a55308f9e31396c3a4208a9d"} Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.144916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-57hdr" event={"ID":"3c887202-25c6-42a6-975d-fa96e5e8673a","Type":"ContainerStarted","Data":"84fc09b246a542a9d093260b7b87691fc707d3c1f72c63df11f720f441f13afa"} Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.147370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e","Type":"ContainerStarted","Data":"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"} Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.147429 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e","Type":"ContainerStarted","Data":"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"} Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.147446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e","Type":"ContainerStarted","Data":"47b0182a16b4c9807fc4234e461ade119c989ed08ae8be914c256d323cc80362"} Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.173968 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-57hdr" podStartSLOduration=2.173952694 podStartE2EDuration="2.173952694s" podCreationTimestamp="2026-03-13 15:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:22.170441187 +0000 UTC m=+1472.333652998" watchObservedRunningTime="2026-03-13 15:27:22.173952694 +0000 UTC m=+1472.337164525" Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.195776 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.195758774 podStartE2EDuration="2.195758774s" podCreationTimestamp="2026-03-13 15:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:22.187751316 +0000 UTC m=+1472.350963127" watchObservedRunningTime="2026-03-13 15:27:22.195758774 +0000 UTC m=+1472.358970585" Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.511056 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.585642 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-9spf5"] Mar 13 15:27:22 crc kubenswrapper[4786]: I0313 15:27:22.585915 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69b4446475-9spf5" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="dnsmasq-dns" containerID="cri-o://a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f" gracePeriod=10 Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.143124 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.174991 4786 generic.go:334] "Generic (PLEG): container finished" podID="1c673124-31a1-48ec-a799-19ab9be5469a" containerID="a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f" exitCode=0 Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.175068 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-9spf5" event={"ID":"1c673124-31a1-48ec-a799-19ab9be5469a","Type":"ContainerDied","Data":"a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f"} Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.175097 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69b4446475-9spf5" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.175127 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69b4446475-9spf5" event={"ID":"1c673124-31a1-48ec-a799-19ab9be5469a","Type":"ContainerDied","Data":"374fc183891be52576d21deac40615a8c5586547a84ff86b58022e9b60951d07"} Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.175148 4786 scope.go:117] "RemoveContainer" containerID="a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.207804 4786 scope.go:117] "RemoveContainer" containerID="f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.245776 4786 scope.go:117] "RemoveContainer" containerID="a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f" Mar 13 15:27:23 crc kubenswrapper[4786]: E0313 15:27:23.246202 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f\": container with ID starting with a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f not found: ID does not exist" containerID="a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.246232 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f"} err="failed to get container status \"a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f\": rpc error: code = NotFound desc = could not find container \"a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f\": container with ID starting with a24779576c09f4d8f71d20ace2fbc77187a8f38709f24760d592428bb54eb47f not found: ID does not exist" Mar 13 15:27:23 crc 
kubenswrapper[4786]: I0313 15:27:23.246249 4786 scope.go:117] "RemoveContainer" containerID="f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7" Mar 13 15:27:23 crc kubenswrapper[4786]: E0313 15:27:23.246532 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7\": container with ID starting with f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7 not found: ID does not exist" containerID="f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.246583 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7"} err="failed to get container status \"f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7\": rpc error: code = NotFound desc = could not find container \"f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7\": container with ID starting with f6ca48c88a9a2c801becfdeb6939d98705308fed83264b8b8f909cfe65342fc7 not found: ID does not exist" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.277498 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-swift-storage-0\") pod \"1c673124-31a1-48ec-a799-19ab9be5469a\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.277551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-nb\") pod \"1c673124-31a1-48ec-a799-19ab9be5469a\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.277653 
4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjd6b\" (UniqueName: \"kubernetes.io/projected/1c673124-31a1-48ec-a799-19ab9be5469a-kube-api-access-xjd6b\") pod \"1c673124-31a1-48ec-a799-19ab9be5469a\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.277763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-svc\") pod \"1c673124-31a1-48ec-a799-19ab9be5469a\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.277798 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-config\") pod \"1c673124-31a1-48ec-a799-19ab9be5469a\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.277845 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-sb\") pod \"1c673124-31a1-48ec-a799-19ab9be5469a\" (UID: \"1c673124-31a1-48ec-a799-19ab9be5469a\") " Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.301620 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c673124-31a1-48ec-a799-19ab9be5469a-kube-api-access-xjd6b" (OuterVolumeSpecName: "kube-api-access-xjd6b") pod "1c673124-31a1-48ec-a799-19ab9be5469a" (UID: "1c673124-31a1-48ec-a799-19ab9be5469a"). InnerVolumeSpecName "kube-api-access-xjd6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.336472 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1c673124-31a1-48ec-a799-19ab9be5469a" (UID: "1c673124-31a1-48ec-a799-19ab9be5469a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.345368 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-config" (OuterVolumeSpecName: "config") pod "1c673124-31a1-48ec-a799-19ab9be5469a" (UID: "1c673124-31a1-48ec-a799-19ab9be5469a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.358957 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c673124-31a1-48ec-a799-19ab9be5469a" (UID: "1c673124-31a1-48ec-a799-19ab9be5469a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.359548 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c673124-31a1-48ec-a799-19ab9be5469a" (UID: "1c673124-31a1-48ec-a799-19ab9be5469a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.372423 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c673124-31a1-48ec-a799-19ab9be5469a" (UID: "1c673124-31a1-48ec-a799-19ab9be5469a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.381428 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjd6b\" (UniqueName: \"kubernetes.io/projected/1c673124-31a1-48ec-a799-19ab9be5469a-kube-api-access-xjd6b\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.381467 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.381477 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.381487 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.381497 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.381504 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1c673124-31a1-48ec-a799-19ab9be5469a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.510404 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-9spf5"] Mar 13 15:27:23 crc kubenswrapper[4786]: I0313 15:27:23.517881 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69b4446475-9spf5"] Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.186364 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerStarted","Data":"dcb6fa036b56eb9683f3cc50dacb5d974b0573c6bd56860b17fab7dba49c8f4e"} Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.186731 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.186510 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="proxy-httpd" containerID="cri-o://dcb6fa036b56eb9683f3cc50dacb5d974b0573c6bd56860b17fab7dba49c8f4e" gracePeriod=30 Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.186477 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-central-agent" containerID="cri-o://54c19969ec82ba8eab1d86f51f63001d0f6437dcaa731188a732e397cb7fa037" gracePeriod=30 Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.186532 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="sg-core" containerID="cri-o://61e893e71f5d32e3d116e00eb47db632836c3a9a1ab8a8825ae704c5493d8c9b" gracePeriod=30 Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 
15:27:24.186536 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-notification-agent" containerID="cri-o://47be803f556e7898a193dcd67b8d03192aeb9b47c6863e42ba9fa07ef398ad95" gracePeriod=30 Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.223482 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.535024102 podStartE2EDuration="7.223451903s" podCreationTimestamp="2026-03-13 15:27:17 +0000 UTC" firstStartedPulling="2026-03-13 15:27:17.902244194 +0000 UTC m=+1468.065456005" lastFinishedPulling="2026-03-13 15:27:23.590671995 +0000 UTC m=+1473.753883806" observedRunningTime="2026-03-13 15:27:24.210205644 +0000 UTC m=+1474.373417465" watchObservedRunningTime="2026-03-13 15:27:24.223451903 +0000 UTC m=+1474.386663714" Mar 13 15:27:24 crc kubenswrapper[4786]: I0313 15:27:24.562198 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" path="/var/lib/kubelet/pods/1c673124-31a1-48ec-a799-19ab9be5469a/volumes" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.214884 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerID="dcb6fa036b56eb9683f3cc50dacb5d974b0573c6bd56860b17fab7dba49c8f4e" exitCode=0 Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.214920 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerID="61e893e71f5d32e3d116e00eb47db632836c3a9a1ab8a8825ae704c5493d8c9b" exitCode=2 Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.214931 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerID="47be803f556e7898a193dcd67b8d03192aeb9b47c6863e42ba9fa07ef398ad95" exitCode=0 Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 
15:27:25.214941 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerID="54c19969ec82ba8eab1d86f51f63001d0f6437dcaa731188a732e397cb7fa037" exitCode=0 Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.214965 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerDied","Data":"dcb6fa036b56eb9683f3cc50dacb5d974b0573c6bd56860b17fab7dba49c8f4e"} Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.214992 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerDied","Data":"61e893e71f5d32e3d116e00eb47db632836c3a9a1ab8a8825ae704c5493d8c9b"} Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.215005 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerDied","Data":"47be803f556e7898a193dcd67b8d03192aeb9b47c6863e42ba9fa07ef398ad95"} Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.215017 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerDied","Data":"54c19969ec82ba8eab1d86f51f63001d0f6437dcaa731188a732e397cb7fa037"} Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.379124 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-scripts\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533488 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-config-data\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qrwc\" (UniqueName: \"kubernetes.io/projected/f5089f8e-a024-4142-8bd3-dfe2b7004efb-kube-api-access-2qrwc\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533574 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-run-httpd\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-combined-ca-bundle\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533705 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-log-httpd\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.533794 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-sg-core-conf-yaml\") pod \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\" (UID: \"f5089f8e-a024-4142-8bd3-dfe2b7004efb\") " Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.535200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.535505 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.541109 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-scripts" (OuterVolumeSpecName: "scripts") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.557635 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5089f8e-a024-4142-8bd3-dfe2b7004efb-kube-api-access-2qrwc" (OuterVolumeSpecName: "kube-api-access-2qrwc") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). InnerVolumeSpecName "kube-api-access-2qrwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.578387 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.613206 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.637362 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.637413 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.637432 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qrwc\" (UniqueName: \"kubernetes.io/projected/f5089f8e-a024-4142-8bd3-dfe2b7004efb-kube-api-access-2qrwc\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.637451 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.637470 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.637485 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5089f8e-a024-4142-8bd3-dfe2b7004efb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.655968 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-config-data" (OuterVolumeSpecName: "config-data") pod "f5089f8e-a024-4142-8bd3-dfe2b7004efb" (UID: "f5089f8e-a024-4142-8bd3-dfe2b7004efb"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:25 crc kubenswrapper[4786]: I0313 15:27:25.738986 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5089f8e-a024-4142-8bd3-dfe2b7004efb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.228107 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5089f8e-a024-4142-8bd3-dfe2b7004efb","Type":"ContainerDied","Data":"1150e7eb0d695fc6922539ca97c39452355231d532758027c0499e9de5894a50"} Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.228441 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.228453 4786 scope.go:117] "RemoveContainer" containerID="dcb6fa036b56eb9683f3cc50dacb5d974b0573c6bd56860b17fab7dba49c8f4e" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.276529 4786 scope.go:117] "RemoveContainer" containerID="61e893e71f5d32e3d116e00eb47db632836c3a9a1ab8a8825ae704c5493d8c9b" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.286779 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.306684 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.311404 4786 scope.go:117] "RemoveContainer" containerID="47be803f556e7898a193dcd67b8d03192aeb9b47c6863e42ba9fa07ef398ad95" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.322100 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:26 crc kubenswrapper[4786]: E0313 15:27:26.322545 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" 
containerName="ceilometer-notification-agent" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328031 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-notification-agent" Mar 13 15:27:26 crc kubenswrapper[4786]: E0313 15:27:26.328080 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="init" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328091 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="init" Mar 13 15:27:26 crc kubenswrapper[4786]: E0313 15:27:26.328113 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="dnsmasq-dns" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328122 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="dnsmasq-dns" Mar 13 15:27:26 crc kubenswrapper[4786]: E0313 15:27:26.328145 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="proxy-httpd" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328153 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="proxy-httpd" Mar 13 15:27:26 crc kubenswrapper[4786]: E0313 15:27:26.328218 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-central-agent" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328227 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-central-agent" Mar 13 15:27:26 crc kubenswrapper[4786]: E0313 15:27:26.328244 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" 
containerName="sg-core" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328253 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="sg-core" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328588 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="proxy-httpd" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328619 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-notification-agent" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328633 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="sg-core" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328655 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" containerName="ceilometer-central-agent" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.328674 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="dnsmasq-dns" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.330875 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.332653 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.332992 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.333500 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.337084 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.365204 4786 scope.go:117] "RemoveContainer" containerID="54c19969ec82ba8eab1d86f51f63001d0f6437dcaa731188a732e397cb7fa037" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455324 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-config-data\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455406 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455432 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-run-httpd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 
13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-scripts\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455626 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-log-httpd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.455703 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xvwd\" (UniqueName: \"kubernetes.io/projected/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-kube-api-access-7xvwd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557671 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557728 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-scripts\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557773 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-log-httpd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557843 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xvwd\" (UniqueName: \"kubernetes.io/projected/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-kube-api-access-7xvwd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557888 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-config-data\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557923 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557947 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-run-httpd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.557997 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.559526 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-run-httpd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.559578 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-log-httpd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.563384 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5089f8e-a024-4142-8bd3-dfe2b7004efb" path="/var/lib/kubelet/pods/f5089f8e-a024-4142-8bd3-dfe2b7004efb/volumes" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.564617 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.572585 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-scripts\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.574433 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.574705 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-config-data\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.575576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xvwd\" (UniqueName: \"kubernetes.io/projected/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-kube-api-access-7xvwd\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.578796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") " pod="openstack/ceilometer-0" Mar 13 15:27:26 crc kubenswrapper[4786]: I0313 15:27:26.654467 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 15:27:27.115060 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:27:27 crc kubenswrapper[4786]: W0313 15:27:27.119136 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dc45915_09df_4248_8cb8_c7b11d1e4a4c.slice/crio-a1cf8b43df8175438aea1134c65a48b7c00f19499e90bd1a1e5b5367465b5c45 WatchSource:0}: Error finding container a1cf8b43df8175438aea1134c65a48b7c00f19499e90bd1a1e5b5367465b5c45: Status 404 returned error can't find the container with id a1cf8b43df8175438aea1134c65a48b7c00f19499e90bd1a1e5b5367465b5c45 Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 15:27:27.238605 4786 generic.go:334] "Generic (PLEG): container finished" podID="3c887202-25c6-42a6-975d-fa96e5e8673a" containerID="ab2535b8e1de5140eefc200ad3169b4b8cccfd02a55308f9e31396c3a4208a9d" exitCode=0 Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 15:27:27.238666 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-57hdr" event={"ID":"3c887202-25c6-42a6-975d-fa96e5e8673a","Type":"ContainerDied","Data":"ab2535b8e1de5140eefc200ad3169b4b8cccfd02a55308f9e31396c3a4208a9d"} Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 15:27:27.240123 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerStarted","Data":"a1cf8b43df8175438aea1134c65a48b7c00f19499e90bd1a1e5b5367465b5c45"} Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 15:27:27.384576 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 15:27:27.384629 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 15:27:27 crc kubenswrapper[4786]: I0313 
15:27:27.998064 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69b4446475-9spf5" podUID="1c673124-31a1-48ec-a799-19ab9be5469a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.197:5353: i/o timeout" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.253066 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerStarted","Data":"2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388"} Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.478378 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.582978 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.694613 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-config-data\") pod \"3c887202-25c6-42a6-975d-fa96e5e8673a\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.694744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-scripts\") pod \"3c887202-25c6-42a6-975d-fa96e5e8673a\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.694922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-combined-ca-bundle\") pod \"3c887202-25c6-42a6-975d-fa96e5e8673a\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " Mar 13 15:27:28 crc 
kubenswrapper[4786]: I0313 15:27:28.695285 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-265fr\" (UniqueName: \"kubernetes.io/projected/3c887202-25c6-42a6-975d-fa96e5e8673a-kube-api-access-265fr\") pod \"3c887202-25c6-42a6-975d-fa96e5e8673a\" (UID: \"3c887202-25c6-42a6-975d-fa96e5e8673a\") " Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.699667 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c887202-25c6-42a6-975d-fa96e5e8673a-kube-api-access-265fr" (OuterVolumeSpecName: "kube-api-access-265fr") pod "3c887202-25c6-42a6-975d-fa96e5e8673a" (UID: "3c887202-25c6-42a6-975d-fa96e5e8673a"). InnerVolumeSpecName "kube-api-access-265fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.712310 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-scripts" (OuterVolumeSpecName: "scripts") pod "3c887202-25c6-42a6-975d-fa96e5e8673a" (UID: "3c887202-25c6-42a6-975d-fa96e5e8673a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.718608 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-config-data" (OuterVolumeSpecName: "config-data") pod "3c887202-25c6-42a6-975d-fa96e5e8673a" (UID: "3c887202-25c6-42a6-975d-fa96e5e8673a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.718676 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c887202-25c6-42a6-975d-fa96e5e8673a" (UID: "3c887202-25c6-42a6-975d-fa96e5e8673a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.798273 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.798307 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.798319 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-265fr\" (UniqueName: \"kubernetes.io/projected/3c887202-25c6-42a6-975d-fa96e5e8673a-kube-api-access-265fr\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:28 crc kubenswrapper[4786]: I0313 15:27:28.798329 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c887202-25c6-42a6-975d-fa96e5e8673a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.278046 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-57hdr" Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.278088 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-57hdr" event={"ID":"3c887202-25c6-42a6-975d-fa96e5e8673a","Type":"ContainerDied","Data":"84fc09b246a542a9d093260b7b87691fc707d3c1f72c63df11f720f441f13afa"} Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.278708 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84fc09b246a542a9d093260b7b87691fc707d3c1f72c63df11f720f441f13afa" Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.280711 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerStarted","Data":"06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab"} Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.388668 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.389339 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.400364 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.456063 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.456330 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-log" containerID="cri-o://dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd" gracePeriod=30 Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.456607 4786 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-api-0" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-api" containerID="cri-o://df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8" gracePeriod=30
Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.469137 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.469317 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e63a2420-9298-4a45-b7c3-89a58226dddd" containerName="nova-scheduler-scheduler" containerID="cri-o://74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb" gracePeriod=30
Mar 13 15:27:29 crc kubenswrapper[4786]: I0313 15:27:29.489111 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.132616 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228196 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-logs\") pod \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") "
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228334 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-config-data\") pod \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") "
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-internal-tls-certs\") pod \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") "
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-combined-ca-bundle\") pod \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") "
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228479 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqwb7\" (UniqueName: \"kubernetes.io/projected/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-kube-api-access-qqwb7\") pod \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") "
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228561 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-public-tls-certs\") pod \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\" (UID: \"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e\") "
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.228796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-logs" (OuterVolumeSpecName: "logs") pod "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" (UID: "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.229402 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.233599 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-kube-api-access-qqwb7" (OuterVolumeSpecName: "kube-api-access-qqwb7") pod "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" (UID: "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e"). InnerVolumeSpecName "kube-api-access-qqwb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.255818 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.256776 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-config-data" (OuterVolumeSpecName: "config-data") pod "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" (UID: "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.259113 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.260205 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" (UID: "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.260407 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.260454 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e63a2420-9298-4a45-b7c3-89a58226dddd" containerName="nova-scheduler-scheduler"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.281822 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" (UID: "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.295597 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" (UID: "3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.301396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerStarted","Data":"48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb"}
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.304019 4786 generic.go:334] "Generic (PLEG): container finished" podID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerID="df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8" exitCode=0
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.304054 4786 generic.go:334] "Generic (PLEG): container finished" podID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerID="dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd" exitCode=143
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.304967 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e","Type":"ContainerDied","Data":"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"}
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.305065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e","Type":"ContainerDied","Data":"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"}
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.305159 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e","Type":"ContainerDied","Data":"47b0182a16b4c9807fc4234e461ade119c989ed08ae8be914c256d323cc80362"}
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.305227 4786 scope.go:117] "RemoveContainer" containerID="df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.304981 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.313001 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.332252 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.332325 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.332342 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.332356 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqwb7\" (UniqueName: \"kubernetes.io/projected/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-kube-api-access-qqwb7\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.332367 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.338899 4786 scope.go:117] "RemoveContainer" containerID="dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.383428 4786 scope.go:117] "RemoveContainer" containerID="df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.385333 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8\": container with ID starting with df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8 not found: ID does not exist" containerID="df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.385374 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"} err="failed to get container status \"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8\": rpc error: code = NotFound desc = could not find container \"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8\": container with ID starting with df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8 not found: ID does not exist"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.385398 4786 scope.go:117] "RemoveContainer" containerID="dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.386553 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.387160 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd\": container with ID starting with dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd not found: ID does not exist" containerID="dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.387195 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"} err="failed to get container status \"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd\": rpc error: code = NotFound desc = could not find container \"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd\": container with ID starting with dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd not found: ID does not exist"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.387213 4786 scope.go:117] "RemoveContainer" containerID="df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.387584 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8"} err="failed to get container status \"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8\": rpc error: code = NotFound desc = could not find container \"df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8\": container with ID starting with df16d2461480f389c70293c2a14c4de07cc980c227307dd571f68372d5c8c6c8 not found: ID does not exist"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.387605 4786 scope.go:117] "RemoveContainer" containerID="dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.388277 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd"} err="failed to get container status \"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd\": rpc error: code = NotFound desc = could not find container \"dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd\": container with ID starting with dcd218057c1038c0cdd0e6f8a8c050ed548a2af818d99de8c5db7010c62d2dbd not found: ID does not exist"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.405196 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425210 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.425661 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-log"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425682 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-log"
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.425702 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c887202-25c6-42a6-975d-fa96e5e8673a" containerName="nova-manage"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425709 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c887202-25c6-42a6-975d-fa96e5e8673a" containerName="nova-manage"
Mar 13 15:27:30 crc kubenswrapper[4786]: E0313 15:27:30.425726 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-api"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425732 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-api"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425937 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c887202-25c6-42a6-975d-fa96e5e8673a" containerName="nova-manage"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425956 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-api"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.425966 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" containerName="nova-api-log"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.426946 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.429748 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.430056 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.437222 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.444512 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.537347 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-config-data\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.537403 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7whtx\" (UniqueName: \"kubernetes.io/projected/e313e1cc-ed94-4e28-84f8-d053dcffb16a-kube-api-access-7whtx\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.537438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e313e1cc-ed94-4e28-84f8-d053dcffb16a-logs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.537533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.537568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.537605 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.561878 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e" path="/var/lib/kubelet/pods/3dd03f02-a9f9-4539-a1fb-88aaaeb88a0e/volumes"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.639892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-config-data\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.639936 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7whtx\" (UniqueName: \"kubernetes.io/projected/e313e1cc-ed94-4e28-84f8-d053dcffb16a-kube-api-access-7whtx\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.639987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e313e1cc-ed94-4e28-84f8-d053dcffb16a-logs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.640042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.640081 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.640126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.640635 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e313e1cc-ed94-4e28-84f8-d053dcffb16a-logs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.644479 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.644577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-config-data\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.647374 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.649591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.672782 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7whtx\" (UniqueName: \"kubernetes.io/projected/e313e1cc-ed94-4e28-84f8-d053dcffb16a-kube-api-access-7whtx\") pod \"nova-api-0\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") " pod="openstack/nova-api-0"
Mar 13 15:27:30 crc kubenswrapper[4786]: I0313 15:27:30.753684 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:27:31 crc kubenswrapper[4786]: I0313 15:27:31.245513 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 15:27:31 crc kubenswrapper[4786]: I0313 15:27:31.318051 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e313e1cc-ed94-4e28-84f8-d053dcffb16a","Type":"ContainerStarted","Data":"4a50e4da5ff6dc67e6d3badae4e378cb9557236b3ff8c20495e62517ed87ccb1"}
Mar 13 15:27:31 crc kubenswrapper[4786]: I0313 15:27:31.318093 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-log" containerID="cri-o://5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22" gracePeriod=30
Mar 13 15:27:31 crc kubenswrapper[4786]: I0313 15:27:31.318149 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-metadata" containerID="cri-o://b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2" gracePeriod=30
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.328914 4786 generic.go:334] "Generic (PLEG): container finished" podID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerID="5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22" exitCode=143
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.329291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2322d84c-3acc-433b-a70a-21b88d0f2aa1","Type":"ContainerDied","Data":"5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22"}
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.331364 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e313e1cc-ed94-4e28-84f8-d053dcffb16a","Type":"ContainerStarted","Data":"89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a"}
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.331394 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e313e1cc-ed94-4e28-84f8-d053dcffb16a","Type":"ContainerStarted","Data":"94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8"}
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.333892 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerStarted","Data":"c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975"}
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.366798 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.36658339 podStartE2EDuration="2.36658339s" podCreationTimestamp="2026-03-13 15:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:32.34725266 +0000 UTC m=+1482.510464481" watchObservedRunningTime="2026-03-13 15:27:32.36658339 +0000 UTC m=+1482.529795211"
Mar 13 15:27:32 crc kubenswrapper[4786]: I0313 15:27:32.388754 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.63826415 podStartE2EDuration="6.388729869s" podCreationTimestamp="2026-03-13 15:27:26 +0000 UTC" firstStartedPulling="2026-03-13 15:27:27.121209521 +0000 UTC m=+1477.284421332" lastFinishedPulling="2026-03-13 15:27:31.87167523 +0000 UTC m=+1482.034887051" observedRunningTime="2026-03-13 15:27:32.382562266 +0000 UTC m=+1482.545774077" watchObservedRunningTime="2026-03-13 15:27:32.388729869 +0000 UTC m=+1482.551941710"
Mar 13 15:27:33 crc kubenswrapper[4786]: I0313 15:27:33.343634 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.323802 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.353557 4786 generic.go:334] "Generic (PLEG): container finished" podID="e63a2420-9298-4a45-b7c3-89a58226dddd" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb" exitCode=0
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.353611 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.353632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e63a2420-9298-4a45-b7c3-89a58226dddd","Type":"ContainerDied","Data":"74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb"}
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.353715 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e63a2420-9298-4a45-b7c3-89a58226dddd","Type":"ContainerDied","Data":"e8e0fd87900005c54e3cf410934f67403a6731ab0cafeede7233f6d93e0a5342"}
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.353748 4786 scope.go:117] "RemoveContainer" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.376983 4786 scope.go:117] "RemoveContainer" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb"
Mar 13 15:27:34 crc kubenswrapper[4786]: E0313 15:27:34.377353 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb\": container with ID starting with 74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb not found: ID does not exist" containerID="74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.377380 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb"} err="failed to get container status \"74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb\": rpc error: code = NotFound desc = could not find container \"74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb\": container with ID starting with 74f220bf239e75371aa8b65d29707b2ee22e5362ad0f8ba940b501384c6ecfeb not found: ID does not exist"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.423520 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-config-data\") pod \"e63a2420-9298-4a45-b7c3-89a58226dddd\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.423748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-combined-ca-bundle\") pod \"e63a2420-9298-4a45-b7c3-89a58226dddd\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.423829 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhpkp\" (UniqueName: \"kubernetes.io/projected/e63a2420-9298-4a45-b7c3-89a58226dddd-kube-api-access-lhpkp\") pod \"e63a2420-9298-4a45-b7c3-89a58226dddd\" (UID: \"e63a2420-9298-4a45-b7c3-89a58226dddd\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.441409 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63a2420-9298-4a45-b7c3-89a58226dddd-kube-api-access-lhpkp" (OuterVolumeSpecName: "kube-api-access-lhpkp") pod "e63a2420-9298-4a45-b7c3-89a58226dddd" (UID: "e63a2420-9298-4a45-b7c3-89a58226dddd"). InnerVolumeSpecName "kube-api-access-lhpkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.450183 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e63a2420-9298-4a45-b7c3-89a58226dddd" (UID: "e63a2420-9298-4a45-b7c3-89a58226dddd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.453322 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-config-data" (OuterVolumeSpecName: "config-data") pod "e63a2420-9298-4a45-b7c3-89a58226dddd" (UID: "e63a2420-9298-4a45-b7c3-89a58226dddd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.526932 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.527273 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhpkp\" (UniqueName: \"kubernetes.io/projected/e63a2420-9298-4a45-b7c3-89a58226dddd-kube-api-access-lhpkp\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.527361 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63a2420-9298-4a45-b7c3-89a58226dddd-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.675025 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.692497 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.709559 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:27:34 crc kubenswrapper[4786]: E0313 15:27:34.710196 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63a2420-9298-4a45-b7c3-89a58226dddd" containerName="nova-scheduler-scheduler"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.710220 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63a2420-9298-4a45-b7c3-89a58226dddd" containerName="nova-scheduler-scheduler"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.710454 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63a2420-9298-4a45-b7c3-89a58226dddd" containerName="nova-scheduler-scheduler"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.711893 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.715146 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.734956 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.833551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-config-data\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.833929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.833958 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckwqz\" (UniqueName: \"kubernetes.io/projected/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-kube-api-access-ckwqz\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.843652 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.935487 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-nova-metadata-tls-certs\") pod \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.935560 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-combined-ca-bundle\") pod \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.935687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2322d84c-3acc-433b-a70a-21b88d0f2aa1-logs\") pod \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.935774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-config-data\") pod \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.935815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg5jv\" (UniqueName: \"kubernetes.io/projected/2322d84c-3acc-433b-a70a-21b88d0f2aa1-kube-api-access-wg5jv\") pod \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\" (UID: \"2322d84c-3acc-433b-a70a-21b88d0f2aa1\") "
Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.936109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-config-data\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.936195 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.936228 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckwqz\" (UniqueName: \"kubernetes.io/projected/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-kube-api-access-ckwqz\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.936570 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2322d84c-3acc-433b-a70a-21b88d0f2aa1-logs" (OuterVolumeSpecName: "logs") pod "2322d84c-3acc-433b-a70a-21b88d0f2aa1" (UID: "2322d84c-3acc-433b-a70a-21b88d0f2aa1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.941556 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-config-data\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.951380 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2322d84c-3acc-433b-a70a-21b88d0f2aa1-kube-api-access-wg5jv" (OuterVolumeSpecName: "kube-api-access-wg5jv") pod "2322d84c-3acc-433b-a70a-21b88d0f2aa1" (UID: "2322d84c-3acc-433b-a70a-21b88d0f2aa1"). InnerVolumeSpecName "kube-api-access-wg5jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.953440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckwqz\" (UniqueName: \"kubernetes.io/projected/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-kube-api-access-ckwqz\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.956641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " pod="openstack/nova-scheduler-0" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.969337 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2322d84c-3acc-433b-a70a-21b88d0f2aa1" (UID: "2322d84c-3acc-433b-a70a-21b88d0f2aa1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:34 crc kubenswrapper[4786]: I0313 15:27:34.981237 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-config-data" (OuterVolumeSpecName: "config-data") pod "2322d84c-3acc-433b-a70a-21b88d0f2aa1" (UID: "2322d84c-3acc-433b-a70a-21b88d0f2aa1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.001108 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2322d84c-3acc-433b-a70a-21b88d0f2aa1" (UID: "2322d84c-3acc-433b-a70a-21b88d0f2aa1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.038277 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2322d84c-3acc-433b-a70a-21b88d0f2aa1-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.038319 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.038333 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg5jv\" (UniqueName: \"kubernetes.io/projected/2322d84c-3acc-433b-a70a-21b88d0f2aa1-kube-api-access-wg5jv\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.038347 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-nova-metadata-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.038359 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2322d84c-3acc-433b-a70a-21b88d0f2aa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.050145 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.365789 4786 generic.go:334] "Generic (PLEG): container finished" podID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerID="b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2" exitCode=0 Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.366218 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2322d84c-3acc-433b-a70a-21b88d0f2aa1","Type":"ContainerDied","Data":"b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2"} Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.366259 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2322d84c-3acc-433b-a70a-21b88d0f2aa1","Type":"ContainerDied","Data":"013bf940a40626258858b2d34b9ed1ffde3c6b1bc42a322439f61256577dc5f8"} Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.366297 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.366303 4786 scope.go:117] "RemoveContainer" containerID="b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.393399 4786 scope.go:117] "RemoveContainer" containerID="5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.408796 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.418332 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.433247 4786 scope.go:117] "RemoveContainer" containerID="b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2" Mar 13 15:27:35 crc kubenswrapper[4786]: E0313 15:27:35.433731 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2\": container with ID starting with b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2 not found: ID does not exist" containerID="b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.433787 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2"} err="failed to get container status \"b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2\": rpc error: code = NotFound desc = could not find container \"b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2\": container with ID starting with b1063d69dde1008d19373bc5e56860acd6b3ad85885b8e5882b026f019e379f2 not found: ID does not exist" Mar 13 15:27:35 crc kubenswrapper[4786]: 
I0313 15:27:35.433819 4786 scope.go:117] "RemoveContainer" containerID="5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22" Mar 13 15:27:35 crc kubenswrapper[4786]: E0313 15:27:35.434909 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22\": container with ID starting with 5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22 not found: ID does not exist" containerID="5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.434958 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22"} err="failed to get container status \"5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22\": rpc error: code = NotFound desc = could not find container \"5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22\": container with ID starting with 5edf570dc17cc12afdf5f6a5666719fccffa2950bc28c3751a1ed6fa6f49bf22 not found: ID does not exist" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.438730 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:35 crc kubenswrapper[4786]: E0313 15:27:35.439235 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-metadata" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.439261 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-metadata" Mar 13 15:27:35 crc kubenswrapper[4786]: E0313 15:27:35.439295 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-log" Mar 13 15:27:35 crc 
kubenswrapper[4786]: I0313 15:27:35.439303 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-log" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.439501 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-metadata" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.439533 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" containerName="nova-metadata-log" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.440485 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.443183 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.443207 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.468378 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.512187 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.551918 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8402e30a-1517-41be-b468-1959c4b7621b-logs\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.552087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.552180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hfsx\" (UniqueName: \"kubernetes.io/projected/8402e30a-1517-41be-b468-1959c4b7621b-kube-api-access-6hfsx\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.552254 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-config-data\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.552338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.654404 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hfsx\" (UniqueName: \"kubernetes.io/projected/8402e30a-1517-41be-b468-1959c4b7621b-kube-api-access-6hfsx\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.654489 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-config-data\") pod 
\"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.654561 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.654803 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8402e30a-1517-41be-b468-1959c4b7621b-logs\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.654919 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.656282 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8402e30a-1517-41be-b468-1959c4b7621b-logs\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.658980 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-config-data\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.660119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.661676 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.672029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hfsx\" (UniqueName: \"kubernetes.io/projected/8402e30a-1517-41be-b468-1959c4b7621b-kube-api-access-6hfsx\") pod \"nova-metadata-0\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") " pod="openstack/nova-metadata-0" Mar 13 15:27:35 crc kubenswrapper[4786]: I0313 15:27:35.757937 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 15:27:36 crc kubenswrapper[4786]: W0313 15:27:36.206745 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8402e30a_1517_41be_b468_1959c4b7621b.slice/crio-eb5d1863880bfc7d9eb63e471ad4f565fc87984711c146488a211011d3bacb97 WatchSource:0}: Error finding container eb5d1863880bfc7d9eb63e471ad4f565fc87984711c146488a211011d3bacb97: Status 404 returned error can't find the container with id eb5d1863880bfc7d9eb63e471ad4f565fc87984711c146488a211011d3bacb97 Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.224178 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.376631 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e","Type":"ContainerStarted","Data":"afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83"} Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.376701 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e","Type":"ContainerStarted","Data":"2689979b02e556a2ef958860b3ada3bb782622953f6709b64a670f5a6fad9ac7"} Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.377787 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8402e30a-1517-41be-b468-1959c4b7621b","Type":"ContainerStarted","Data":"eb5d1863880bfc7d9eb63e471ad4f565fc87984711c146488a211011d3bacb97"} Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.395722 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.395701386 podStartE2EDuration="2.395701386s" podCreationTimestamp="2026-03-13 15:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:36.394427364 +0000 UTC m=+1486.557639175" watchObservedRunningTime="2026-03-13 15:27:36.395701386 +0000 UTC m=+1486.558913197" Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.563849 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2322d84c-3acc-433b-a70a-21b88d0f2aa1" path="/var/lib/kubelet/pods/2322d84c-3acc-433b-a70a-21b88d0f2aa1/volumes" Mar 13 15:27:36 crc kubenswrapper[4786]: I0313 15:27:36.564804 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63a2420-9298-4a45-b7c3-89a58226dddd" path="/var/lib/kubelet/pods/e63a2420-9298-4a45-b7c3-89a58226dddd/volumes" Mar 13 15:27:37 crc kubenswrapper[4786]: I0313 15:27:37.394470 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8402e30a-1517-41be-b468-1959c4b7621b","Type":"ContainerStarted","Data":"aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27"} Mar 13 15:27:37 crc kubenswrapper[4786]: I0313 15:27:37.394551 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8402e30a-1517-41be-b468-1959c4b7621b","Type":"ContainerStarted","Data":"c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13"} Mar 13 15:27:37 crc kubenswrapper[4786]: I0313 15:27:37.426026 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.426005657 podStartE2EDuration="2.426005657s" podCreationTimestamp="2026-03-13 15:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:27:37.424846949 +0000 UTC m=+1487.588058760" watchObservedRunningTime="2026-03-13 15:27:37.426005657 +0000 UTC m=+1487.589217488" Mar 13 15:27:37 crc kubenswrapper[4786]: I0313 15:27:37.868657 4786 patch_prober.go:28] interesting 
pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:27:37 crc kubenswrapper[4786]: I0313 15:27:37.868746 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:27:40 crc kubenswrapper[4786]: I0313 15:27:40.050294 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 15:27:40 crc kubenswrapper[4786]: I0313 15:27:40.754793 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 15:27:40 crc kubenswrapper[4786]: I0313 15:27:40.755359 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 15:27:41 crc kubenswrapper[4786]: I0313 15:27:41.776010 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:41 crc kubenswrapper[4786]: I0313 15:27:41.776032 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:45 crc kubenswrapper[4786]: I0313 15:27:45.050705 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 15:27:45 crc kubenswrapper[4786]: I0313 15:27:45.095781 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 15:27:45 crc kubenswrapper[4786]: I0313 15:27:45.535709 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 15:27:45 crc kubenswrapper[4786]: I0313 15:27:45.758249 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 15:27:45 crc kubenswrapper[4786]: I0313 15:27:45.758713 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 15:27:46 crc kubenswrapper[4786]: I0313 15:27:46.773075 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:46 crc kubenswrapper[4786]: I0313 15:27:46.773079 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:27:48 crc kubenswrapper[4786]: I0313 15:27:48.754405 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 15:27:48 crc kubenswrapper[4786]: I0313 15:27:48.754457 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 15:27:50 crc kubenswrapper[4786]: I0313 15:27:50.764748 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 15:27:50 crc 
kubenswrapper[4786]: I0313 15:27:50.765158 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 15:27:50 crc kubenswrapper[4786]: I0313 15:27:50.772192 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 15:27:50 crc kubenswrapper[4786]: I0313 15:27:50.773459 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 15:27:53 crc kubenswrapper[4786]: I0313 15:27:53.758677 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 15:27:53 crc kubenswrapper[4786]: I0313 15:27:53.759180 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 15:27:55 crc kubenswrapper[4786]: I0313 15:27:55.763833 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 15:27:55 crc kubenswrapper[4786]: I0313 15:27:55.765036 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 15:27:55 crc kubenswrapper[4786]: I0313 15:27:55.768232 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 15:27:56 crc kubenswrapper[4786]: I0313 15:27:56.597852 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 15:27:56 crc kubenswrapper[4786]: I0313 15:27:56.671103 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.148067 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556928-vqdx2"]
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.150130 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.154008 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.159909 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.160157 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.161655 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-vqdx2"]
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.190697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hk5c\" (UniqueName: \"kubernetes.io/projected/acee06e2-3e35-4f56-8e16-f9dfbac79a59-kube-api-access-7hk5c\") pod \"auto-csr-approver-29556928-vqdx2\" (UID: \"acee06e2-3e35-4f56-8e16-f9dfbac79a59\") " pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.292960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hk5c\" (UniqueName: \"kubernetes.io/projected/acee06e2-3e35-4f56-8e16-f9dfbac79a59-kube-api-access-7hk5c\") pod \"auto-csr-approver-29556928-vqdx2\" (UID: \"acee06e2-3e35-4f56-8e16-f9dfbac79a59\") " pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.318167 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hk5c\" (UniqueName: \"kubernetes.io/projected/acee06e2-3e35-4f56-8e16-f9dfbac79a59-kube-api-access-7hk5c\") pod \"auto-csr-approver-29556928-vqdx2\" (UID: \"acee06e2-3e35-4f56-8e16-f9dfbac79a59\") " pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.475425 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.910355 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-vqdx2"]
Mar 13 15:28:00 crc kubenswrapper[4786]: I0313 15:28:00.917943 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 15:28:01 crc kubenswrapper[4786]: I0313 15:28:01.642372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-vqdx2" event={"ID":"acee06e2-3e35-4f56-8e16-f9dfbac79a59","Type":"ContainerStarted","Data":"6cf22a52130818b1dbb6161b4f0670b6f978df4a5ec12fc513e0d9c35151b048"}
Mar 13 15:28:02 crc kubenswrapper[4786]: I0313 15:28:02.662236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-vqdx2" event={"ID":"acee06e2-3e35-4f56-8e16-f9dfbac79a59","Type":"ContainerStarted","Data":"5bd5e837985b7d27d3ae9656f3b436ea8a4e18eb2f4dc859342ba05567ee81f9"}
Mar 13 15:28:02 crc kubenswrapper[4786]: I0313 15:28:02.689879 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556928-vqdx2" podStartSLOduration=1.633333193 podStartE2EDuration="2.689837034s" podCreationTimestamp="2026-03-13 15:28:00 +0000 UTC" firstStartedPulling="2026-03-13 15:28:00.915115977 +0000 UTC m=+1511.078327788" lastFinishedPulling="2026-03-13 15:28:01.971619818 +0000 UTC m=+1512.134831629" observedRunningTime="2026-03-13 15:28:02.677507028 +0000 UTC m=+1512.840718849" watchObservedRunningTime="2026-03-13 15:28:02.689837034 +0000 UTC m=+1512.853048855"
Mar 13 15:28:03 crc kubenswrapper[4786]: I0313 15:28:03.673722 4786 generic.go:334] "Generic (PLEG): container finished" podID="acee06e2-3e35-4f56-8e16-f9dfbac79a59" containerID="5bd5e837985b7d27d3ae9656f3b436ea8a4e18eb2f4dc859342ba05567ee81f9" exitCode=0
Mar 13 15:28:03 crc kubenswrapper[4786]: I0313 15:28:03.673843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-vqdx2" event={"ID":"acee06e2-3e35-4f56-8e16-f9dfbac79a59","Type":"ContainerDied","Data":"5bd5e837985b7d27d3ae9656f3b436ea8a4e18eb2f4dc859342ba05567ee81f9"}
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.051417 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.183399 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hk5c\" (UniqueName: \"kubernetes.io/projected/acee06e2-3e35-4f56-8e16-f9dfbac79a59-kube-api-access-7hk5c\") pod \"acee06e2-3e35-4f56-8e16-f9dfbac79a59\" (UID: \"acee06e2-3e35-4f56-8e16-f9dfbac79a59\") "
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.190434 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acee06e2-3e35-4f56-8e16-f9dfbac79a59-kube-api-access-7hk5c" (OuterVolumeSpecName: "kube-api-access-7hk5c") pod "acee06e2-3e35-4f56-8e16-f9dfbac79a59" (UID: "acee06e2-3e35-4f56-8e16-f9dfbac79a59"). InnerVolumeSpecName "kube-api-access-7hk5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.287264 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hk5c\" (UniqueName: \"kubernetes.io/projected/acee06e2-3e35-4f56-8e16-f9dfbac79a59-kube-api-access-7hk5c\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.700974 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-vqdx2" event={"ID":"acee06e2-3e35-4f56-8e16-f9dfbac79a59","Type":"ContainerDied","Data":"6cf22a52130818b1dbb6161b4f0670b6f978df4a5ec12fc513e0d9c35151b048"}
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.701025 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf22a52130818b1dbb6161b4f0670b6f978df4a5ec12fc513e0d9c35151b048"
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.701085 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-vqdx2"
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.765755 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-kt9db"]
Mar 13 15:28:05 crc kubenswrapper[4786]: I0313 15:28:05.777972 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-kt9db"]
Mar 13 15:28:06 crc kubenswrapper[4786]: I0313 15:28:06.566134 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21189be6-18a4-47ac-a41d-f8f6ab6ff875" path="/var/lib/kubelet/pods/21189be6-18a4-47ac-a41d-f8f6ab6ff875/volumes"
Mar 13 15:28:07 crc kubenswrapper[4786]: I0313 15:28:07.029959 4786 scope.go:117] "RemoveContainer" containerID="58df202a11d1b8b5d5b191c42d457275030fe01c20524509c28f992abe5986cb"
Mar 13 15:28:07 crc kubenswrapper[4786]: I0313 15:28:07.868767 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:28:07 crc kubenswrapper[4786]: I0313 15:28:07.868831 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:28:14 crc kubenswrapper[4786]: I0313 15:28:14.897403 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 13 15:28:14 crc kubenswrapper[4786]: I0313 15:28:14.898173 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="bc33fb6e-0b09-479a-9825-3f7dfb100f37" containerName="openstackclient" containerID="cri-o://5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97" gracePeriod=2
Mar 13 15:28:14 crc kubenswrapper[4786]: I0313 15:28:14.916142 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.153173 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-df75-account-create-update-92ppw"]
Mar 13 15:28:15 crc kubenswrapper[4786]: E0313 15:28:15.153713 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33fb6e-0b09-479a-9825-3f7dfb100f37" containerName="openstackclient"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.153725 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33fb6e-0b09-479a-9825-3f7dfb100f37" containerName="openstackclient"
Mar 13 15:28:15 crc kubenswrapper[4786]: E0313 15:28:15.153781 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acee06e2-3e35-4f56-8e16-f9dfbac79a59" containerName="oc"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.153788 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="acee06e2-3e35-4f56-8e16-f9dfbac79a59" containerName="oc"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.153989 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33fb6e-0b09-479a-9825-3f7dfb100f37" containerName="openstackclient"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.154041 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="acee06e2-3e35-4f56-8e16-f9dfbac79a59" containerName="oc"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.155040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.169193 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.183251 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.225051 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-df75-account-create-update-92ppw"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.250598 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-744d-account-create-update-b756p"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.255076 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.259573 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.312068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-operator-scripts\") pod \"nova-api-df75-account-create-update-92ppw\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.312249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf6jt\" (UniqueName: \"kubernetes.io/projected/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-kube-api-access-gf6jt\") pod \"nova-api-df75-account-create-update-92ppw\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.330472 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-b756p"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.379688 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-df75-account-create-update-g659m"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.403926 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-29wj6"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.414148 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-operator-scripts\") pod \"nova-api-df75-account-create-update-92ppw\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.414238 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7ctk\" (UniqueName: \"kubernetes.io/projected/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-kube-api-access-x7ctk\") pod \"nova-cell0-744d-account-create-update-b756p\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.414270 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf6jt\" (UniqueName: \"kubernetes.io/projected/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-kube-api-access-gf6jt\") pod \"nova-api-df75-account-create-update-92ppw\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.414334 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-operator-scripts\") pod \"nova-cell0-744d-account-create-update-b756p\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.415370 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-operator-scripts\") pod \"nova-api-df75-account-create-update-92ppw\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.427543 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-df75-account-create-update-g659m"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.459054 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cfc0-account-create-update-s4gmg"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.489301 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-29wj6"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.516911 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-operator-scripts\") pod \"nova-cell0-744d-account-create-update-b756p\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.517316 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7ctk\" (UniqueName: \"kubernetes.io/projected/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-kube-api-access-x7ctk\") pod \"nova-cell0-744d-account-create-update-b756p\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.518274 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-operator-scripts\") pod \"nova-cell0-744d-account-create-update-b756p\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.522378 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf6jt\" (UniqueName: \"kubernetes.io/projected/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-kube-api-access-gf6jt\") pod \"nova-api-df75-account-create-update-92ppw\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.556920 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cfc0-account-create-update-s4gmg"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.600509 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7ctk\" (UniqueName: \"kubernetes.io/projected/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-kube-api-access-x7ctk\") pod \"nova-cell0-744d-account-create-update-b756p\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.620402 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-twsmd"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.621739 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.627300 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.651979 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cfc0-account-create-update-8f6fk"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.653396 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.655949 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.666478 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-twsmd"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.673331 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-2hb98"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.689921 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ds5d4"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.712055 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ds5d4"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.720805 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c95\" (UniqueName: \"kubernetes.io/projected/1cbfa737-846e-425a-b747-869de3afaf89-kube-api-access-94c95\") pod \"root-account-create-update-twsmd\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.730172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6297c29d-09ec-49e7-ae22-6b20962603a7-operator-scripts\") pod \"barbican-cfc0-account-create-update-8f6fk\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.730243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbd8\" (UniqueName: \"kubernetes.io/projected/6297c29d-09ec-49e7-ae22-6b20962603a7-kube-api-access-4qbd8\") pod \"barbican-cfc0-account-create-update-8f6fk\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.730385 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbfa737-846e-425a-b747-869de3afaf89-operator-scripts\") pod \"root-account-create-update-twsmd\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.736015 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cfc0-account-create-update-8f6fk"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.744420 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-x8krv"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.757595 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.771553 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.785408 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-92ppw"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.835010 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6297c29d-09ec-49e7-ae22-6b20962603a7-operator-scripts\") pod \"barbican-cfc0-account-create-update-8f6fk\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.835067 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbd8\" (UniqueName: \"kubernetes.io/projected/6297c29d-09ec-49e7-ae22-6b20962603a7-kube-api-access-4qbd8\") pod \"barbican-cfc0-account-create-update-8f6fk\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.835131 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srsv\" (UniqueName: \"kubernetes.io/projected/226d3f05-c83e-4267-b6da-e30ceae91436-kube-api-access-7srsv\") pod \"nova-cell1-a23c-account-create-update-x8krv\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.835159 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbfa737-846e-425a-b747-869de3afaf89-operator-scripts\") pod \"root-account-create-update-twsmd\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.835185 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226d3f05-c83e-4267-b6da-e30ceae91436-operator-scripts\") pod \"nova-cell1-a23c-account-create-update-x8krv\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.835273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94c95\" (UniqueName: \"kubernetes.io/projected/1cbfa737-846e-425a-b747-869de3afaf89-kube-api-access-94c95\") pod \"root-account-create-update-twsmd\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.836234 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6297c29d-09ec-49e7-ae22-6b20962603a7-operator-scripts\") pod \"barbican-cfc0-account-create-update-8f6fk\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.851263 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qqx9f"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.859339 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbfa737-846e-425a-b747-869de3afaf89-operator-scripts\") pod \"root-account-create-update-twsmd\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.885939 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qqx9f"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.889633 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-b756p"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.890660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c95\" (UniqueName: \"kubernetes.io/projected/1cbfa737-846e-425a-b747-869de3afaf89-kube-api-access-94c95\") pod \"root-account-create-update-twsmd\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.902437 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbd8\" (UniqueName: \"kubernetes.io/projected/6297c29d-09ec-49e7-ae22-6b20962603a7-kube-api-access-4qbd8\") pod \"barbican-cfc0-account-create-update-8f6fk\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.909627 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.951918 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srsv\" (UniqueName: \"kubernetes.io/projected/226d3f05-c83e-4267-b6da-e30ceae91436-kube-api-access-7srsv\") pod \"nova-cell1-a23c-account-create-update-x8krv\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.951968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226d3f05-c83e-4267-b6da-e30ceae91436-operator-scripts\") pod \"nova-cell1-a23c-account-create-update-x8krv\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:15 crc kubenswrapper[4786]: E0313 15:28:15.952452 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 13 15:28:15 crc kubenswrapper[4786]: E0313 15:28:15.952530 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data podName:65e5ca7c-1c5e-4f9e-85df-a92feaeddb43 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:16.452511689 +0000 UTC m=+1526.615723490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data") pod "rabbitmq-cell1-server-0" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43") : configmap "rabbitmq-cell1-config-data" not found
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.958766 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226d3f05-c83e-4267-b6da-e30ceae91436-operator-scripts\") pod \"nova-cell1-a23c-account-create-update-x8krv\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.973386 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bfm8s"]
Mar 13 15:28:15 crc kubenswrapper[4786]: I0313 15:28:15.982734 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-twsmd"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.011574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srsv\" (UniqueName: \"kubernetes.io/projected/226d3f05-c83e-4267-b6da-e30ceae91436-kube-api-access-7srsv\") pod \"nova-cell1-a23c-account-create-update-x8krv\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.027925 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-x8krv"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.039412 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-8f6fk"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.050914 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-46qvp"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.073604 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-46qvp"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.091194 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-jc2x9"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.091443 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-jc2x9" podUID="a9a45c6c-6521-40c9-af91-ac00f5427ce4" containerName="openstack-network-exporter" containerID="cri-o://8a0e90c6c16997dc571cfff737c2e9e9f5438e0dcf64bd4368c4878cc5a75790" gracePeriod=30
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.144912 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ef50-account-create-update-bdrf5"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.146379 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.151204 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.154266 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-x8krv"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.165946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd4962-9f30-4961-8dcb-7604ef587af6-operator-scripts\") pod \"cinder-ef50-account-create-update-bdrf5\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.166147 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rz4l\" (UniqueName: \"kubernetes.io/projected/f8bd4962-9f30-4961-8dcb-7604ef587af6-kube-api-access-8rz4l\") pod \"cinder-ef50-account-create-update-bdrf5\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.195373 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ef50-account-create-update-bdrf5"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.215467 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-92a2-account-create-update-x5fmd"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.251933 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ef50-account-create-update-lt26g"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.268611 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd4962-9f30-4961-8dcb-7604ef587af6-operator-scripts\") pod \"cinder-ef50-account-create-update-bdrf5\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.268729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rz4l\" (UniqueName: \"kubernetes.io/projected/f8bd4962-9f30-4961-8dcb-7604ef587af6-kube-api-access-8rz4l\") pod \"cinder-ef50-account-create-update-bdrf5\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.273259 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd4962-9f30-4961-8dcb-7604ef587af6-operator-scripts\") pod \"cinder-ef50-account-create-update-bdrf5\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.277085 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ft8mk"]
Mar 13 15:28:16 crc kubenswrapper[4786]: E0313 15:28:16.289142 4786 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-bfm8s" message=<
Mar 13 15:28:16 crc kubenswrapper[4786]: Exiting ovn-controller (1) [ OK ]
Mar 13 15:28:16 crc kubenswrapper[4786]: >
Mar 13 15:28:16 crc kubenswrapper[4786]: E0313 15:28:16.289182 4786 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-bfm8s" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" containerID="cri-o://3d075c999ac56a272c8241240052d1c6ec3450ff98e7174795b8e3a6b89162b8"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.289218 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-bfm8s" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" containerID="cri-o://3d075c999ac56a272c8241240052d1c6ec3450ff98e7174795b8e3a6b89162b8" gracePeriod=30
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.290983 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ef50-account-create-update-lt26g"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.304890 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-92a2-account-create-update-x5fmd"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.329949 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rz4l\" (UniqueName: \"kubernetes.io/projected/f8bd4962-9f30-4961-8dcb-7604ef587af6-kube-api-access-8rz4l\") pod \"cinder-ef50-account-create-update-bdrf5\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " pod="openstack/cinder-ef50-account-create-update-bdrf5"
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.344078 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ft8mk"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.363527 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.363881 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="ovn-northd" containerID="cri-o://c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1" gracePeriod=30
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.364457 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="openstack-network-exporter" containerID="cri-o://44fca03dce57cb826c43f7929ebd5d7925bdeb707e7610d77a9fed038f82c78f" gracePeriod=30
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.397835 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7tr8p"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.405876 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7tr8p"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.414978 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-w8c7w"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.431288 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-w8c7w"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.438802 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zg25m"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.449668 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zg25m"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.462641 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.463285 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="openstack-network-exporter" containerID="cri-o://2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0" gracePeriod=300
Mar 13 15:28:16 crc kubenswrapper[4786]: E0313 15:28:16.479947 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap
"rabbitmq-cell1-config-data" not found Mar 13 15:28:16 crc kubenswrapper[4786]: E0313 15:28:16.480271 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data podName:65e5ca7c-1c5e-4f9e-85df-a92feaeddb43 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:17.480249292 +0000 UTC m=+1527.643461103 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data") pod "rabbitmq-cell1-server-0" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43") : configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.512393 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd4f4c6c-zg28d"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.512797 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd4f4c6c-zg28d" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-api" containerID="cri-o://92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.513494 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd4f4c6c-zg28d" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-httpd" containerID="cri-o://84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.514205 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ef50-account-create-update-bdrf5" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.627341 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a95f56a-1b06-43de-87e7-06dca92043be" path="/var/lib/kubelet/pods/0a95f56a-1b06-43de-87e7-06dca92043be/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.627975 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cffa2d-4a68-4125-aaca-c8e972b36d53" path="/var/lib/kubelet/pods/15cffa2d-4a68-4125-aaca-c8e972b36d53/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.628815 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a2107c-c333-4776-ad2a-ce59edf18d04" path="/var/lib/kubelet/pods/17a2107c-c333-4776-ad2a-ce59edf18d04/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.641890 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1e5dac-cc9d-4362-a3c7-2ea79797cacf" path="/var/lib/kubelet/pods/1c1e5dac-cc9d-4362-a3c7-2ea79797cacf/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.650670 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21fe0b9a-4302-4672-945d-e0195f3f86b5" path="/var/lib/kubelet/pods/21fe0b9a-4302-4672-945d-e0195f3f86b5/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.653496 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cdd0010-08d5-4c55-98b8-c08ad54c7514" path="/var/lib/kubelet/pods/8cdd0010-08d5-4c55-98b8-c08ad54c7514/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.654263 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4376c9-9b7f-4c5b-b2bc-91f1390dc118" path="/var/lib/kubelet/pods/8e4376c9-9b7f-4c5b-b2bc-91f1390dc118/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: E0313 15:28:16.659815 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:16 crc kubenswrapper[4786]: 
container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:16 crc kubenswrapper[4786]: Mar 13 15:28:16 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:16 crc kubenswrapper[4786]: Mar 13 15:28:16 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:16 crc kubenswrapper[4786]: Mar 13 15:28:16 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:16 crc kubenswrapper[4786]: Mar 13 15:28:16 crc kubenswrapper[4786]: if [ -n "nova_api" ]; then Mar 13 15:28:16 crc kubenswrapper[4786]: GRANT_DATABASE="nova_api" Mar 13 15:28:16 crc kubenswrapper[4786]: else Mar 13 15:28:16 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:16 crc kubenswrapper[4786]: fi Mar 13 15:28:16 crc kubenswrapper[4786]: Mar 13 15:28:16 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:16 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:16 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:16 crc kubenswrapper[4786]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:16 crc kubenswrapper[4786]: # support updates Mar 13 15:28:16 crc kubenswrapper[4786]: Mar 13 15:28:16 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:16 crc kubenswrapper[4786]: E0313 15:28:16.661556 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-df75-account-create-update-92ppw" podUID="cda350b1-fe2a-4ed0-b7e1-f9a425076f56" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.673845 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae" path="/var/lib/kubelet/pods/b9ec880a-a0d7-47ee-98f5-bd6c680cf5ae/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.675089 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6adeaf7-f94a-4c32-9069-822bcb8d31b8" path="/var/lib/kubelet/pods/d6adeaf7-f94a-4c32-9069-822bcb8d31b8/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.675951 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed27af11-89ca-4d9c-b654-800740dfc742" path="/var/lib/kubelet/pods/ed27af11-89ca-4d9c-b654-800740dfc742/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.689645 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb4e25a-01bd-46b0-8029-dde07c5bcfca" path="/var/lib/kubelet/pods/edb4e25a-01bd-46b0-8029-dde07c5bcfca/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.690843 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6de6c03-d7b5-49f7-b4cc-9c07c6593273" path="/var/lib/kubelet/pods/f6de6c03-d7b5-49f7-b4cc-9c07c6593273/volumes" Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.691531 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.691584 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cvs2v"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.691596 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cvs2v"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.691610 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-57hdr"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.691619 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nwdnc"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.691628 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-57hdr"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.692263 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="openstack-network-exporter" containerID="cri-o://d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78" gracePeriod=300 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.693784 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nwdnc"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.706785 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.707598 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-server" containerID="cri-o://d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708145 4786 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="swift-recon-cron" containerID="cri-o://a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708235 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-updater" containerID="cri-o://6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708355 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-auditor" containerID="cri-o://add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708429 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-reaper" containerID="cri-o://11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708395 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-expirer" containerID="cri-o://020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708499 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-replicator" containerID="cri-o://e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4" gracePeriod=30 Mar 13 15:28:16 
crc kubenswrapper[4786]: I0313 15:28:16.708553 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-server" containerID="cri-o://209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708596 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-replicator" containerID="cri-o://0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708633 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-auditor" containerID="cri-o://03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708380 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="rsync" containerID="cri-o://779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708875 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-replicator" containerID="cri-o://328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.708941 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-server" 
containerID="cri-o://cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.709017 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-auditor" containerID="cri-o://06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.711299 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="ovsdbserver-sb" containerID="cri-o://97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049" gracePeriod=300 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.753420 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-updater" containerID="cri-o://ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.764784 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-2k7lw"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.810681 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-865786b7bb-9cnjb"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.810926 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-df75-account-create-update-92ppw"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.811524 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="dnsmasq-dns" containerID="cri-o://d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f" 
gracePeriod=10 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.815535 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-865786b7bb-9cnjb" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-log" containerID="cri-o://241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.815817 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-865786b7bb-9cnjb" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-api" containerID="cri-o://d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64" gracePeriod=30 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.827804 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.906381 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="ovsdbserver-nb" containerID="cri-o://19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13" gracePeriod=300 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.906506 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xlfs4"] Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.940346 4786 generic.go:334] "Generic (PLEG): container finished" podID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerID="3d075c999ac56a272c8241240052d1c6ec3450ff98e7174795b8e3a6b89162b8" exitCode=0 Mar 13 15:28:16 crc kubenswrapper[4786]: I0313 15:28:16.940460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s" event={"ID":"5a56ecb5-18f5-4645-8626-03f231f99f03","Type":"ContainerDied","Data":"3d075c999ac56a272c8241240052d1c6ec3450ff98e7174795b8e3a6b89162b8"} Mar 13 15:28:17 crc kubenswrapper[4786]: 
I0313 15:28:17.017903 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ebcabfa0-9931-4c7d-a0e7-a0337bb887ea/ovsdbserver-sb/0.log" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.017947 4786 generic.go:334] "Generic (PLEG): container finished" podID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerID="2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0" exitCode=2 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.018014 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea","Type":"ContainerDied","Data":"2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0"} Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.034394 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xlfs4"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.045711 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jc2x9_a9a45c6c-6521-40c9-af91-ac00f5427ce4/openstack-network-exporter/0.log" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.045755 4786 generic.go:334] "Generic (PLEG): container finished" podID="a9a45c6c-6521-40c9-af91-ac00f5427ce4" containerID="8a0e90c6c16997dc571cfff737c2e9e9f5438e0dcf64bd4368c4878cc5a75790" exitCode=2 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.045812 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jc2x9" event={"ID":"a9a45c6c-6521-40c9-af91-ac00f5427ce4","Type":"ContainerDied","Data":"8a0e90c6c16997dc571cfff737c2e9e9f5438e0dcf64bd4368c4878cc5a75790"} Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.094083 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1 is running failed: 
container process not found" containerID="c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.094919 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1 is running failed: container process not found" containerID="c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.097397 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1 is running failed: container process not found" containerID="c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.097461 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="ovn-northd" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.154152 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d" exitCode=0 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.154802 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" 
containerID="11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364" exitCode=0 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.154913 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d"} Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.154940 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364"} Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.157970 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="rabbitmq" containerID="cri-o://542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e" gracePeriod=604800 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.167466 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e617-account-create-update-jrn5z"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.199740 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c/ovn-northd/0.log" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.199785 4786 generic.go:334] "Generic (PLEG): container finished" podID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerID="44fca03dce57cb826c43f7929ebd5d7925bdeb707e7610d77a9fed038f82c78f" exitCode=2 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.199801 4786 generic.go:334] "Generic (PLEG): container finished" podID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerID="c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1" exitCode=143 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.199867 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c","Type":"ContainerDied","Data":"44fca03dce57cb826c43f7929ebd5d7925bdeb707e7610d77a9fed038f82c78f"} Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.199892 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c","Type":"ContainerDied","Data":"c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1"} Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.209011 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-df75-account-create-update-92ppw" event={"ID":"cda350b1-fe2a-4ed0-b7e1-f9a425076f56","Type":"ContainerStarted","Data":"dbcf6f1d1806705bd451487457fdaa0afe9c1706f1f013b480eb48fbc8197d54"} Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.228430 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e617-account-create-update-jrn5z"] Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.233547 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:17 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: if [ -n "nova_api" ]; then Mar 13 15:28:17 crc 
kubenswrapper[4786]: GRANT_DATABASE="nova_api" Mar 13 15:28:17 crc kubenswrapper[4786]: else Mar 13 15:28:17 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:17 crc kubenswrapper[4786]: fi Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:17 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:17 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:17 crc kubenswrapper[4786]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:17 crc kubenswrapper[4786]: # support updates Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.235256 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-df75-account-create-update-92ppw" podUID="cda350b1-fe2a-4ed0-b7e1-f9a425076f56" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.278913 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.279173 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-log" containerID="cri-o://18ddb14c07d08e39a714e8241c91bc621e8b9460fed525911386dabe3a845484" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.279565 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-httpd" 
containerID="cri-o://15848ce920e90b7648bab5e64f68ab325ba1bd7a143ec84d35fc3e9aa2a6e33f" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.295142 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.295454 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-log" containerID="cri-o://da85c0b8ba7100a5a7dd0b1919a32c21a917bfb0e41b3a25bb5709d1b826d78c" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.295598 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-httpd" containerID="cri-o://f8fb520920d825dd25aaead12b8048c225a995e2f744b27b8f73261839e24997" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.301916 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-cv6dz"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.307816 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-cv6dz"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.318000 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.318315 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="cinder-scheduler" containerID="cri-o://d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.318713 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="probe" containerID="cri-o://c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.331482 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ad0f-account-create-update-zfgm9"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.337370 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ad0f-account-create-update-zfgm9"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.352982 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kk4lt"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.360169 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kk4lt"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.360897 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jc2x9_a9a45c6c-6521-40c9-af91-ac00f5427ce4/openstack-network-exporter/0.log" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.360968 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.365431 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.365685 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api-log" containerID="cri-o://4c92acc295fd57ca23b0e288711fb9a3af052bd9a754638cb9ef30ab57e5b073" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.365805 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api" containerID="cri-o://6fee30319ab254dc362b9cb5404359a46216a536ddf834f8c5f2549a88b37dcc" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.374489 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.410486 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd" containerID="cri-o://5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" gracePeriod=29 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.418203 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bfm8s" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.438345 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.439273 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-log" containerID="cri-o://c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.439439 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-metadata" containerID="cri-o://aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.439712 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a56ecb5-18f5-4645-8626-03f231f99f03-scripts\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.439817 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-log-ovn\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440021 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-metrics-certs-tls-certs\") pod \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " Mar 13 15:28:17 crc 
kubenswrapper[4786]: I0313 15:28:17.440148 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-combined-ca-bundle\") pod \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-combined-ca-bundle\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440437 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdlz\" (UniqueName: \"kubernetes.io/projected/a9a45c6c-6521-40c9-af91-ac00f5427ce4-kube-api-access-fhdlz\") pod \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wchht\" (UniqueName: \"kubernetes.io/projected/5a56ecb5-18f5-4645-8626-03f231f99f03-kube-api-access-wchht\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440643 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a45c6c-6521-40c9-af91-ac00f5427ce4-config\") pod \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440750 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-ovn-controller-tls-certs\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovs-rundir\") pod \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440930 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run-ovn\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.441025 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run\") pod \"5a56ecb5-18f5-4645-8626-03f231f99f03\" (UID: \"5a56ecb5-18f5-4645-8626-03f231f99f03\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.441111 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovn-rundir\") pod \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\" (UID: \"a9a45c6c-6521-40c9-af91-ac00f5427ce4\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.440263 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.441655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "a9a45c6c-6521-40c9-af91-ac00f5427ce4" (UID: "a9a45c6c-6521-40c9-af91-ac00f5427ce4"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.441683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a45c6c-6521-40c9-af91-ac00f5427ce4-config" (OuterVolumeSpecName: "config") pod "a9a45c6c-6521-40c9-af91-ac00f5427ce4" (UID: "a9a45c6c-6521-40c9-af91-ac00f5427ce4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.442901 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a56ecb5-18f5-4645-8626-03f231f99f03-scripts" (OuterVolumeSpecName: "scripts") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.444210 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.444241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "a9a45c6c-6521-40c9-af91-ac00f5427ce4" (UID: "a9a45c6c-6521-40c9-af91-ac00f5427ce4"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.447342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run" (OuterVolumeSpecName: "var-run") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.471877 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a56ecb5-18f5-4645-8626-03f231f99f03-kube-api-access-wchht" (OuterVolumeSpecName: "kube-api-access-wchht") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "kube-api-access-wchht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.472051 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a45c6c-6521-40c9-af91-ac00f5427ce4-kube-api-access-fhdlz" (OuterVolumeSpecName: "kube-api-access-fhdlz") pod "a9a45c6c-6521-40c9-af91-ac00f5427ce4" (UID: "a9a45c6c-6521-40c9-af91-ac00f5427ce4"). InnerVolumeSpecName "kube-api-access-fhdlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.509777 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.205:5353: connect: connection refused" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.510730 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.514683 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.515051 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-log" containerID="cri-o://94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.515479 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-api" containerID="cri-o://89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.520559 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9a45c6c-6521-40c9-af91-ac00f5427ce4" (UID: 
"a9a45c6c-6521-40c9-af91-ac00f5427ce4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.523337 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.523613 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener-log" containerID="cri-o://fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.523996 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener" containerID="cri-o://3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.540670 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-df75-account-create-update-92ppw"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544308 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdlz\" (UniqueName: \"kubernetes.io/projected/a9a45c6c-6521-40c9-af91-ac00f5427ce4-kube-api-access-fhdlz\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544337 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wchht\" (UniqueName: \"kubernetes.io/projected/5a56ecb5-18f5-4645-8626-03f231f99f03-kube-api-access-wchht\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544361 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a9a45c6c-6521-40c9-af91-ac00f5427ce4-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544370 4786 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544379 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544387 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544394 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a9a45c6c-6521-40c9-af91-ac00f5427ce4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544408 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a56ecb5-18f5-4645-8626-03f231f99f03-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544417 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a56ecb5-18f5-4645-8626-03f231f99f03-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544440 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.544458 4786 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.544530 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.544591 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data podName:65e5ca7c-1c5e-4f9e-85df-a92feaeddb43 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:19.544560918 +0000 UTC m=+1529.707772729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data") pod "rabbitmq-cell1-server-0" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43") : configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.561900 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8b6k9"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.568067 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server" containerID="cri-o://679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" gracePeriod=29 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.576641 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7579f6547f-hnpzx"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.576927 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7579f6547f-hnpzx" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" 
containerName="barbican-worker-log" containerID="cri-o://1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.577355 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7579f6547f-hnpzx" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker" containerID="cri-o://26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.612173 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-x8krv"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.628941 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8b6k9"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.648164 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jw79s"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.662579 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5a56ecb5-18f5-4645-8626-03f231f99f03" (UID: "5a56ecb5-18f5-4645-8626-03f231f99f03"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.671563 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jw79s"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.686614 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2x4gs"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.697651 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a9a45c6c-6521-40c9-af91-ac00f5427ce4" (UID: "a9a45c6c-6521-40c9-af91-ac00f5427ce4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.704898 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2x4gs"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.712587 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-b756p"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.720536 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-tjlws"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.722181 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerName="galera" containerID="cri-o://28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.726271 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-tjlws"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.737670 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-cfc0-account-create-update-8f6fk"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.747815 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a56ecb5-18f5-4645-8626-03f231f99f03-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.747873 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9a45c6c-6521-40c9-af91-ac00f5427ce4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.750935 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54bc4948fd-47bbp"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.751449 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54bc4948fd-47bbp" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api-log" containerID="cri-o://72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.751829 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54bc4948fd-47bbp" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api" containerID="cri-o://ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.760624 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-q97rk"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.767262 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-q97rk"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.773399 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c/ovn-northd/0.log" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.774002 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.777917 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ef50-account-create-update-bdrf5"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.808111 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.827970 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmld"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.850007 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrd25\" (UniqueName: \"kubernetes.io/projected/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-kube-api-access-xrd25\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.850075 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-scripts\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.850899 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-scripts" (OuterVolumeSpecName: "scripts") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.851080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-metrics-certs-tls-certs\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.851157 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-combined-ca-bundle\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.852538 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-northd-tls-certs\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.855588 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-config\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.856415 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-rundir\") pod \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\" (UID: \"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c\") " Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.857951 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.857993 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-kube-api-access-xrd25" (OuterVolumeSpecName: "kube-api-access-xrd25") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "kube-api-access-xrd25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.858161 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-config" (OuterVolumeSpecName: "config") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.858240 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.859377 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmld"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.875726 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s28qj"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.878541 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="rabbitmq" containerID="cri-o://789307556dd54b21497583b90b06c0b5ce70e7eed63ba1acf314c8edc36e15af" gracePeriod=604800 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.881579 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.924916 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.925149 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="2c66255a-19d5-4417-bf43-f7f5bfff892a" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.945413 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s28qj"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.960177 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.960206 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.960216 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrd25\" (UniqueName: \"kubernetes.io/projected/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-kube-api-access-xrd25\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.960224 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.960941 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 15:28:17 crc 
kubenswrapper[4786]: I0313 15:28:17.961207 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.971437 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-twsmd"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.974743 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ebcabfa0-9931-4c7d-a0e7-a0337bb887ea/ovsdbserver-sb/0.log" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.974805 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.974834 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:17 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: if [ -n "nova_cell0" ]; then Mar 13 15:28:17 crc kubenswrapper[4786]: GRANT_DATABASE="nova_cell0" Mar 13 15:28:17 crc kubenswrapper[4786]: else Mar 13 15:28:17 crc 
kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:17 crc kubenswrapper[4786]: fi Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:17 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:17 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:17 crc kubenswrapper[4786]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:17 crc kubenswrapper[4786]: # support updates Mar 13 15:28:17 crc kubenswrapper[4786]: Mar 13 15:28:17 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:17 crc kubenswrapper[4786]: E0313 15:28:17.976148 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-744d-account-create-update-b756p" podUID="304e2eba-fa9b-43b4-91b9-7bfdc48f9de8" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.984157 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.989556 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.990413 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3e362102-0c50-415e-8108-82eb18632381" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150" gracePeriod=30 Mar 13 15:28:17 crc kubenswrapper[4786]: I0313 15:28:17.996073 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.013814 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.014494 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" containerName="nova-scheduler-scheduler" containerID="cri-o://afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83" gracePeriod=30 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.041053 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-b756p"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.051838 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-twsmd"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061075 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdb-rundir\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061130 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-combined-ca-bundle\") pod \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061158 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config-secret\") pod \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061187 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config\") pod \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061231 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-config\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061259 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-scripts\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: 
\"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061322 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnxlr\" (UniqueName: \"kubernetes.io/projected/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-kube-api-access-mnxlr\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061680 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061719 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82kt7\" (UniqueName: \"kubernetes.io/projected/bc33fb6e-0b09-479a-9825-3f7dfb100f37-kube-api-access-82kt7\") pod \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\" (UID: \"bc33fb6e-0b09-479a-9825-3f7dfb100f37\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdbserver-sb-tls-certs\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061850 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-combined-ca-bundle\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.061880 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-metrics-certs-tls-certs\") pod \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\" (UID: \"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.062312 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.063678 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.063958 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-scripts" (OuterVolumeSpecName: "scripts") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.064606 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: if [ -n "" ]; then Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="" Mar 13 15:28:18 crc kubenswrapper[4786]: else Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:18 crc kubenswrapper[4786]: fi Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:18 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:18 crc kubenswrapper[4786]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:18 crc kubenswrapper[4786]: # support updates Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.065064 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" (UID: "02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.065603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-config" (OuterVolumeSpecName: "config") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.065735 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-twsmd" podUID="1cbfa737-846e-425a-b747-869de3afaf89" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.081869 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-kube-api-access-mnxlr" (OuterVolumeSpecName: "kube-api-access-mnxlr") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "kube-api-access-mnxlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.089669 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bc33fb6e-0b09-479a-9825-3f7dfb100f37" (UID: "bc33fb6e-0b09-479a-9825-3f7dfb100f37"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.093017 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.102547 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc33fb6e-0b09-479a-9825-3f7dfb100f37-kube-api-access-82kt7" (OuterVolumeSpecName: "kube-api-access-82kt7") pod "bc33fb6e-0b09-479a-9825-3f7dfb100f37" (UID: "bc33fb6e-0b09-479a-9825-3f7dfb100f37"). InnerVolumeSpecName "kube-api-access-82kt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.117802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.122249 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc33fb6e-0b09-479a-9825-3f7dfb100f37" (UID: "bc33fb6e-0b09-479a-9825-3f7dfb100f37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.136064 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_73a4439f-50e7-4620-bf95-48d591ec6e3a/ovsdbserver-nb/0.log" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.136128 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.146447 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.152709 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bc33fb6e-0b09-479a-9825-3f7dfb100f37" (UID: "bc33fb6e-0b09-479a-9825-3f7dfb100f37"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165222 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-nb\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165349 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-config\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165378 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-scripts\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165411 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-combined-ca-bundle\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-config\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-swift-storage-0\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165528 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdb-rundir\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165546 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk7rt\" (UniqueName: \"kubernetes.io/projected/73a4439f-50e7-4620-bf95-48d591ec6e3a-kube-api-access-dk7rt\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165582 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdbserver-nb-tls-certs\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165598 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: 
I0313 15:28:18.165623 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-metrics-certs-tls-certs\") pod \"73a4439f-50e7-4620-bf95-48d591ec6e3a\" (UID: \"73a4439f-50e7-4620-bf95-48d591ec6e3a\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165676 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-svc\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.165710 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfv6x\" (UniqueName: \"kubernetes.io/projected/4a092179-7f71-47ce-9764-df909331a819-kube-api-access-hfv6x\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166073 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnxlr\" (UniqueName: \"kubernetes.io/projected/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-kube-api-access-mnxlr\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166097 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166106 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82kt7\" (UniqueName: \"kubernetes.io/projected/bc33fb6e-0b09-479a-9825-3f7dfb100f37-kube-api-access-82kt7\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166115 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166124 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166132 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166142 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166151 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bc33fb6e-0b09-479a-9825-3f7dfb100f37-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166159 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166168 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.166176 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c-metrics-certs-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.168296 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.168685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-config" (OuterVolumeSpecName: "config") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.170384 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-scripts" (OuterVolumeSpecName: "scripts") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.181718 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.182024 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a092179-7f71-47ce-9764-df909331a819-kube-api-access-hfv6x" (OuterVolumeSpecName: "kube-api-access-hfv6x") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "kube-api-access-hfv6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.188309 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.198982 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a4439f-50e7-4620-bf95-48d591ec6e3a-kube-api-access-dk7rt" (OuterVolumeSpecName: "kube-api-access-dk7rt") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "kube-api-access-dk7rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.237762 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-twsmd" event={"ID":"1cbfa737-846e-425a-b747-869de3afaf89","Type":"ContainerStarted","Data":"d4ccb898e983d485bc8f6c352fa8ca1d593a4657196659bf6df683e309259d74"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.265989 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-744d-account-create-update-b756p" event={"ID":"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8","Type":"ContainerStarted","Data":"1939160aa46ff65777a36824a8919155e6ed01b421a01430aa5cd7fbe93d5cd5"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.266084 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.280710 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.281970 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb\") pod \"4a092179-7f71-47ce-9764-df909331a819\" (UID: \"4a092179-7f71-47ce-9764-df909331a819\") " Mar 13 15:28:18 crc kubenswrapper[4786]: W0313 15:28:18.282132 4786 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4a092179-7f71-47ce-9764-df909331a819/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282162 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282709 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282728 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282741 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282750 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk7rt\" (UniqueName: \"kubernetes.io/projected/73a4439f-50e7-4620-bf95-48d591ec6e3a-kube-api-access-dk7rt\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282774 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282783 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfv6x\" (UniqueName: \"kubernetes.io/projected/4a092179-7f71-47ce-9764-df909331a819-kube-api-access-hfv6x\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282792 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282809 
4786 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.282817 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73a4439f-50e7-4620-bf95-48d591ec6e3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293465 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_73a4439f-50e7-4620-bf95-48d591ec6e3a/ovsdbserver-nb/0.log" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293513 4786 generic.go:334] "Generic (PLEG): container finished" podID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerID="d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78" exitCode=2 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293530 4786 generic.go:334] "Generic (PLEG): container finished" podID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerID="19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293601 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293586 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"73a4439f-50e7-4620-bf95-48d591ec6e3a","Type":"ContainerDied","Data":"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"73a4439f-50e7-4620-bf95-48d591ec6e3a","Type":"ContainerDied","Data":"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293659 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"73a4439f-50e7-4620-bf95-48d591ec6e3a","Type":"ContainerDied","Data":"2758cd3bd25a52c3b20f2dd60ae0f14a4fd20cef44e5213d2d35be9d72126eb8"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.293679 4786 scope.go:117] "RemoveContainer" containerID="d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.304927 4786 generic.go:334] "Generic (PLEG): container finished" podID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerID="72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.305002 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54bc4948fd-47bbp" event={"ID":"94c381c8-c97e-4159-9bb4-3ede8f12d6e0","Type":"ContainerDied","Data":"72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.310241 4786 generic.go:334] "Generic (PLEG): container finished" podID="8402e30a-1517-41be-b468-1959c4b7621b" containerID="c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13" exitCode=143 Mar 13 15:28:18 crc 
kubenswrapper[4786]: I0313 15:28:18.310310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8402e30a-1517-41be-b468-1959c4b7621b","Type":"ContainerDied","Data":"c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.316848 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jc2x9_a9a45c6c-6521-40c9-af91-ac00f5427ce4/openstack-network-exporter/0.log" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.316962 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jc2x9" event={"ID":"a9a45c6c-6521-40c9-af91-ac00f5427ce4","Type":"ContainerDied","Data":"0c476858a0770da223391ca2917b88793837a68ba8245bc851494ca2ed801653"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.316993 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jc2x9" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.321043 4786 generic.go:334] "Generic (PLEG): container finished" podID="bc33fb6e-0b09-479a-9825-3f7dfb100f37" containerID="5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97" exitCode=137 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.321146 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.333350 4786 generic.go:334] "Generic (PLEG): container finished" podID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerID="94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.333480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e313e1cc-ed94-4e28-84f8-d053dcffb16a","Type":"ContainerDied","Data":"94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.337343 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.338981 4786 scope.go:117] "RemoveContainer" containerID="19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.340765 4786 generic.go:334] "Generic (PLEG): container finished" podID="82f2e6fd-58ee-4002-b167-096b3b715233" containerID="c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.340818 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82f2e6fd-58ee-4002-b167-096b3b715233","Type":"ContainerDied","Data":"c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.342503 4786 generic.go:334] "Generic (PLEG): container finished" podID="826794b9-41ec-4cab-bc85-d426d8e2a38b" 
containerID="241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.342542 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-865786b7bb-9cnjb" event={"ID":"826794b9-41ec-4cab-bc85-d426d8e2a38b","Type":"ContainerDied","Data":"241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.344833 4786 generic.go:334] "Generic (PLEG): container finished" podID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.344890 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerDied","Data":"679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.351378 4786 generic.go:334] "Generic (PLEG): container finished" podID="31158646-2c0c-4098-bd3e-ea307fa78716" containerID="fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.351451 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" event={"ID":"31158646-2c0c-4098-bd3e-ea307fa78716","Type":"ContainerDied","Data":"fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.360928 4786 scope.go:117] "RemoveContainer" containerID="d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.361486 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78\": container 
with ID starting with d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78 not found: ID does not exist" containerID="d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.361510 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78"} err="failed to get container status \"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78\": rpc error: code = NotFound desc = could not find container \"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78\": container with ID starting with d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78 not found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.361528 4786 scope.go:117] "RemoveContainer" containerID="19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.361711 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13\": container with ID starting with 19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13 not found: ID does not exist" containerID="19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.361725 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13"} err="failed to get container status \"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13\": rpc error: code = NotFound desc = could not find container \"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13\": container with ID starting with 19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13 not 
found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.361738 4786 scope.go:117] "RemoveContainer" containerID="d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.362144 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.362269 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78"} err="failed to get container status \"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78\": rpc error: code = NotFound desc = could not find container \"d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78\": container with ID starting with d15f0970f8025a64341b77e2535159eff9a0ebe848adb5cbb9818855cce0ca78 not found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.362284 4786 scope.go:117] "RemoveContainer" containerID="19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.363043 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13"} err="failed to get container status \"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13\": rpc error: code = NotFound desc = could not find container \"19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13\": container with ID starting with 
19d7c1aff752750c8a385c5fe71c0bef9992da7a701af3a399d7c9c1a358ac13 not found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.363060 4786 scope.go:117] "RemoveContainer" containerID="8a0e90c6c16997dc571cfff737c2e9e9f5438e0dcf64bd4368c4878cc5a75790" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.363624 4786 generic.go:334] "Generic (PLEG): container finished" podID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerID="84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.363687 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd4f4c6c-zg28d" event={"ID":"21658ad3-b8e8-4743-b2c7-da4782850abc","Type":"ContainerDied","Data":"84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.373344 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bfm8s" event={"ID":"5a56ecb5-18f5-4645-8626-03f231f99f03","Type":"ContainerDied","Data":"6890e84989717cb2ec0c7f4516a8183cf76f220acdd3dbfcb76da2d18bb4274f"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.373404 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bfm8s" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.381799 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.382486 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ebcabfa0-9931-4c7d-a0e7-a0337bb887ea/ovsdbserver-sb/0.log" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.382534 4786 generic.go:334] "Generic (PLEG): container finished" podID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerID="97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.382588 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea","Type":"ContainerDied","Data":"97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.382616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ebcabfa0-9931-4c7d-a0e7-a0337bb887ea","Type":"ContainerDied","Data":"da35b4d74f29bea1e8deaa12f4a584643b7d0e274d406b996bfe687261e117f2"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.382680 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.384548 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.384566 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.384579 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.388136 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.391291 4786 generic.go:334] "Generic (PLEG): container finished" podID="8609052d-1ba2-4888-b973-05c8e4663632" containerID="da85c0b8ba7100a5a7dd0b1919a32c21a917bfb0e41b3a25bb5709d1b826d78c" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.391377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8609052d-1ba2-4888-b973-05c8e4663632","Type":"ContainerDied","Data":"da85c0b8ba7100a5a7dd0b1919a32c21a917bfb0e41b3a25bb5709d1b826d78c"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.412109 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.426657 4786 scope.go:117] "RemoveContainer" containerID="5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.433028 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-config" (OuterVolumeSpecName: "config") pod "4a092179-7f71-47ce-9764-df909331a819" (UID: "4a092179-7f71-47ce-9764-df909331a819"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442930 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442954 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442961 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442967 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442974 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442980 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442989 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.442996 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443005 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443011 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443020 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443029 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443093 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443104 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443113 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443121 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443130 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443138 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443156 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 
15:28:18.443165 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443175 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.443184 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.448486 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-jc2x9"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.467891 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c/ovn-northd/0.log" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.468007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c","Type":"ContainerDied","Data":"e862aef3a0d6193b8389161bb11d80336a8633537f33da76b2dd7fdb646b3609"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.468120 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.471089 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.477497 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-jc2x9"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.477989 4786 generic.go:334] "Generic (PLEG): container finished" podID="4a092179-7f71-47ce-9764-df909331a819" containerID="d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f" exitCode=0 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.478072 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" event={"ID":"4a092179-7f71-47ce-9764-df909331a819","Type":"ContainerDied","Data":"d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.478101 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" event={"ID":"4a092179-7f71-47ce-9764-df909331a819","Type":"ContainerDied","Data":"ca9e2922a461a1501211db7fa9e74b9e832dcc502667cdbe57945185d0cd0787"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.478179 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdb8f6449-2k7lw" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.485035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "73a4439f-50e7-4620-bf95-48d591ec6e3a" (UID: "73a4439f-50e7-4620-bf95-48d591ec6e3a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.486733 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.486770 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.486783 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.486791 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a4439f-50e7-4620-bf95-48d591ec6e3a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.486800 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a092179-7f71-47ce-9764-df909331a819-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.487346 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerID="1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.487406 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579f6547f-hnpzx" event={"ID":"70dc1403-e7e9-4200-9a87-e3538a17c350","Type":"ContainerDied","Data":"1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.488501 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" (UID: "ebcabfa0-9931-4c7d-a0e7-a0337bb887ea"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.490204 4786 generic.go:334] "Generic (PLEG): container finished" podID="54a96a07-f63f-47d9-9191-0548996f01a7" containerID="4c92acc295fd57ca23b0e288711fb9a3af052bd9a754638cb9ef30ab57e5b073" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.490257 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a96a07-f63f-47d9-9191-0548996f01a7","Type":"ContainerDied","Data":"4c92acc295fd57ca23b0e288711fb9a3af052bd9a754638cb9ef30ab57e5b073"} Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.493138 4786 generic.go:334] "Generic (PLEG): container finished" podID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerID="18ddb14c07d08e39a714e8241c91bc621e8b9460fed525911386dabe3a845484" exitCode=143 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.493242 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"7ee1d973-a40c-4db0-8cc7-1c64ece074ac","Type":"ContainerDied","Data":"18ddb14c07d08e39a714e8241c91bc621e8b9460fed525911386dabe3a845484"} Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.495988 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: if [ -n "nova_api" ]; then Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="nova_api" Mar 13 15:28:18 crc kubenswrapper[4786]: else Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:18 crc kubenswrapper[4786]: fi Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:18 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:18 crc kubenswrapper[4786]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:18 crc kubenswrapper[4786]: # support updates Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.497522 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-df75-account-create-update-92ppw" podUID="cda350b1-fe2a-4ed0-b7e1-f9a425076f56" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.499430 4786 scope.go:117] "RemoveContainer" containerID="5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.501876 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97\": container with ID starting with 5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97 not found: ID does not exist" containerID="5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.502143 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97"} err="failed to get container status \"5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97\": rpc error: code = NotFound desc = could not find container \"5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97\": container with ID starting with 5b9e9e2cd18e1519f6207dee7353b8bf62c3f40f8b1a93750d8100aaacad7b97 not found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.503199 4786 scope.go:117] "RemoveContainer" 
containerID="3d075c999ac56a272c8241240052d1c6ec3450ff98e7174795b8e3a6b89162b8" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.519184 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bfm8s"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.525734 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bfm8s"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.532064 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-58644c46cc-wt6m2"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.532327 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-58644c46cc-wt6m2" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-httpd" containerID="cri-o://53da64bef97437f332bd54aa2f803a2a48a781201e425bdc912c1c54d853dc83" gracePeriod=30 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.532567 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-58644c46cc-wt6m2" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-server" containerID="cri-o://04eabc29555d142e746eeaf8979b97c2eb9926d8687b2f8cbceebf6652f56b8c" gracePeriod=30 Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.538600 4786 scope.go:117] "RemoveContainer" containerID="2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.542544 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-2k7lw"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.580739 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c887202-25c6-42a6-975d-fa96e5e8673a" path="/var/lib/kubelet/pods/3c887202-25c6-42a6-975d-fa96e5e8673a/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.581436 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="50342ecd-90d8-411c-aef8-53c0337267a9" path="/var/lib/kubelet/pods/50342ecd-90d8-411c-aef8-53c0337267a9/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.582310 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b11a69-a25d-4d50-aaa6-b19905e48741" path="/var/lib/kubelet/pods/55b11a69-a25d-4d50-aaa6-b19905e48741/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.582768 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56def392-368a-4ab2-8958-559494d013cb" path="/var/lib/kubelet/pods/56def392-368a-4ab2-8958-559494d013cb/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.583954 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" path="/var/lib/kubelet/pods/5a56ecb5-18f5-4645-8626-03f231f99f03/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.585643 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d32db15-e92e-4497-82ed-94cfe47acde6" path="/var/lib/kubelet/pods/5d32db15-e92e-4497-82ed-94cfe47acde6/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.586160 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a874a7-d303-4a3a-b765-5d3316ad5c2b" path="/var/lib/kubelet/pods/75a874a7-d303-4a3a-b765-5d3316ad5c2b/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.588582 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.588626 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8576af97-74b4-4318-b12b-02c8b106a6eb" path="/var/lib/kubelet/pods/8576af97-74b4-4318-b12b-02c8b106a6eb/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.592083 4786 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8803f262-0a85-48aa-86f4-b01a2fc3692b" path="/var/lib/kubelet/pods/8803f262-0a85-48aa-86f4-b01a2fc3692b/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.592834 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aecf238-05ea-4d9c-8cac-58a9aaa8490d" path="/var/lib/kubelet/pods/9aecf238-05ea-4d9c-8cac-58a9aaa8490d/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.596090 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4b51a9-f1dd-4b95-85a6-fb2098b786e4" path="/var/lib/kubelet/pods/9f4b51a9-f1dd-4b95-85a6-fb2098b786e4/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.596626 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a45c6c-6521-40c9-af91-ac00f5427ce4" path="/var/lib/kubelet/pods/a9a45c6c-6521-40c9-af91-ac00f5427ce4/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.597469 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc33fb6e-0b09-479a-9825-3f7dfb100f37" path="/var/lib/kubelet/pods/bc33fb6e-0b09-479a-9825-3f7dfb100f37/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.638726 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda4c1d6-a015-4455-8462-cd93e9fed73e" path="/var/lib/kubelet/pods/bda4c1d6-a015-4455-8462-cd93e9fed73e/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.639781 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd170a9-ab19-4fbc-8948-6810a0ba8615" path="/var/lib/kubelet/pods/dcd170a9-ab19-4fbc-8948-6810a0ba8615/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.640555 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556" path="/var/lib/kubelet/pods/dfc0bcf0-d9d4-4cdb-bdbb-a6bd0b008556/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.641278 4786 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ed5fea62-85d0-4afd-a716-1a450c1baafb" path="/var/lib/kubelet/pods/ed5fea62-85d0-4afd-a716-1a450c1baafb/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.662770 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8a98ec-82a3-418d-82ea-d0ff210dd78d" path="/var/lib/kubelet/pods/fb8a98ec-82a3-418d-82ea-d0ff210dd78d/volumes" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.664283 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdb8f6449-2k7lw"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.681170 4786 scope.go:117] "RemoveContainer" containerID="97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.687434 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.702100 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.731423 4786 scope.go:117] "RemoveContainer" containerID="2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.739518 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0\": container with ID starting with 2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0 not found: ID does not exist" containerID="2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.739571 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0"} err="failed to get container status 
\"2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0\": rpc error: code = NotFound desc = could not find container \"2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0\": container with ID starting with 2c116906b602048dd0e06c5f16a4fc2195328d38c866486ba242127db0aaa2a0 not found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.739595 4786 scope.go:117] "RemoveContainer" containerID="97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.739978 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049\": container with ID starting with 97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049 not found: ID does not exist" containerID="97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.740027 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049"} err="failed to get container status \"97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049\": rpc error: code = NotFound desc = could not find container \"97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049\": container with ID starting with 97a52518dcb386f11e331c7ba5fe06330877a47c06d25fb1ca45c8ff11e64049 not found: ID does not exist" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.740058 4786 scope.go:117] "RemoveContainer" containerID="44fca03dce57cb826c43f7929ebd5d7925bdeb707e7610d77a9fed038f82c78f" Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.750652 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.753519 4786 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.766290 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.791767 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.823395 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ef50-account-create-update-bdrf5"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.875750 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cfc0-account-create-update-8f6fk"] Mar 13 15:28:18 crc kubenswrapper[4786]: I0313 15:28:18.902630 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-x8krv"] Mar 13 15:28:18 crc kubenswrapper[4786]: W0313 15:28:18.914626 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6297c29d_09ec_49e7_ae22_6b20962603a7.slice/crio-a851b1e516036752eea126d71749a496f752be33f1e55d9812d4b1b2b51d3fb0 WatchSource:0}: Error finding container a851b1e516036752eea126d71749a496f752be33f1e55d9812d4b1b2b51d3fb0: Status 404 returned error can't find the container with id a851b1e516036752eea126d71749a496f752be33f1e55d9812d4b1b2b51d3fb0 Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.927201 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:18 crc 
kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: if [ -n "cinder" ]; then Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="cinder" Mar 13 15:28:18 crc kubenswrapper[4786]: else Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:18 crc kubenswrapper[4786]: fi Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:18 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:18 crc kubenswrapper[4786]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:18 crc kubenswrapper[4786]: # support updates Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.928601 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-ef50-account-create-update-bdrf5" podUID="f8bd4962-9f30-4961-8dcb-7604ef587af6" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.929357 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:18 crc 
kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: if [ -n "barbican" ]; then Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="barbican" Mar 13 15:28:18 crc kubenswrapper[4786]: else Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:18 crc kubenswrapper[4786]: fi Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:18 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:18 crc kubenswrapper[4786]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:18 crc kubenswrapper[4786]: # support updates Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.930451 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-cfc0-account-create-update-8f6fk" podUID="6297c29d-09ec-49e7-ae22-6b20962603a7" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.935062 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:18 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: if [ -n "nova_cell1" ]; then Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="nova_cell1" Mar 13 15:28:18 crc kubenswrapper[4786]: else Mar 13 15:28:18 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:18 crc kubenswrapper[4786]: fi Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:18 crc kubenswrapper[4786]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:18 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:18 crc kubenswrapper[4786]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:18 crc kubenswrapper[4786]: # support updates Mar 13 15:28:18 crc kubenswrapper[4786]: Mar 13 15:28:18 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:18 crc kubenswrapper[4786]: E0313 15:28:18.936250 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-a23c-account-create-update-x8krv" podUID="226d3f05-c83e-4267-b6da-e30ceae91436" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.064612 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-b756p" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.068846 4786 scope.go:117] "RemoveContainer" containerID="c3e06b1c63981a1ee17faedf31473e5b94b2397e348e12cc2c507ce411aeb8c1" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.080888 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-twsmd" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.098952 4786 scope.go:117] "RemoveContainer" containerID="d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.108080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-operator-scripts\") pod \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.108250 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7ctk\" (UniqueName: \"kubernetes.io/projected/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-kube-api-access-x7ctk\") pod \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\" (UID: \"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.115428 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-kube-api-access-x7ctk" (OuterVolumeSpecName: "kube-api-access-x7ctk") pod "304e2eba-fa9b-43b4-91b9-7bfdc48f9de8" (UID: "304e2eba-fa9b-43b4-91b9-7bfdc48f9de8"). InnerVolumeSpecName "kube-api-access-x7ctk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.117506 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "304e2eba-fa9b-43b4-91b9-7bfdc48f9de8" (UID: "304e2eba-fa9b-43b4-91b9-7bfdc48f9de8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.132806 4786 scope.go:117] "RemoveContainer" containerID="3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.160276 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.187816 4786 scope.go:117] "RemoveContainer" containerID="d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.193420 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f\": container with ID starting with d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f not found: ID does not exist" containerID="d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.193476 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f"} err="failed to get container status \"d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f\": rpc error: code = NotFound desc = could not find container \"d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f\": container with ID starting with d8e2db39abea4e15dc9ab9c4d355ca796e5e8a25989f9f6ccf581af627a3244f not found: ID does not exist" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.193524 4786 scope.go:117] "RemoveContainer" containerID="3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca" Mar 13 15:28:19 crc 
kubenswrapper[4786]: E0313 15:28:19.195185 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca\": container with ID starting with 3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca not found: ID does not exist" containerID="3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.195232 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca"} err="failed to get container status \"3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca\": rpc error: code = NotFound desc = could not find container \"3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca\": container with ID starting with 3831d83463ef19fae91d20fee983aeb1579854514a1ceee628cf9ee706d7fdca not found: ID does not exist" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.210041 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbfa737-846e-425a-b747-869de3afaf89-operator-scripts\") pod \"1cbfa737-846e-425a-b747-869de3afaf89\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.210190 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94c95\" (UniqueName: \"kubernetes.io/projected/1cbfa737-846e-425a-b747-869de3afaf89-kube-api-access-94c95\") pod \"1cbfa737-846e-425a-b747-869de3afaf89\" (UID: \"1cbfa737-846e-425a-b747-869de3afaf89\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.210599 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.210611 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7ctk\" (UniqueName: \"kubernetes.io/projected/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8-kube-api-access-x7ctk\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.210787 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbfa737-846e-425a-b747-869de3afaf89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cbfa737-846e-425a-b747-869de3afaf89" (UID: "1cbfa737-846e-425a-b747-869de3afaf89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.216337 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbfa737-846e-425a-b747-869de3afaf89-kube-api-access-94c95" (OuterVolumeSpecName: "kube-api-access-94c95") pod "1cbfa737-846e-425a-b747-869de3afaf89" (UID: "1cbfa737-846e-425a-b747-869de3afaf89"). InnerVolumeSpecName "kube-api-access-94c95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.253544 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311530 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311613 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2q8\" (UniqueName: \"kubernetes.io/projected/45c96441-7032-49b6-b5fe-129ed26c4e38-kube-api-access-mq2q8\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311642 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-operator-scripts\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311682 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-kolla-config\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-default\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311806 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-galera-tls-certs\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311835 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-generated\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.311914 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-combined-ca-bundle\") pod \"45c96441-7032-49b6-b5fe-129ed26c4e38\" (UID: \"45c96441-7032-49b6-b5fe-129ed26c4e38\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.312401 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbfa737-846e-425a-b747-869de3afaf89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.312429 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94c95\" (UniqueName: \"kubernetes.io/projected/1cbfa737-846e-425a-b747-869de3afaf89-kube-api-access-94c95\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.314186 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.314749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.315418 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.316199 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.323394 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c96441-7032-49b6-b5fe-129ed26c4e38-kube-api-access-mq2q8" (OuterVolumeSpecName: "kube-api-access-mq2q8") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "kube-api-access-mq2q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.341972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.372192 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.383704 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "45c96441-7032-49b6-b5fe-129ed26c4e38" (UID: "45c96441-7032-49b6-b5fe-129ed26c4e38"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413766 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413802 4786 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413814 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45c96441-7032-49b6-b5fe-129ed26c4e38-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413824 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c96441-7032-49b6-b5fe-129ed26c4e38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413845 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413866 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2q8\" (UniqueName: \"kubernetes.io/projected/45c96441-7032-49b6-b5fe-129ed26c4e38-kube-api-access-mq2q8\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.413876 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc 
kubenswrapper[4786]: I0313 15:28:19.413884 4786 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45c96441-7032-49b6-b5fe-129ed26c4e38-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.439444 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.452099 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.461479 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.508650 4786 generic.go:334] "Generic (PLEG): container finished" podID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerID="28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83" exitCode=0 Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.508711 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45c96441-7032-49b6-b5fe-129ed26c4e38","Type":"ContainerDied","Data":"28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.508736 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45c96441-7032-49b6-b5fe-129ed26c4e38","Type":"ContainerDied","Data":"f6da44f243c1bf50ae5e7e7f96501f1b52ebb5ea2544e64b0397bc4f358ff661"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.508738 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.508755 4786 scope.go:117] "RemoveContainer" containerID="28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.510989 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef50-account-create-update-bdrf5" event={"ID":"f8bd4962-9f30-4961-8dcb-7604ef587af6","Type":"ContainerStarted","Data":"35caf65140a99c3295452984e333f7b6a6e992affd8d0a2ca4a42af11a1e6e7a"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.514479 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-config-data\") pod \"3e362102-0c50-415e-8108-82eb18632381\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.514524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-combined-ca-bundle\") pod \"3e362102-0c50-415e-8108-82eb18632381\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.514580 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckwqz\" (UniqueName: \"kubernetes.io/projected/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-kube-api-access-ckwqz\") pod \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.514610 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-nova-novncproxy-tls-certs\") pod \"3e362102-0c50-415e-8108-82eb18632381\" (UID: 
\"3e362102-0c50-415e-8108-82eb18632381\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.514731 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-vencrypt-tls-certs\") pod \"3e362102-0c50-415e-8108-82eb18632381\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.515431 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-combined-ca-bundle\") pod \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.515577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6k9d\" (UniqueName: \"kubernetes.io/projected/3e362102-0c50-415e-8108-82eb18632381-kube-api-access-l6k9d\") pod \"3e362102-0c50-415e-8108-82eb18632381\" (UID: \"3e362102-0c50-415e-8108-82eb18632381\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.515608 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-config-data\") pod \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\" (UID: \"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.516276 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.517432 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e362102-0c50-415e-8108-82eb18632381" containerID="2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150" exitCode=0 Mar 13 15:28:19 crc 
kubenswrapper[4786]: I0313 15:28:19.517517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e362102-0c50-415e-8108-82eb18632381","Type":"ContainerDied","Data":"2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.517547 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e362102-0c50-415e-8108-82eb18632381","Type":"ContainerDied","Data":"3cdc000bb5d0f0c5280ba172145d2076ac73bf9e48cabd2a89dc4ed2dd4eecb9"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.517613 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.522991 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e362102-0c50-415e-8108-82eb18632381-kube-api-access-l6k9d" (OuterVolumeSpecName: "kube-api-access-l6k9d") pod "3e362102-0c50-415e-8108-82eb18632381" (UID: "3e362102-0c50-415e-8108-82eb18632381"). InnerVolumeSpecName "kube-api-access-l6k9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.534317 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a23c-account-create-update-x8krv" event={"ID":"226d3f05-c83e-4267-b6da-e30ceae91436","Type":"ContainerStarted","Data":"177690b7ecc97e889d5b429882cc19e17152f75db3f3a754c2f07b4f1ec96650"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.535842 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-kube-api-access-ckwqz" (OuterVolumeSpecName: "kube-api-access-ckwqz") pod "55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" (UID: "55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e"). InnerVolumeSpecName "kube-api-access-ckwqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.556315 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-twsmd" event={"ID":"1cbfa737-846e-425a-b747-869de3afaf89","Type":"ContainerDied","Data":"d4ccb898e983d485bc8f6c352fa8ca1d593a4657196659bf6df683e309259d74"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.556541 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-twsmd" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.561878 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-744d-account-create-update-b756p" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.561872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-744d-account-create-update-b756p" event={"ID":"304e2eba-fa9b-43b4-91b9-7bfdc48f9de8","Type":"ContainerDied","Data":"1939160aa46ff65777a36824a8919155e6ed01b421a01430aa5cd7fbe93d5cd5"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.563983 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cfc0-account-create-update-8f6fk" event={"ID":"6297c29d-09ec-49e7-ae22-6b20962603a7","Type":"ContainerStarted","Data":"a851b1e516036752eea126d71749a496f752be33f1e55d9812d4b1b2b51d3fb0"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.619044 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6k9d\" (UniqueName: \"kubernetes.io/projected/3e362102-0c50-415e-8108-82eb18632381-kube-api-access-l6k9d\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.619075 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckwqz\" (UniqueName: \"kubernetes.io/projected/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-kube-api-access-ckwqz\") on node \"crc\" DevicePath \"\"" Mar 13 
15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.619167 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.619214 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data podName:65e5ca7c-1c5e-4f9e-85df-a92feaeddb43 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:23.61919917 +0000 UTC m=+1533.782410981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data") pod "rabbitmq-cell1-server-0" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43") : configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.625181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-config-data" (OuterVolumeSpecName: "config-data") pod "55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" (UID: "55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.625274 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" (UID: "55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.639782 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-config-data" (OuterVolumeSpecName: "config-data") pod "3e362102-0c50-415e-8108-82eb18632381" (UID: "3e362102-0c50-415e-8108-82eb18632381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.640134 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e362102-0c50-415e-8108-82eb18632381" (UID: "3e362102-0c50-415e-8108-82eb18632381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.643319 4786 generic.go:334] "Generic (PLEG): container finished" podID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerID="04eabc29555d142e746eeaf8979b97c2eb9926d8687b2f8cbceebf6652f56b8c" exitCode=0 Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.643359 4786 generic.go:334] "Generic (PLEG): container finished" podID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerID="53da64bef97437f332bd54aa2f803a2a48a781201e425bdc912c1c54d853dc83" exitCode=0 Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.643432 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58644c46cc-wt6m2" event={"ID":"3b415188-88f4-447e-a1e9-ca424047ee8e","Type":"ContainerDied","Data":"04eabc29555d142e746eeaf8979b97c2eb9926d8687b2f8cbceebf6652f56b8c"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.643462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58644c46cc-wt6m2" 
event={"ID":"3b415188-88f4-447e-a1e9-ca424047ee8e","Type":"ContainerDied","Data":"53da64bef97437f332bd54aa2f803a2a48a781201e425bdc912c1c54d853dc83"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.647701 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "3e362102-0c50-415e-8108-82eb18632381" (UID: "3e362102-0c50-415e-8108-82eb18632381"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.673323 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "3e362102-0c50-415e-8108-82eb18632381" (UID: "3e362102-0c50-415e-8108-82eb18632381"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.676028 4786 generic.go:334] "Generic (PLEG): container finished" podID="55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" containerID="afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83" exitCode=0 Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.676075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e","Type":"ContainerDied","Data":"afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.676097 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e","Type":"ContainerDied","Data":"2689979b02e556a2ef958860b3ada3bb782622953f6709b64a670f5a6fad9ac7"} Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.676174 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.721575 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.721606 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.721619 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.721631 4786 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.721642 4786 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e362102-0c50-415e-8108-82eb18632381-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.721654 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.765542 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.765689 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.769967 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.770089 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.772071 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.772108 4786 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.772170 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.772196 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.781915 4786 scope.go:117] "RemoveContainer" containerID="0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.792348 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.804616 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.864153 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-twsmd"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.880042 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-twsmd"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.881928 4786 scope.go:117] "RemoveContainer" containerID="28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.882293 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58644c46cc-wt6m2" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.882308 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83\": container with ID starting with 28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83 not found: ID does not exist" containerID="28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.882527 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83"} err="failed to get container status \"28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83\": rpc error: code = NotFound desc = could not find container \"28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83\": container with ID starting with 28e54d435daafadc7b363e435d0d4d08f481b7d0ccc657960ad56beb8611ef83 not found: ID does not exist" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.882660 4786 scope.go:117] "RemoveContainer" containerID="0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.883182 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa\": container with ID starting with 
0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa not found: ID does not exist" containerID="0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.883221 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa"} err="failed to get container status \"0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa\": rpc error: code = NotFound desc = could not find container \"0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa\": container with ID starting with 0690e16b40cf11bef9db2cb4f1818b283cdaea732ad51b462bf8e44c7a3fb0aa not found: ID does not exist" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.883275 4786 scope.go:117] "RemoveContainer" containerID="2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.897422 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.904668 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.910451 4786 scope.go:117] "RemoveContainer" containerID="2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.910909 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150\": container with ID starting with 2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150 not found: ID does not exist" containerID="2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.910939 4786 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150"} err="failed to get container status \"2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150\": rpc error: code = NotFound desc = could not find container \"2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150\": container with ID starting with 2225eea975d36ecdfe3381d87b54ef4377a7c5765488ad913534e8e0c2e9a150 not found: ID does not exist" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.910963 4786 scope.go:117] "RemoveContainer" containerID="afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.922673 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-b756p"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.931169 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-744d-account-create-update-b756p"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933325 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-config-data\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-combined-ca-bundle\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933409 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-run-httpd\") pod 
\"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933443 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-etc-swift\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k54n6\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-kube-api-access-k54n6\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933585 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-public-tls-certs\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933663 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-log-httpd\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.933794 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-internal-tls-certs\") pod \"3b415188-88f4-447e-a1e9-ca424047ee8e\" (UID: \"3b415188-88f4-447e-a1e9-ca424047ee8e\") " Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.934006 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.934514 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.940572 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.942020 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.946968 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-kube-api-access-k54n6" (OuterVolumeSpecName: "kube-api-access-k54n6") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "kube-api-access-k54n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.948026 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.949417 4786 scope.go:117] "RemoveContainer" containerID="afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83" Mar 13 15:28:19 crc kubenswrapper[4786]: E0313 15:28:19.950023 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83\": container with ID starting with afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83 not found: ID does not exist" containerID="afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83" Mar 13 15:28:19 crc kubenswrapper[4786]: I0313 15:28:19.950054 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83"} err="failed to get container status \"afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83\": rpc error: code = NotFound desc = could not find container \"afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83\": container with ID starting with afad54d28cd772782f044e5bfab06667f5382613e0f1a39277035f8fa7937e83 not found: ID does not exist" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.008246 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.009007 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.010074 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-config-data" (OuterVolumeSpecName: "config-data") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.045544 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.045603 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.045616 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.045626 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc 
kubenswrapper[4786]: I0313 15:28:20.045637 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k54n6\" (UniqueName: \"kubernetes.io/projected/3b415188-88f4-447e-a1e9-ca424047ee8e-kube-api-access-k54n6\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.045650 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.045660 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b415188-88f4-447e-a1e9-ca424047ee8e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.055102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b415188-88f4-447e-a1e9-ca424047ee8e" (UID: "3b415188-88f4-447e-a1e9-ca424047ee8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.057473 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d567z"] Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.057889 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="ovsdbserver-nb" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.057903 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="ovsdbserver-nb" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.057930 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a45c6c-6521-40c9-af91-ac00f5427ce4" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.057939 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a45c6c-6521-40c9-af91-ac00f5427ce4" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.057950 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e362102-0c50-415e-8108-82eb18632381" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.057959 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e362102-0c50-415e-8108-82eb18632381" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.057971 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-httpd" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.057979 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-httpd" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.057991 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="ovn-northd" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.057998 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="ovn-northd" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058010 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-server" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058018 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-server" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058031 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058038 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058053 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerName="galera" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058061 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerName="galera" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058080 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058089 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058102 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="init" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058110 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="init" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058124 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerName="mysql-bootstrap" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058131 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerName="mysql-bootstrap" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058142 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" containerName="nova-scheduler-scheduler" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058150 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" containerName="nova-scheduler-scheduler" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058160 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058167 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058176 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="ovsdbserver-sb" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058379 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="ovsdbserver-sb" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058398 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="dnsmasq-dns" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058406 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="dnsmasq-dns" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.058425 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058433 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058847 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="ovsdbserver-sb" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058901 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-server" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058913 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058929 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" containerName="galera" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058939 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e362102-0c50-415e-8108-82eb18632381" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058949 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a45c6c-6521-40c9-af91-ac00f5427ce4" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058961 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" containerName="ovsdbserver-nb" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058976 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" containerName="proxy-httpd" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058985 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a092179-7f71-47ce-9764-df909331a819" containerName="dnsmasq-dns" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.058997 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a56ecb5-18f5-4645-8626-03f231f99f03" containerName="ovn-controller" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.059014 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.059022 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" containerName="openstack-network-exporter" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.059035 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" containerName="ovn-northd" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.059043 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" containerName="nova-scheduler-scheduler" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.060500 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.065288 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.075746 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d567z"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.099919 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ef50-account-create-update-bdrf5" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.110147 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-x8krv" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.147454 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmf7t\" (UniqueName: \"kubernetes.io/projected/1355c383-567c-4c71-a13c-e46f29dd5f8e-kube-api-access-gmf7t\") pod \"root-account-create-update-d567z\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") " pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.147751 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1355c383-567c-4c71-a13c-e46f29dd5f8e-operator-scripts\") pod \"root-account-create-update-d567z\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") " pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.148076 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b415188-88f4-447e-a1e9-ca424047ee8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 
15:28:20.209978 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.250924 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226d3f05-c83e-4267-b6da-e30ceae91436-operator-scripts\") pod \"226d3f05-c83e-4267-b6da-e30ceae91436\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.251006 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srsv\" (UniqueName: \"kubernetes.io/projected/226d3f05-c83e-4267-b6da-e30ceae91436-kube-api-access-7srsv\") pod \"226d3f05-c83e-4267-b6da-e30ceae91436\" (UID: \"226d3f05-c83e-4267-b6da-e30ceae91436\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.251028 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rz4l\" (UniqueName: \"kubernetes.io/projected/f8bd4962-9f30-4961-8dcb-7604ef587af6-kube-api-access-8rz4l\") pod \"f8bd4962-9f30-4961-8dcb-7604ef587af6\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.251143 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd4962-9f30-4961-8dcb-7604ef587af6-operator-scripts\") pod \"f8bd4962-9f30-4961-8dcb-7604ef587af6\" (UID: \"f8bd4962-9f30-4961-8dcb-7604ef587af6\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.251403 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmf7t\" (UniqueName: \"kubernetes.io/projected/1355c383-567c-4c71-a13c-e46f29dd5f8e-kube-api-access-gmf7t\") pod 
\"root-account-create-update-d567z\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") " pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.251438 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1355c383-567c-4c71-a13c-e46f29dd5f8e-operator-scripts\") pod \"root-account-create-update-d567z\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") " pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.252416 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd4962-9f30-4961-8dcb-7604ef587af6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8bd4962-9f30-4961-8dcb-7604ef587af6" (UID: "f8bd4962-9f30-4961-8dcb-7604ef587af6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.252449 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/226d3f05-c83e-4267-b6da-e30ceae91436-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "226d3f05-c83e-4267-b6da-e30ceae91436" (UID: "226d3f05-c83e-4267-b6da-e30ceae91436"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.252710 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1355c383-567c-4c71-a13c-e46f29dd5f8e-operator-scripts\") pod \"root-account-create-update-d567z\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") " pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.255447 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bd4962-9f30-4961-8dcb-7604ef587af6-kube-api-access-8rz4l" (OuterVolumeSpecName: "kube-api-access-8rz4l") pod "f8bd4962-9f30-4961-8dcb-7604ef587af6" (UID: "f8bd4962-9f30-4961-8dcb-7604ef587af6"). InnerVolumeSpecName "kube-api-access-8rz4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.257205 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226d3f05-c83e-4267-b6da-e30ceae91436-kube-api-access-7srsv" (OuterVolumeSpecName: "kube-api-access-7srsv") pod "226d3f05-c83e-4267-b6da-e30ceae91436" (UID: "226d3f05-c83e-4267-b6da-e30ceae91436"). InnerVolumeSpecName "kube-api-access-7srsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.270673 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmf7t\" (UniqueName: \"kubernetes.io/projected/1355c383-567c-4c71-a13c-e46f29dd5f8e-kube-api-access-gmf7t\") pod \"root-account-create-update-d567z\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") " pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.284220 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-8f6fk" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.297927 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-df75-account-create-update-92ppw" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352219 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qbd8\" (UniqueName: \"kubernetes.io/projected/6297c29d-09ec-49e7-ae22-6b20962603a7-kube-api-access-4qbd8\") pod \"6297c29d-09ec-49e7-ae22-6b20962603a7\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352463 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6297c29d-09ec-49e7-ae22-6b20962603a7-operator-scripts\") pod \"6297c29d-09ec-49e7-ae22-6b20962603a7\" (UID: \"6297c29d-09ec-49e7-ae22-6b20962603a7\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352506 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf6jt\" (UniqueName: \"kubernetes.io/projected/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-kube-api-access-gf6jt\") pod \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352566 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-operator-scripts\") pod \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\" (UID: \"cda350b1-fe2a-4ed0-b7e1-f9a425076f56\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352940 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bd4962-9f30-4961-8dcb-7604ef587af6-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352953 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226d3f05-c83e-4267-b6da-e30ceae91436-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352962 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srsv\" (UniqueName: \"kubernetes.io/projected/226d3f05-c83e-4267-b6da-e30ceae91436-kube-api-access-7srsv\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.352974 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rz4l\" (UniqueName: \"kubernetes.io/projected/f8bd4962-9f30-4961-8dcb-7604ef587af6-kube-api-access-8rz4l\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.353354 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cda350b1-fe2a-4ed0-b7e1-f9a425076f56" (UID: "cda350b1-fe2a-4ed0-b7e1-f9a425076f56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.354297 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6297c29d-09ec-49e7-ae22-6b20962603a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6297c29d-09ec-49e7-ae22-6b20962603a7" (UID: "6297c29d-09ec-49e7-ae22-6b20962603a7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.361078 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-kube-api-access-gf6jt" (OuterVolumeSpecName: "kube-api-access-gf6jt") pod "cda350b1-fe2a-4ed0-b7e1-f9a425076f56" (UID: "cda350b1-fe2a-4ed0-b7e1-f9a425076f56"). InnerVolumeSpecName "kube-api-access-gf6jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.361128 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6297c29d-09ec-49e7-ae22-6b20962603a7-kube-api-access-4qbd8" (OuterVolumeSpecName: "kube-api-access-4qbd8") pod "6297c29d-09ec-49e7-ae22-6b20962603a7" (UID: "6297c29d-09ec-49e7-ae22-6b20962603a7"). InnerVolumeSpecName "kube-api-access-4qbd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.393564 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d567z" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.454384 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.454424 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qbd8\" (UniqueName: \"kubernetes.io/projected/6297c29d-09ec-49e7-ae22-6b20962603a7-kube-api-access-4qbd8\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.454436 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6297c29d-09ec-49e7-ae22-6b20962603a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.454447 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf6jt\" (UniqueName: \"kubernetes.io/projected/cda350b1-fe2a-4ed0-b7e1-f9a425076f56-kube-api-access-gf6jt\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.457122 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.457619 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-central-agent" containerID="cri-o://2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.458904 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="proxy-httpd" 
containerID="cri-o://c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.459118 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-notification-agent" containerID="cri-o://06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.459182 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="sg-core" containerID="cri-o://48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.502174 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.502381 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="14fdb1f3-fd7f-4b31-ac34-42438c44720a" containerName="kube-state-metrics" containerID="cri-o://6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.507161 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557303 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt99p\" (UniqueName: \"kubernetes.io/projected/826794b9-41ec-4cab-bc85-d426d8e2a38b-kube-api-access-zt99p\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557436 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-combined-ca-bundle\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557486 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-internal-tls-certs\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557573 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/826794b9-41ec-4cab-bc85-d426d8e2a38b-logs\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557609 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-scripts\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557645 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-config-data\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.557695 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-public-tls-certs\") pod \"826794b9-41ec-4cab-bc85-d426d8e2a38b\" (UID: \"826794b9-41ec-4cab-bc85-d426d8e2a38b\") " Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.561428 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/826794b9-41ec-4cab-bc85-d426d8e2a38b-logs" (OuterVolumeSpecName: "logs") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.572670 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-scripts" (OuterVolumeSpecName: "scripts") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.582557 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826794b9-41ec-4cab-bc85-d426d8e2a38b-kube-api-access-zt99p" (OuterVolumeSpecName: "kube-api-access-zt99p") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "kube-api-access-zt99p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.636670 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c" path="/var/lib/kubelet/pods/02cf3ce6-6cec-451b-82c0-fdf9f1b7e10c/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.638654 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cbfa737-846e-425a-b747-869de3afaf89" path="/var/lib/kubelet/pods/1cbfa737-846e-425a-b747-869de3afaf89/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.650606 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304e2eba-fa9b-43b4-91b9-7bfdc48f9de8" path="/var/lib/kubelet/pods/304e2eba-fa9b-43b4-91b9-7bfdc48f9de8/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.658156 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e362102-0c50-415e-8108-82eb18632381" path="/var/lib/kubelet/pods/3e362102-0c50-415e-8108-82eb18632381/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.659407 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c96441-7032-49b6-b5fe-129ed26c4e38" path="/var/lib/kubelet/pods/45c96441-7032-49b6-b5fe-129ed26c4e38/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.662468 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a092179-7f71-47ce-9764-df909331a819" path="/var/lib/kubelet/pods/4a092179-7f71-47ce-9764-df909331a819/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.673986 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e" path="/var/lib/kubelet/pods/55a6c6cd-96e1-4e00-a6ad-eb4c23d1585e/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.676995 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a4439f-50e7-4620-bf95-48d591ec6e3a" 
path="/var/lib/kubelet/pods/73a4439f-50e7-4620-bf95-48d591ec6e3a/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.678421 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebcabfa0-9931-4c7d-a0e7-a0337bb887ea" path="/var/lib/kubelet/pods/ebcabfa0-9931-4c7d-a0e7-a0337bb887ea/volumes" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.704792 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/826794b9-41ec-4cab-bc85-d426d8e2a38b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.704839 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.704867 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt99p\" (UniqueName: \"kubernetes.io/projected/826794b9-41ec-4cab-bc85-d426d8e2a38b-kube-api-access-zt99p\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.841640 4786 generic.go:334] "Generic (PLEG): container finished" podID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerID="d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64" exitCode=0 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.841756 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-865786b7bb-9cnjb" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.848975 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.875798 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-865786b7bb-9cnjb" event={"ID":"826794b9-41ec-4cab-bc85-d426d8e2a38b","Type":"ContainerDied","Data":"d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64"} Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.875846 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-865786b7bb-9cnjb" event={"ID":"826794b9-41ec-4cab-bc85-d426d8e2a38b","Type":"ContainerDied","Data":"011b25deaef5769111da43bda3c7b349931dde136cd29387f48e5f16283a07da"} Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.875881 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.875899 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4eba-account-create-update-d9lrg"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.875910 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4eba-account-create-update-d9lrg"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.875924 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4eba-account-create-update-r4vjh"] Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.876261 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-log" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.876279 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-log" Mar 13 15:28:20 crc kubenswrapper[4786]: E0313 15:28:20.876302 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-api" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 
15:28:20.876308 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-api" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.876462 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-api" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.876481 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" containerName="placement-log" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.876997 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4eba-account-create-update-r4vjh"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.877018 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-n6qr2"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.877028 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cn57v"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.877093 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.879954 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="393ef3eb-1c5f-4a06-a815-fe394d372ee6" containerName="memcached" containerID="cri-o://02c8962944741a5e21d0094c41f35cebfff6ed06de1fa2ee62e1f8c35e344d05" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.880092 4786 scope.go:117] "RemoveContainer" containerID="d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.889187 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.889456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.891131 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-config-data" (OuterVolumeSpecName: "config-data") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.901408 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.902982 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7bd8dddbcb-wmbct"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.904266 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7bd8dddbcb-wmbct" podUID="9f459571-980e-439d-9dc2-72c0461a20c9" containerName="keystone-api" containerID="cri-o://9ac0d8df2995d1804d0af60083262cacb5da80d9832a9709ff3c01990e8e9cea" gracePeriod=30 Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.904728 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-df75-account-create-update-92ppw" event={"ID":"cda350b1-fe2a-4ed0-b7e1-f9a425076f56","Type":"ContainerDied","Data":"dbcf6f1d1806705bd451487457fdaa0afe9c1706f1f013b480eb48fbc8197d54"} Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.904877 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-df75-account-create-update-92ppw" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.917839 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.918258 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.918327 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.924508 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-n6qr2"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.950997 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cn57v"] Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.984516 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58644c46cc-wt6m2" event={"ID":"3b415188-88f4-447e-a1e9-ca424047ee8e","Type":"ContainerDied","Data":"363cdc4ab6757aea8542ba65a07dc6c93649f3ff64028068f1a81e799bc90fc5"} Mar 13 15:28:20 crc kubenswrapper[4786]: I0313 15:28:20.984701 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-58644c46cc-wt6m2" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.023707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.023828 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shz6f\" (UniqueName: \"kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.031903 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-x4fwm"] Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.035436 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "826794b9-41ec-4cab-bc85-d426d8e2a38b" (UID: "826794b9-41ec-4cab-bc85-d426d8e2a38b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.046198 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-x4fwm"] Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.053319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a23c-account-create-update-x8krv" event={"ID":"226d3f05-c83e-4267-b6da-e30ceae91436","Type":"ContainerDied","Data":"177690b7ecc97e889d5b429882cc19e17152f75db3f3a754c2f07b4f1ec96650"} Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.053443 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a23c-account-create-update-x8krv" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.060619 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4eba-account-create-update-r4vjh"] Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.066766 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.178:8776/healthcheck\": read tcp 10.217.0.2:36112->10.217.0.178:8776: read: connection reset by peer" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.124301 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d567z"] Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.125194 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.125322 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-shz6f\" (UniqueName: \"kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.125451 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/826794b9-41ec-4cab-bc85-d426d8e2a38b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.125775 4786 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.125825 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts podName:40c8d88f-3851-455c-bfad-2e30c7531250 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:21.625805811 +0000 UTC m=+1531.789017622 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts") pod "keystone-4eba-account-create-update-r4vjh" (UID: "40c8d88f-3851-455c-bfad-2e30c7531250") : configmap "openstack-scripts" not found Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.132544 4786 projected.go:194] Error preparing data for projected volume kube-api-access-shz6f for pod openstack/keystone-4eba-account-create-update-r4vjh: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.132611 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f podName:40c8d88f-3851-455c-bfad-2e30c7531250 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:21.632593349 +0000 UTC m=+1531.795805160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-shz6f" (UniqueName: "kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f") pod "keystone-4eba-account-create-update-r4vjh" (UID: "40c8d88f-3851-455c-bfad-2e30c7531250") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.135419 4786 generic.go:334] "Generic (PLEG): container finished" podID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerID="c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975" exitCode=0 Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.135470 4786 generic.go:334] "Generic (PLEG): container finished" podID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerID="48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb" exitCode=2 Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.135507 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerDied","Data":"c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975"} Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.135549 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerDied","Data":"48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb"} Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.152215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cfc0-account-create-update-8f6fk" event={"ID":"6297c29d-09ec-49e7-ae22-6b20962603a7","Type":"ContainerDied","Data":"a851b1e516036752eea126d71749a496f752be33f1e55d9812d4b1b2b51d3fb0"} Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.152348 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cfc0-account-create-update-8f6fk" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.157640 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ef50-account-create-update-bdrf5" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.157804 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ef50-account-create-update-bdrf5" event={"ID":"f8bd4962-9f30-4961-8dcb-7604ef587af6","Type":"ContainerDied","Data":"35caf65140a99c3295452984e333f7b6a6e992affd8d0a2ca4a42af11a1e6e7a"} Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.229214 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d567z"] Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.367480 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerName="galera" containerID="cri-o://ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86" gracePeriod=30 Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.390418 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.391832 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.393932 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.393969 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerName="galera" Mar 13 15:28:21 crc kubenswrapper[4786]: W0313 15:28:21.431076 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1355c383_567c_4c71_a13c_e46f29dd5f8e.slice/crio-02604f20799a709924ba9b2526fa3f3896d95e874833831f5f069013df8e6487 WatchSource:0}: Error finding container 02604f20799a709924ba9b2526fa3f3896d95e874833831f5f069013df8e6487: Status 404 returned error can't find the container with id 02604f20799a709924ba9b2526fa3f3896d95e874833831f5f069013df8e6487 Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.434289 4786 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 15:28:21 crc kubenswrapper[4786]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 13 15:28:21 crc kubenswrapper[4786]: Mar 13 15:28:21 crc kubenswrapper[4786]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 13 15:28:21 crc kubenswrapper[4786]: Mar 13 15:28:21 crc kubenswrapper[4786]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 13 15:28:21 crc kubenswrapper[4786]: Mar 13 15:28:21 crc kubenswrapper[4786]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 13 15:28:21 crc kubenswrapper[4786]: Mar 13 15:28:21 crc kubenswrapper[4786]: if [ -n "" ]; then Mar 13 
15:28:21 crc kubenswrapper[4786]: GRANT_DATABASE="" Mar 13 15:28:21 crc kubenswrapper[4786]: else Mar 13 15:28:21 crc kubenswrapper[4786]: GRANT_DATABASE="*" Mar 13 15:28:21 crc kubenswrapper[4786]: fi Mar 13 15:28:21 crc kubenswrapper[4786]: Mar 13 15:28:21 crc kubenswrapper[4786]: # going for maximum compatibility here: Mar 13 15:28:21 crc kubenswrapper[4786]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 13 15:28:21 crc kubenswrapper[4786]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 13 15:28:21 crc kubenswrapper[4786]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 13 15:28:21 crc kubenswrapper[4786]: # support updates Mar 13 15:28:21 crc kubenswrapper[4786]: Mar 13 15:28:21 crc kubenswrapper[4786]: $MYSQL_CMD < logger="UnhandledError" Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.435399 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-d567z" podUID="1355c383-567c-4c71-a13c-e46f29dd5f8e" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.640638 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shz6f\" (UniqueName: \"kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:21 crc kubenswrapper[4786]: I0313 15:28:21.641089 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " 
pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.641210 4786 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.641259 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts podName:40c8d88f-3851-455c-bfad-2e30c7531250 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:22.641242259 +0000 UTC m=+1532.804454070 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts") pod "keystone-4eba-account-create-update-r4vjh" (UID: "40c8d88f-3851-455c-bfad-2e30c7531250") : configmap "openstack-scripts" not found Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.647197 4786 projected.go:194] Error preparing data for projected volume kube-api-access-shz6f for pod openstack/keystone-4eba-account-create-update-r4vjh: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.647254 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f podName:40c8d88f-3851-455c-bfad-2e30c7531250 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:22.647236528 +0000 UTC m=+1532.810448339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-shz6f" (UniqueName: "kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f") pod "keystone-4eba-account-create-update-r4vjh" (UID: "40c8d88f-3851-455c-bfad-2e30c7531250") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.648822 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0 is running failed: container process not found" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.662820 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0 is running failed: container process not found" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.664417 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0 is running failed: container process not found" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 13 15:28:21 crc kubenswrapper[4786]: E0313 15:28:21.664485 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="2c66255a-19d5-4417-bf43-f7f5bfff892a" containerName="nova-cell0-conductor-conductor"
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.088652 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32 is running failed: container process not found" containerID="ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.089190 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32 is running failed: container process not found" containerID="ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.089573 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32 is running failed: container process not found" containerID="ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.089604 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" containerName="nova-cell1-conductor-conductor"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.145251 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.164787 4786 scope.go:117] "RemoveContainer" containerID="241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.172350 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.173091 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-shz6f operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-4eba-account-create-update-r4vjh" podUID="40c8d88f-3851-455c-bfad-2e30c7531250"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.179130 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.180676 4786 generic.go:334] "Generic (PLEG): container finished" podID="8609052d-1ba2-4888-b973-05c8e4663632" containerID="f8fb520920d825dd25aaead12b8048c225a995e2f744b27b8f73261839e24997" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.180771 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8609052d-1ba2-4888-b973-05c8e4663632","Type":"ContainerDied","Data":"f8fb520920d825dd25aaead12b8048c225a995e2f744b27b8f73261839e24997"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.180814 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8609052d-1ba2-4888-b973-05c8e4663632","Type":"ContainerDied","Data":"b38ed59273199c0dc37509362794a0bd3dd98b8d473e82a4d867134cd3608924"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.180824 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38ed59273199c0dc37509362794a0bd3dd98b8d473e82a4d867134cd3608924"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.185489 4786 generic.go:334] "Generic (PLEG): container finished" podID="393ef3eb-1c5f-4a06-a815-fe394d372ee6" containerID="02c8962944741a5e21d0094c41f35cebfff6ed06de1fa2ee62e1f8c35e344d05" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.185565 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"393ef3eb-1c5f-4a06-a815-fe394d372ee6","Type":"ContainerDied","Data":"02c8962944741a5e21d0094c41f35cebfff6ed06de1fa2ee62e1f8c35e344d05"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.185597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"393ef3eb-1c5f-4a06-a815-fe394d372ee6","Type":"ContainerDied","Data":"f5923edfa2272384b6d33906e3b89ee8ea401a15793f63bd65c140f510f4b2a5"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.185611 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5923edfa2272384b6d33906e3b89ee8ea401a15793f63bd65c140f510f4b2a5"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.198011 4786 generic.go:334] "Generic (PLEG): container finished" podID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerID="89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.198117 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e313e1cc-ed94-4e28-84f8-d053dcffb16a","Type":"ContainerDied","Data":"89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.198195 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e313e1cc-ed94-4e28-84f8-d053dcffb16a","Type":"ContainerDied","Data":"4a50e4da5ff6dc67e6d3badae4e378cb9557236b3ff8c20495e62517ed87ccb1"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.198265 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.198950 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54bc4948fd-47bbp"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.202042 4786 generic.go:334] "Generic (PLEG): container finished" podID="14fdb1f3-fd7f-4b31-ac34-42438c44720a" containerID="6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b" exitCode=2
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.202104 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14fdb1f3-fd7f-4b31-ac34-42438c44720a","Type":"ContainerDied","Data":"6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.202129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14fdb1f3-fd7f-4b31-ac34-42438c44720a","Type":"ContainerDied","Data":"e51693fd6cfbfdedf1aea0b3db8284ad8b36ad6d7676caae6af0a1fcd84ee1d2"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.202175 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.207622 4786 scope.go:117] "RemoveContainer" containerID="d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64"
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.208283 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64\": container with ID starting with d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64 not found: ID does not exist" containerID="d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.208313 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64"} err="failed to get container status \"d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64\": rpc error: code = NotFound desc = could not find container \"d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64\": container with ID starting with d651b494475b22814af03da351fd3f9c567348fe5c8f88b18e93899fecf1de64 not found: ID does not exist"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.208335 4786 scope.go:117] "RemoveContainer" containerID="241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e"
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.214456 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e\": container with ID starting with 241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e not found: ID does not exist" containerID="241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.214529 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e"} err="failed to get container status \"241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e\": rpc error: code = NotFound desc = could not find container \"241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e\": container with ID starting with 241e8a4975b00c96a19131b14d1a5471eaf130b065a0bbaa917b757ad5a5af3e not found: ID does not exist"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.214555 4786 scope.go:117] "RemoveContainer" containerID="04eabc29555d142e746eeaf8979b97c2eb9926d8687b2f8cbceebf6652f56b8c"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.215092 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ef50-account-create-update-bdrf5"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.227464 4786 generic.go:334] "Generic (PLEG): container finished" podID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerID="ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.227565 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54bc4948fd-47bbp"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.230743 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ef50-account-create-update-bdrf5"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.230784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54bc4948fd-47bbp" event={"ID":"94c381c8-c97e-4159-9bb4-3ede8f12d6e0","Type":"ContainerDied","Data":"ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.230815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54bc4948fd-47bbp" event={"ID":"94c381c8-c97e-4159-9bb4-3ede8f12d6e0","Type":"ContainerDied","Data":"62377fae63cb2c4eef5f4e3219777937337e511debb2239e060c7b16f2d22b75"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.233476 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.239566 4786 generic.go:334] "Generic (PLEG): container finished" podID="54a96a07-f63f-47d9-9191-0548996f01a7" containerID="6fee30319ab254dc362b9cb5404359a46216a536ddf834f8c5f2549a88b37dcc" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.240042 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a96a07-f63f-47d9-9191-0548996f01a7","Type":"ContainerDied","Data":"6fee30319ab254dc362b9cb5404359a46216a536ddf834f8c5f2549a88b37dcc"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.240150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54a96a07-f63f-47d9-9191-0548996f01a7","Type":"ContainerDied","Data":"713add33d33bda07f6a00ae2a6b7ef225bb7870c3cdb4671a24900eba46a92a6"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.240262 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="713add33d33bda07f6a00ae2a6b7ef225bb7870c3cdb4671a24900eba46a92a6"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.249656 4786 generic.go:334] "Generic (PLEG): container finished" podID="2c66255a-19d5-4417-bf43-f7f5bfff892a" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.250020 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.250037 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c66255a-19d5-4417-bf43-f7f5bfff892a","Type":"ContainerDied","Data":"8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.250355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2c66255a-19d5-4417-bf43-f7f5bfff892a","Type":"ContainerDied","Data":"3a5b4bf2950734fade486ffb993657244ff2ff0cbddd1dfdee6d161458e13991"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.253602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d567z" event={"ID":"1355c383-567c-4c71-a13c-e46f29dd5f8e","Type":"ContainerStarted","Data":"02604f20799a709924ba9b2526fa3f3896d95e874833831f5f069013df8e6487"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.254031 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7579f6547f-hnpzx"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.263238 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cfc0-account-create-update-8f6fk"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264424 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-logs\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264554 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-combined-ca-bundle\") pod \"8402e30a-1517-41be-b468-1959c4b7621b\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-nova-metadata-tls-certs\") pod \"8402e30a-1517-41be-b468-1959c4b7621b\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-combined-ca-bundle\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264664 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-config-data\") pod \"8402e30a-1517-41be-b468-1959c4b7621b\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264690 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8402e30a-1517-41be-b468-1959c4b7621b-logs\") pod \"8402e30a-1517-41be-b468-1959c4b7621b\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.264728 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-public-tls-certs\") pod \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-combined-ca-bundle\") pod \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-public-tls-certs\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265741 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-config-data\") pod \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-internal-tls-certs\") pod \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265869 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzxgb\" (UniqueName: \"kubernetes.io/projected/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-api-access-rzxgb\") pod \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265898 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data-custom\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265926 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd7qd\" (UniqueName: \"kubernetes.io/projected/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-kube-api-access-jd7qd\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.265951 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7whtx\" (UniqueName: \"kubernetes.io/projected/e313e1cc-ed94-4e28-84f8-d053dcffb16a-kube-api-access-7whtx\") pod \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266267 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-internal-tls-certs\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266292 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data\") pod \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\" (UID: \"94c381c8-c97e-4159-9bb4-3ede8f12d6e0\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hfsx\" (UniqueName: \"kubernetes.io/projected/8402e30a-1517-41be-b468-1959c4b7621b-kube-api-access-6hfsx\") pod \"8402e30a-1517-41be-b468-1959c4b7621b\" (UID: \"8402e30a-1517-41be-b468-1959c4b7621b\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-certs\") pod \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266437 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e313e1cc-ed94-4e28-84f8-d053dcffb16a-logs\") pod \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266498 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-config\") pod \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\" (UID: \"14fdb1f3-fd7f-4b31-ac34-42438c44720a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.266527 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-combined-ca-bundle\") pod \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\" (UID: \"e313e1cc-ed94-4e28-84f8-d053dcffb16a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.277694 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-logs" (OuterVolumeSpecName: "logs") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.278356 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cfc0-account-create-update-8f6fk"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.279754 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e313e1cc-ed94-4e28-84f8-d053dcffb16a-logs" (OuterVolumeSpecName: "logs") pod "e313e1cc-ed94-4e28-84f8-d053dcffb16a" (UID: "e313e1cc-ed94-4e28-84f8-d053dcffb16a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.281244 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-api-access-rzxgb" (OuterVolumeSpecName: "kube-api-access-rzxgb") pod "14fdb1f3-fd7f-4b31-ac34-42438c44720a" (UID: "14fdb1f3-fd7f-4b31-ac34-42438c44720a"). InnerVolumeSpecName "kube-api-access-rzxgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.281341 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.281894 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8402e30a-1517-41be-b468-1959c4b7621b-logs" (OuterVolumeSpecName: "logs") pod "8402e30a-1517-41be-b468-1959c4b7621b" (UID: "8402e30a-1517-41be-b468-1959c4b7621b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.282057 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8402e30a-1517-41be-b468-1959c4b7621b-kube-api-access-6hfsx" (OuterVolumeSpecName: "kube-api-access-6hfsx") pod "8402e30a-1517-41be-b468-1959c4b7621b" (UID: "8402e30a-1517-41be-b468-1959c4b7621b"). InnerVolumeSpecName "kube-api-access-6hfsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.291358 4786 scope.go:117] "RemoveContainer" containerID="53da64bef97437f332bd54aa2f803a2a48a781201e425bdc912c1c54d853dc83"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.296941 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.298516 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-x8krv"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.303366 4786 generic.go:334] "Generic (PLEG): container finished" podID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerID="2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.304907 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerDied","Data":"2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.307120 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a23c-account-create-update-x8krv"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.311973 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.316237 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.319942 4786 generic.go:334] "Generic (PLEG): container finished" podID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerID="26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.320086 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7579f6547f-hnpzx"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.320462 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-df75-account-create-update-92ppw"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.320752 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579f6547f-hnpzx" event={"ID":"70dc1403-e7e9-4200-9a87-e3538a17c350","Type":"ContainerDied","Data":"26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.321357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7579f6547f-hnpzx" event={"ID":"70dc1403-e7e9-4200-9a87-e3538a17c350","Type":"ContainerDied","Data":"0a0d673bcf1c7d505deff9320e646f50ec818815f39bf1e9f9a5801612c21ae9"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.329298 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-df75-account-create-update-92ppw"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.333632 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e313e1cc-ed94-4e28-84f8-d053dcffb16a-kube-api-access-7whtx" (OuterVolumeSpecName: "kube-api-access-7whtx") pod "e313e1cc-ed94-4e28-84f8-d053dcffb16a" (UID: "e313e1cc-ed94-4e28-84f8-d053dcffb16a"). InnerVolumeSpecName "kube-api-access-7whtx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.334410 4786 generic.go:334] "Generic (PLEG): container finished" podID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerID="15848ce920e90b7648bab5e64f68ab325ba1bd7a143ec84d35fc3e9aa2a6e33f" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.334600 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7ee1d973-a40c-4db0-8cc7-1c64ece074ac","Type":"ContainerDied","Data":"15848ce920e90b7648bab5e64f68ab325ba1bd7a143ec84d35fc3e9aa2a6e33f"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.334737 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.334953 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-865786b7bb-9cnjb"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.337301 4786 scope.go:117] "RemoveContainer" containerID="89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.338210 4786 generic.go:334] "Generic (PLEG): container finished" podID="8402e30a-1517-41be-b468-1959c4b7621b" containerID="aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.338250 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8402e30a-1517-41be-b468-1959c4b7621b","Type":"ContainerDied","Data":"aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.338280 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8402e30a-1517-41be-b468-1959c4b7621b","Type":"ContainerDied","Data":"eb5d1863880bfc7d9eb63e471ad4f565fc87984711c146488a211011d3bacb97"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.338330 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.340096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-kube-api-access-jd7qd" (OuterVolumeSpecName: "kube-api-access-jd7qd") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "kube-api-access-jd7qd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.341209 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.349936 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-865786b7bb-9cnjb"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.349996 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6b8537d-23ab-4c8d-9ca7-b307562baad8","Type":"ContainerDied","Data":"ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32"}
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.348934 4786 generic.go:334] "Generic (PLEG): container finished" podID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" containerID="ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32" exitCode=0
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.352013 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-58644c46cc-wt6m2"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.358232 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-58644c46cc-wt6m2"]
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368331 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-config-data\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-httpd-run\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368391 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-scripts\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-combined-ca-bundle\") pod \"2c66255a-19d5-4417-bf43-f7f5bfff892a\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368445 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data-custom\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368481 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-combined-ca-bundle\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368505 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-config-data\") pod \"2c66255a-19d5-4417-bf43-f7f5bfff892a\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-combined-ca-bundle\") pod \"70dc1403-e7e9-4200-9a87-e3538a17c350\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368538 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-combined-ca-bundle\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368579 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfn6l\" (UniqueName: \"kubernetes.io/projected/70dc1403-e7e9-4200-9a87-e3538a17c350-kube-api-access-kfn6l\") pod \"70dc1403-e7e9-4200-9a87-e3538a17c350\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368597 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-public-tls-certs\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368620 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5lg4\" (UniqueName: \"kubernetes.io/projected/8609052d-1ba2-4888-b973-05c8e4663632-kube-api-access-f5lg4\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368635 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-internal-tls-certs\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368652 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70dc1403-e7e9-4200-9a87-e3538a17c350-logs\") pod \"70dc1403-e7e9-4200-9a87-e3538a17c350\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368675 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnnxb\" (UniqueName: \"kubernetes.io/projected/54a96a07-f63f-47d9-9191-0548996f01a7-kube-api-access-hnnxb\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368691 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368707 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-config-data\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368720 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-httpd-run\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-config-data\") pod \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368772 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-scripts\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368787 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-memcached-tls-certs\") pod \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data-custom\") pod \"70dc1403-e7e9-4200-9a87-e3538a17c350\" (UID: \"70dc1403-e7e9-4200-9a87-e3538a17c350\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-internal-tls-certs\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368877 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qskg7\" (UniqueName: \"kubernetes.io/projected/2c66255a-19d5-4417-bf43-f7f5bfff892a-kube-api-access-qskg7\") pod \"2c66255a-19d5-4417-bf43-f7f5bfff892a\" (UID: \"2c66255a-19d5-4417-bf43-f7f5bfff892a\") " Mar 13 15:28:22 crc kubenswrapper[4786]: 
I0313 15:28:22.368892 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54a96a07-f63f-47d9-9191-0548996f01a7-logs\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368905 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-scripts\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368921 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vb6r\" (UniqueName: \"kubernetes.io/projected/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-kube-api-access-7vb6r\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368946 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-logs\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.368975 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kolla-config\") pod \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369002 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data\") pod \"70dc1403-e7e9-4200-9a87-e3538a17c350\" (UID: 
\"70dc1403-e7e9-4200-9a87-e3538a17c350\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369017 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-public-tls-certs\") pod \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\" (UID: \"7ee1d973-a40c-4db0-8cc7-1c64ece074ac\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369031 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-combined-ca-bundle\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369068 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-combined-ca-bundle\") pod \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369083 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a96a07-f63f-47d9-9191-0548996f01a7-etc-machine-id\") pod \"54a96a07-f63f-47d9-9191-0548996f01a7\" (UID: \"54a96a07-f63f-47d9-9191-0548996f01a7\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-logs\") pod \"8609052d-1ba2-4888-b973-05c8e4663632\" (UID: \"8609052d-1ba2-4888-b973-05c8e4663632\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369141 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vwnw\" (UniqueName: \"kubernetes.io/projected/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kube-api-access-4vwnw\") pod \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\" (UID: \"393ef3eb-1c5f-4a06-a815-fe394d372ee6\") " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369413 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8402e30a-1517-41be-b468-1959c4b7621b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369424 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzxgb\" (UniqueName: \"kubernetes.io/projected/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-api-access-rzxgb\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369434 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369442 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd7qd\" (UniqueName: \"kubernetes.io/projected/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-kube-api-access-jd7qd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369451 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7whtx\" (UniqueName: \"kubernetes.io/projected/e313e1cc-ed94-4e28-84f8-d053dcffb16a-kube-api-access-7whtx\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369459 4786 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6hfsx\" (UniqueName: \"kubernetes.io/projected/8402e30a-1517-41be-b468-1959c4b7621b-kube-api-access-6hfsx\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369466 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e313e1cc-ed94-4e28-84f8-d053dcffb16a-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.369474 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.390298 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.393587 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70dc1403-e7e9-4200-9a87-e3538a17c350-logs" (OuterVolumeSpecName: "logs") pod "70dc1403-e7e9-4200-9a87-e3538a17c350" (UID: "70dc1403-e7e9-4200-9a87-e3538a17c350"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.393632 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-config-data" (OuterVolumeSpecName: "config-data") pod "393ef3eb-1c5f-4a06-a815-fe394d372ee6" (UID: "393ef3eb-1c5f-4a06-a815-fe394d372ee6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.394059 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-logs" (OuterVolumeSpecName: "logs") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.394309 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "393ef3eb-1c5f-4a06-a815-fe394d372ee6" (UID: "393ef3eb-1c5f-4a06-a815-fe394d372ee6"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.401284 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54a96a07-f63f-47d9-9191-0548996f01a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.401439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.401515 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a96a07-f63f-47d9-9191-0548996f01a7-logs" (OuterVolumeSpecName: "logs") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.403084 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-logs" (OuterVolumeSpecName: "logs") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.425585 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kube-api-access-4vwnw" (OuterVolumeSpecName: "kube-api-access-4vwnw") pod "393ef3eb-1c5f-4a06-a815-fe394d372ee6" (UID: "393ef3eb-1c5f-4a06-a815-fe394d372ee6"). InnerVolumeSpecName "kube-api-access-4vwnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.425738 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.425975 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dc1403-e7e9-4200-9a87-e3538a17c350-kube-api-access-kfn6l" (OuterVolumeSpecName: "kube-api-access-kfn6l") pod "70dc1403-e7e9-4200-9a87-e3538a17c350" (UID: "70dc1403-e7e9-4200-9a87-e3538a17c350"). InnerVolumeSpecName "kube-api-access-kfn6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.429186 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8609052d-1ba2-4888-b973-05c8e4663632-kube-api-access-f5lg4" (OuterVolumeSpecName: "kube-api-access-f5lg4") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "kube-api-access-f5lg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.430375 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-scripts" (OuterVolumeSpecName: "scripts") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.430460 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-scripts" (OuterVolumeSpecName: "scripts") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.430560 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a96a07-f63f-47d9-9191-0548996f01a7-kube-api-access-hnnxb" (OuterVolumeSpecName: "kube-api-access-hnnxb") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "kube-api-access-hnnxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.434291 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.436718 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c66255a-19d5-4417-bf43-f7f5bfff892a-kube-api-access-qskg7" (OuterVolumeSpecName: "kube-api-access-qskg7") pod "2c66255a-19d5-4417-bf43-f7f5bfff892a" (UID: "2c66255a-19d5-4417-bf43-f7f5bfff892a"). InnerVolumeSpecName "kube-api-access-qskg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.436827 4786 scope.go:117] "RemoveContainer" containerID="94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.438061 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-scripts" (OuterVolumeSpecName: "scripts") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.450514 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-kube-api-access-7vb6r" (OuterVolumeSpecName: "kube-api-access-7vb6r") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "kube-api-access-7vb6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.450610 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "70dc1403-e7e9-4200-9a87-e3538a17c350" (UID: "70dc1403-e7e9-4200-9a87-e3538a17c350"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.472397 4786 scope.go:117] "RemoveContainer" containerID="89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a" Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.473061 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a\": container with ID starting with 89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a not found: ID does not exist" containerID="89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473088 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a"} err="failed to get container status \"89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a\": rpc error: code = NotFound desc = could not find 
container \"89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a\": container with ID starting with 89497a79ba70deab2d15ed43cd966b2aba9acdade7f9af6c2abf5acb36dc808a not found: ID does not exist" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473108 4786 scope.go:117] "RemoveContainer" containerID="94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473174 4786 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473205 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54a96a07-f63f-47d9-9191-0548996f01a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473214 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vwnw\" (UniqueName: \"kubernetes.io/projected/393ef3eb-1c5f-4a06-a815-fe394d372ee6-kube-api-access-4vwnw\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473226 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473235 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473247 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: 
I0313 15:28:22.473257 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473266 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfn6l\" (UniqueName: \"kubernetes.io/projected/70dc1403-e7e9-4200-9a87-e3538a17c350-kube-api-access-kfn6l\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473275 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5lg4\" (UniqueName: \"kubernetes.io/projected/8609052d-1ba2-4888-b973-05c8e4663632-kube-api-access-f5lg4\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473286 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70dc1403-e7e9-4200-9a87-e3538a17c350-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.473285 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8\": container with ID starting with 94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8 not found: ID does not exist" containerID="94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473305 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8"} err="failed to get container status \"94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8\": rpc error: code = NotFound desc = could not find container \"94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8\": container with ID 
starting with 94feb6a41287552cebdc794f24400d4e2062dbf4d281941edfd902ac9db4b1a8 not found: ID does not exist" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473316 4786 scope.go:117] "RemoveContainer" containerID="6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473295 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnnxb\" (UniqueName: \"kubernetes.io/projected/54a96a07-f63f-47d9-9191-0548996f01a7-kube-api-access-hnnxb\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473466 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473481 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8609052d-1ba2-4888-b973-05c8e4663632-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473500 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/393ef3eb-1c5f-4a06-a815-fe394d372ee6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473512 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473523 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473543 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/54a96a07-f63f-47d9-9191-0548996f01a7-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473558 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qskg7\" (UniqueName: \"kubernetes.io/projected/2c66255a-19d5-4417-bf43-f7f5bfff892a-kube-api-access-qskg7\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473569 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473578 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vb6r\" (UniqueName: \"kubernetes.io/projected/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-kube-api-access-7vb6r\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.473591 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-logs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.479133 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.530008 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.558155 4786 scope.go:117] "RemoveContainer" containerID="6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b"
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.560082 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b\": container with ID starting with 6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b not found: ID does not exist" containerID="6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.561160 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b"} err="failed to get container status \"6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b\": rpc error: code = NotFound desc = could not find container \"6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b\": container with ID starting with 6deb2ed0b4109d160a09c5b16a895d5eb7b6101f118d1ceee9304c7fc466796b not found: ID does not exist"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.561203 4786 scope.go:117] "RemoveContainer" containerID="ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.565830 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="226d3f05-c83e-4267-b6da-e30ceae91436" path="/var/lib/kubelet/pods/226d3f05-c83e-4267-b6da-e30ceae91436/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.567510 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b29131e-a3b3-45c1-945c-b28cdfff2773" path="/var/lib/kubelet/pods/2b29131e-a3b3-45c1-945c-b28cdfff2773/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.568836 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b415188-88f4-447e-a1e9-ca424047ee8e" path="/var/lib/kubelet/pods/3b415188-88f4-447e-a1e9-ca424047ee8e/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.569584 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6297c29d-09ec-49e7-ae22-6b20962603a7" path="/var/lib/kubelet/pods/6297c29d-09ec-49e7-ae22-6b20962603a7/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.570306 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6430e36a-b911-434c-8d5b-41bf29fa781f" path="/var/lib/kubelet/pods/6430e36a-b911-434c-8d5b-41bf29fa781f/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.572131 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826794b9-41ec-4cab-bc85-d426d8e2a38b" path="/var/lib/kubelet/pods/826794b9-41ec-4cab-bc85-d426d8e2a38b/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.574963 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2djlj\" (UniqueName: \"kubernetes.io/projected/f6b8537d-23ab-4c8d-9ca7-b307562baad8-kube-api-access-2djlj\") pod \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.575204 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-combined-ca-bundle\") pod \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.575312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-config-data\") pod \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\" (UID: \"f6b8537d-23ab-4c8d-9ca7-b307562baad8\") "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.575656 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.578557 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880a2246-5ead-4e58-b471-d6d006ee3053" path="/var/lib/kubelet/pods/880a2246-5ead-4e58-b471-d6d006ee3053/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.579773 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda350b1-fe2a-4ed0-b7e1-f9a425076f56" path="/var/lib/kubelet/pods/cda350b1-fe2a-4ed0-b7e1-f9a425076f56/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.580394 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bd4962-9f30-4961-8dcb-7604ef587af6" path="/var/lib/kubelet/pods/f8bd4962-9f30-4961-8dcb-7604ef587af6/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.580925 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf5dc57-49cf-4a6a-97ff-4a946db6823d" path="/var/lib/kubelet/pods/fbf5dc57-49cf-4a6a-97ff-4a946db6823d/volumes"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.593276 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8402e30a-1517-41be-b468-1959c4b7621b" (UID: "8402e30a-1517-41be-b468-1959c4b7621b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.595760 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-config-data" (OuterVolumeSpecName: "config-data") pod "e313e1cc-ed94-4e28-84f8-d053dcffb16a" (UID: "e313e1cc-ed94-4e28-84f8-d053dcffb16a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.631639 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b8537d-23ab-4c8d-9ca7-b307562baad8-kube-api-access-2djlj" (OuterVolumeSpecName: "kube-api-access-2djlj") pod "f6b8537d-23ab-4c8d-9ca7-b307562baad8" (UID: "f6b8537d-23ab-4c8d-9ca7-b307562baad8"). InnerVolumeSpecName "kube-api-access-2djlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.632988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "14fdb1f3-fd7f-4b31-ac34-42438c44720a" (UID: "14fdb1f3-fd7f-4b31-ac34-42438c44720a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.681898 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.682796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shz6f\" (UniqueName: \"kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f\") pod \"keystone-4eba-account-create-update-r4vjh\" (UID: \"40c8d88f-3851-455c-bfad-2e30c7531250\") " pod="openstack/keystone-4eba-account-create-update-r4vjh"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.684002 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.684024 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.684035 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2djlj\" (UniqueName: \"kubernetes.io/projected/f6b8537d-23ab-4c8d-9ca7-b307562baad8-kube-api-access-2djlj\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.684086 4786 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.684745 4786 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.684792 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts podName:40c8d88f-3851-455c-bfad-2e30c7531250 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:24.68477644 +0000 UTC m=+1534.847988251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts") pod "keystone-4eba-account-create-update-r4vjh" (UID: "40c8d88f-3851-455c-bfad-2e30c7531250") : configmap "openstack-scripts" not found
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.692509 4786 projected.go:194] Error preparing data for projected volume kube-api-access-shz6f for pod openstack/keystone-4eba-account-create-update-r4vjh: failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 13 15:28:22 crc kubenswrapper[4786]: E0313 15:28:22.692567 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f podName:40c8d88f-3851-455c-bfad-2e30c7531250 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:24.692552082 +0000 UTC m=+1534.855763893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-shz6f" (UniqueName: "kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f") pod "keystone-4eba-account-create-update-r4vjh" (UID: "40c8d88f-3851-455c-bfad-2e30c7531250") : failed to fetch token: serviceaccounts "galera-openstack" not found
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.718143 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "393ef3eb-1c5f-4a06-a815-fe394d372ee6" (UID: "393ef3eb-1c5f-4a06-a815-fe394d372ee6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.729433 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6bd4f4c6c-zg28d" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.168:9696/\": dial tcp 10.217.0.168:9696: connect: connection refused"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.739988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14fdb1f3-fd7f-4b31-ac34-42438c44720a" (UID: "14fdb1f3-fd7f-4b31-ac34-42438c44720a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.780160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.791892 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e313e1cc-ed94-4e28-84f8-d053dcffb16a" (UID: "e313e1cc-ed94-4e28-84f8-d053dcffb16a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.792292 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.792315 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.792328 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.792342 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.813173 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6b8537d-23ab-4c8d-9ca7-b307562baad8" (UID: "f6b8537d-23ab-4c8d-9ca7-b307562baad8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.827527 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.843988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-config-data" (OuterVolumeSpecName: "config-data") pod "8402e30a-1517-41be-b468-1959c4b7621b" (UID: "8402e30a-1517-41be-b468-1959c4b7621b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.859087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data" (OuterVolumeSpecName: "config-data") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.883288 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70dc1403-e7e9-4200-9a87-e3538a17c350" (UID: "70dc1403-e7e9-4200-9a87-e3538a17c350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.886517 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.894079 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.894116 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.894125 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.894134 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.894142 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.894186 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.896435 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-config-data" (OuterVolumeSpecName: "config-data") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.910998 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.916812 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.935482 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.939000 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-config-data" (OuterVolumeSpecName: "config-data") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.949308 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ee1d973-a40c-4db0-8cc7-1c64ece074ac" (UID: "7ee1d973-a40c-4db0-8cc7-1c64ece074ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.952951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.964085 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.968123 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "14fdb1f3-fd7f-4b31-ac34-42438c44720a" (UID: "14fdb1f3-fd7f-4b31-ac34-42438c44720a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.978698 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8402e30a-1517-41be-b468-1959c4b7621b" (UID: "8402e30a-1517-41be-b468-1959c4b7621b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.979765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e313e1cc-ed94-4e28-84f8-d053dcffb16a" (UID: "e313e1cc-ed94-4e28-84f8-d053dcffb16a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.981424 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-config-data" (OuterVolumeSpecName: "config-data") pod "2c66255a-19d5-4417-bf43-f7f5bfff892a" (UID: "2c66255a-19d5-4417-bf43-f7f5bfff892a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.984647 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8609052d-1ba2-4888-b973-05c8e4663632" (UID: "8609052d-1ba2-4888-b973-05c8e4663632"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.987085 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-config-data" (OuterVolumeSpecName: "config-data") pod "f6b8537d-23ab-4c8d-9ca7-b307562baad8" (UID: "f6b8537d-23ab-4c8d-9ca7-b307562baad8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.996863 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8402e30a-1517-41be-b468-1959c4b7621b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997163 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997292 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997534 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997678 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997759 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997868 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.998004 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b8537d-23ab-4c8d-9ca7-b307562baad8-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.998130 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.998476 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.998660 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee1d973-a40c-4db0-8cc7-1c64ece074ac-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.998804 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.998982 4786 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/14fdb1f3-fd7f-4b31-ac34-42438c44720a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.999099 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609052d-1ba2-4888-b973-05c8e4663632-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:22 crc kubenswrapper[4786]: I0313 15:28:22.997010 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:22.999989 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c66255a-19d5-4417-bf43-f7f5bfff892a" (UID: "2c66255a-19d5-4417-bf43-f7f5bfff892a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.008931 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "393ef3eb-1c5f-4a06-a815-fe394d372ee6" (UID: "393ef3eb-1c5f-4a06-a815-fe394d372ee6"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.012724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e313e1cc-ed94-4e28-84f8-d053dcffb16a" (UID: "e313e1cc-ed94-4e28-84f8-d053dcffb16a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.013128 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "94c381c8-c97e-4159-9bb4-3ede8f12d6e0" (UID: "94c381c8-c97e-4159-9bb4-3ede8f12d6e0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.019953 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data" (OuterVolumeSpecName: "config-data") pod "70dc1403-e7e9-4200-9a87-e3538a17c350" (UID: "70dc1403-e7e9-4200-9a87-e3538a17c350"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.037081 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d567z"
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.039118 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data" (OuterVolumeSpecName: "config-data") pod "54a96a07-f63f-47d9-9191-0548996f01a7" (UID: "54a96a07-f63f-47d9-9191-0548996f01a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.043482 4786 scope.go:117] "RemoveContainer" containerID="72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca"
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.099931 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmf7t\" (UniqueName: \"kubernetes.io/projected/1355c383-567c-4c71-a13c-e46f29dd5f8e-kube-api-access-gmf7t\") pod \"1355c383-567c-4c71-a13c-e46f29dd5f8e\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") "
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.101494 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1355c383-567c-4c71-a13c-e46f29dd5f8e-operator-scripts\") pod \"1355c383-567c-4c71-a13c-e46f29dd5f8e\" (UID: \"1355c383-567c-4c71-a13c-e46f29dd5f8e\") "
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102364 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c66255a-19d5-4417-bf43-f7f5bfff892a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102386 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c381c8-c97e-4159-9bb4-3ede8f12d6e0-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102396 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e313e1cc-ed94-4e28-84f8-d053dcffb16a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102408 4786 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/393ef3eb-1c5f-4a06-a815-fe394d372ee6-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102420 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102431 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dc1403-e7e9-4200-9a87-e3538a17c350-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.102443 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54a96a07-f63f-47d9-9191-0548996f01a7-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.113841 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1355c383-567c-4c71-a13c-e46f29dd5f8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1355c383-567c-4c71-a13c-e46f29dd5f8e" (UID: "1355c383-567c-4c71-a13c-e46f29dd5f8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.116611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1355c383-567c-4c71-a13c-e46f29dd5f8e-kube-api-access-gmf7t" (OuterVolumeSpecName: "kube-api-access-gmf7t") pod "1355c383-567c-4c71-a13c-e46f29dd5f8e" (UID: "1355c383-567c-4c71-a13c-e46f29dd5f8e"). InnerVolumeSpecName "kube-api-access-gmf7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.129573 4786 scope.go:117] "RemoveContainer" containerID="ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87"
Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.130191 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87\": container with ID starting with ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87 not found: ID does not exist" containerID="ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87"
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.130227 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87"} err="failed to get container status \"ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87\": rpc error: code = NotFound desc = could not find container \"ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87\": container with ID starting with ffc765b74f0e2545309d318bebea7c936262e28d63c3c21558ecdae50a0e0e87 not found: ID does not exist"
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.130252 4786 scope.go:117] "RemoveContainer" containerID="72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca"
Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.130494 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca\": container with ID starting with 72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca not found: ID does not exist" containerID="72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca"
Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.130527
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca"} err="failed to get container status \"72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca\": rpc error: code = NotFound desc = could not find container \"72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca\": container with ID starting with 72eb9678e125153c0d4d36bffdd905037a55a19b7f2c825717a0f652767497ca not found: ID does not exist" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.130542 4786 scope.go:117] "RemoveContainer" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.157679 4786 scope.go:117] "RemoveContainer" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.158332 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0\": container with ID starting with 8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0 not found: ID does not exist" containerID="8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.159065 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0"} err="failed to get container status \"8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0\": rpc error: code = NotFound desc = could not find container \"8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0\": container with ID starting with 8f90ca00e24fcc8133f58fb1e269da68fbaa526d3031d30ce657a6dfa7cc21a0 not found: ID does not exist" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 
15:28:23.159145 4786 scope.go:117] "RemoveContainer" containerID="26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.188654 4786 scope.go:117] "RemoveContainer" containerID="1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.198038 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54bc4948fd-47bbp"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.208200 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54bc4948fd-47bbp"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.218174 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.218238 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmf7t\" (UniqueName: \"kubernetes.io/projected/1355c383-567c-4c71-a13c-e46f29dd5f8e-kube-api-access-gmf7t\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.218272 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1355c383-567c-4c71-a13c-e46f29dd5f8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.230459 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.237578 4786 scope.go:117] "RemoveContainer" containerID="26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e" Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.240705 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e\": container with ID starting with 
26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e not found: ID does not exist" containerID="26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.240750 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e"} err="failed to get container status \"26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e\": rpc error: code = NotFound desc = could not find container \"26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e\": container with ID starting with 26d5646fa60a58d3c485a68a118a3f5fe8f4e4ed98f71b226347d1357beacf6e not found: ID does not exist" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.240783 4786 scope.go:117] "RemoveContainer" containerID="1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46" Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.244799 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46\": container with ID starting with 1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46 not found: ID does not exist" containerID="1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.244837 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46"} err="failed to get container status \"1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46\": rpc error: code = NotFound desc = could not find container \"1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46\": container with ID starting with 1e6b390b1d5f9ee8919207045ec17d5660a597a4337a3c8b6959e5620e08ad46 not found: ID does not 
exist" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.244884 4786 scope.go:117] "RemoveContainer" containerID="15848ce920e90b7648bab5e64f68ab325ba1bd7a143ec84d35fc3e9aa2a6e33f" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.248774 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.265593 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.279652 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.287277 4786 scope.go:117] "RemoveContainer" containerID="18ddb14c07d08e39a714e8241c91bc621e8b9460fed525911386dabe3a845484" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.295504 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.303666 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7579f6547f-hnpzx"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.312925 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7579f6547f-hnpzx"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.317717 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.326568 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.331785 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.337324 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 
15:28:23.340678 4786 scope.go:117] "RemoveContainer" containerID="aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.365208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d567z" event={"ID":"1355c383-567c-4c71-a13c-e46f29dd5f8e","Type":"ContainerDied","Data":"02604f20799a709924ba9b2526fa3f3896d95e874833831f5f069013df8e6487"} Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.365298 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d567z" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.378495 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6b8537d-23ab-4c8d-9ca7-b307562baad8","Type":"ContainerDied","Data":"7cc9c14bbb0fea5ff50b76a0016e4862b7b9658f981a4a53d341159aa0d5ccb3"} Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.378568 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.384555 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.384587 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.384560 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.385073 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.406322 4786 scope.go:117] "RemoveContainer" containerID="c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.407450 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4eba-account-create-update-r4vjh" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.437093 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.448265 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.449059 4786 scope.go:117] "RemoveContainer" containerID="aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27" Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.449431 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27\": container with ID starting with aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27 not found: ID does not exist" containerID="aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.449458 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27"} err="failed to get container status \"aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27\": rpc error: code = NotFound desc = could not find container \"aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27\": container with ID starting with aace78206ad7421af7f5e6f6b90bb170ccd21620dc1732420c979a73a9ed8c27 not found: ID does not exist" Mar 13 15:28:23 crc 
kubenswrapper[4786]: I0313 15:28:23.449476 4786 scope.go:117] "RemoveContainer" containerID="c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13" Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.451598 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13\": container with ID starting with c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13 not found: ID does not exist" containerID="c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.451628 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13"} err="failed to get container status \"c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13\": rpc error: code = NotFound desc = could not find container \"c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13\": container with ID starting with c56205ed19f8909500c14c8227b6015aac0e6daa4ca2915c6be8220e1bcaef13 not found: ID does not exist" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.451643 4786 scope.go:117] "RemoveContainer" containerID="ec868644683e7766c12f25db8cd06a31599be654767ec3aee3347fee4d48ad32" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.483278 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.505956 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.523845 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.533015 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.570984 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d567z"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.576229 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d567z"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.583270 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.588313 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.625399 4786 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:23 crc kubenswrapper[4786]: E0313 15:28:23.625471 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data podName:65e5ca7c-1c5e-4f9e-85df-a92feaeddb43 nodeName:}" failed. No retries permitted until 2026-03-13 15:28:31.625453139 +0000 UTC m=+1541.788664950 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data") pod "rabbitmq-cell1-server-0" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43") : configmap "rabbitmq-cell1-config-data" not found Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.880746 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.928954 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-config-data\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929023 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-plugins\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929288 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-erlang-cookie\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929370 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-pod-info\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929416 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-plugins-conf\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929434 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-tls\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929455 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-server-conf\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929504 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-erlang-cookie-secret\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929556 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhvv\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-kube-api-access-tfhvv\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.929585 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-confd\") pod \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\" (UID: \"f964a2e6-aad3-42c0-8290-c3aa52d99e5b\") " Mar 13 15:28:23 crc kubenswrapper[4786]: 
I0313 15:28:23.930939 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.932539 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.932548 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.935241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.936187 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.938178 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-kube-api-access-tfhvv" (OuterVolumeSpecName: "kube-api-access-tfhvv") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "kube-api-access-tfhvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.938315 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.940232 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-pod-info" (OuterVolumeSpecName: "pod-info") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.974441 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-config-data" (OuterVolumeSpecName: "config-data") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:23 crc kubenswrapper[4786]: I0313 15:28:23.978571 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-server-conf" (OuterVolumeSpecName: "server-conf") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.009579 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f964a2e6-aad3-42c0-8290-c3aa52d99e5b" (UID: "f964a2e6-aad3-42c0-8290-c3aa52d99e5b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032220 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032254 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfhvv\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-kube-api-access-tfhvv\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032266 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032275 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032284 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032293 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032327 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 13 15:28:24 crc kubenswrapper[4786]: 
I0313 15:28:24.032335 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032344 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032352 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.032359 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f964a2e6-aad3-42c0-8290-c3aa52d99e5b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.047639 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.134298 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.395195 4786 generic.go:334] "Generic (PLEG): container finished" podID="9f459571-980e-439d-9dc2-72c0461a20c9" containerID="9ac0d8df2995d1804d0af60083262cacb5da80d9832a9709ff3c01990e8e9cea" exitCode=0 Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.395257 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd8dddbcb-wmbct" 
event={"ID":"9f459571-980e-439d-9dc2-72c0461a20c9","Type":"ContainerDied","Data":"9ac0d8df2995d1804d0af60083262cacb5da80d9832a9709ff3c01990e8e9cea"}
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.396145 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="3e362102-0c50-415e-8108-82eb18632381" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.204:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.397735 4786 generic.go:334] "Generic (PLEG): container finished" podID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerID="542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e" exitCode=0
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.397780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f964a2e6-aad3-42c0-8290-c3aa52d99e5b","Type":"ContainerDied","Data":"542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e"}
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.397799 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f964a2e6-aad3-42c0-8290-c3aa52d99e5b","Type":"ContainerDied","Data":"ceb6568d7895ac9e87026451c3bde662d7b26342da0acb09c93120f9f42f2c20"}
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.397816 4786 scope.go:117] "RemoveContainer" containerID="542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.397929 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.404913 4786 generic.go:334] "Generic (PLEG): container finished" podID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerID="789307556dd54b21497583b90b06c0b5ce70e7eed63ba1acf314c8edc36e15af" exitCode=0
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.404958 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43","Type":"ContainerDied","Data":"789307556dd54b21497583b90b06c0b5ce70e7eed63ba1acf314c8edc36e15af"}
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.407246 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4eba-account-create-update-r4vjh"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.432296 4786 scope.go:117] "RemoveContainer" containerID="8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.487124 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4eba-account-create-update-r4vjh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.494796 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4eba-account-create-update-r4vjh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.501937 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.505651 4786 scope.go:117] "RemoveContainer" containerID="542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e"
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.506083 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e\": container with ID starting with 542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e not found: ID does not exist" containerID="542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.506112 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e"} err="failed to get container status \"542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e\": rpc error: code = NotFound desc = could not find container \"542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e\": container with ID starting with 542e4b17917fb11a0c77e0ccaaed449c0d7cf2b6d8aecdc3106441a44af3441e not found: ID does not exist"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.506135 4786 scope.go:117] "RemoveContainer" containerID="8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d"
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.506354 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d\": container with ID starting with 8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d not found: ID does not exist" containerID="8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.506374 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d"} err="failed to get container status \"8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d\": rpc error: code = NotFound desc = could not find container \"8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d\": container with ID starting with 8f42317caf6e81123841d8558130711c503b63291705d83570e4e4d22490817d not found: ID does not exist"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.508987 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.568096 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1355c383-567c-4c71-a13c-e46f29dd5f8e" path="/var/lib/kubelet/pods/1355c383-567c-4c71-a13c-e46f29dd5f8e/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.569044 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14fdb1f3-fd7f-4b31-ac34-42438c44720a" path="/var/lib/kubelet/pods/14fdb1f3-fd7f-4b31-ac34-42438c44720a/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.570282 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c66255a-19d5-4417-bf43-f7f5bfff892a" path="/var/lib/kubelet/pods/2c66255a-19d5-4417-bf43-f7f5bfff892a/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.572164 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393ef3eb-1c5f-4a06-a815-fe394d372ee6" path="/var/lib/kubelet/pods/393ef3eb-1c5f-4a06-a815-fe394d372ee6/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.572882 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c8d88f-3851-455c-bfad-2e30c7531250" path="/var/lib/kubelet/pods/40c8d88f-3851-455c-bfad-2e30c7531250/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.573617 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" path="/var/lib/kubelet/pods/54a96a07-f63f-47d9-9191-0548996f01a7/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.574884 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" path="/var/lib/kubelet/pods/70dc1403-e7e9-4200-9a87-e3538a17c350/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.576345 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" path="/var/lib/kubelet/pods/7ee1d973-a40c-4db0-8cc7-1c64ece074ac/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.577158 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8402e30a-1517-41be-b468-1959c4b7621b" path="/var/lib/kubelet/pods/8402e30a-1517-41be-b468-1959c4b7621b/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.578779 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8609052d-1ba2-4888-b973-05c8e4663632" path="/var/lib/kubelet/pods/8609052d-1ba2-4888-b973-05c8e4663632/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.579614 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" path="/var/lib/kubelet/pods/94c381c8-c97e-4159-9bb4-3ede8f12d6e0/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.580372 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" path="/var/lib/kubelet/pods/e313e1cc-ed94-4e28-84f8-d053dcffb16a/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.581695 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" path="/var/lib/kubelet/pods/f6b8537d-23ab-4c8d-9ca7-b307562baad8/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.582627 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" path="/var/lib/kubelet/pods/f964a2e6-aad3-42c0-8290-c3aa52d99e5b/volumes"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.658166 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c8d88f-3851-455c-bfad-2e30c7531250-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.658193 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shz6f\" (UniqueName: \"kubernetes.io/projected/40c8d88f-3851-455c-bfad-2e30c7531250-kube-api-access-shz6f\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.730421 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.738974 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bd8dddbcb-wmbct"
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.761786 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.762165 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.762486 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.762522 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server"
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.763002 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.764206 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.765566 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:24 crc kubenswrapper[4786]: E0313 15:28:24.765604 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.861495 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-server-conf\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.861542 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-config-data\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.861600 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-erlang-cookie-secret\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862124 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-pod-info\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862160 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-tls\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862187 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862209 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-combined-ca-bundle\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862227 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-confd\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862262 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-public-tls-certs\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-credential-keys\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862310 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-fernet-keys\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862326 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-plugins-conf\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862346 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-plugins\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgfxm\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-kube-api-access-sgfxm\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862378 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-internal-tls-certs\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862403 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5td5\" (UniqueName: \"kubernetes.io/projected/9f459571-980e-439d-9dc2-72c0461a20c9-kube-api-access-q5td5\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-scripts\") pod \"9f459571-980e-439d-9dc2-72c0461a20c9\" (UID: \"9f459571-980e-439d-9dc2-72c0461a20c9\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862458 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.862509 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-erlang-cookie\") pod \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\" (UID: \"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43\") "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.863295 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.867078 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.870179 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-pod-info" (OuterVolumeSpecName: "pod-info") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.871159 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-scripts" (OuterVolumeSpecName: "scripts") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.872837 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-kube-api-access-sgfxm" (OuterVolumeSpecName: "kube-api-access-sgfxm") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "kube-api-access-sgfxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.873296 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.873731 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.875570 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.876540 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.877374 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f459571-980e-439d-9dc2-72c0461a20c9-kube-api-access-q5td5" (OuterVolumeSpecName: "kube-api-access-q5td5") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "kube-api-access-q5td5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.883106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.889379 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.918465 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data" (OuterVolumeSpecName: "config-data") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.930929 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-config-data" (OuterVolumeSpecName: "config-data") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.943075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.952794 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.958643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-server-conf" (OuterVolumeSpecName: "server-conf") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964786 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964822 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgfxm\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-kube-api-access-sgfxm\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964843 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964859 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964870 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5td5\" (UniqueName: \"kubernetes.io/projected/9f459571-980e-439d-9dc2-72c0461a20c9-kube-api-access-q5td5\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964893 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964905 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964915 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-server-conf\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964925 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964937 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964946 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-pod-info\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.964957 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.965100 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.965115 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.965125 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.965133 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.965143 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.968747 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f459571-980e-439d-9dc2-72c0461a20c9" (UID: "9f459571-980e-439d-9dc2-72c0461a20c9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.978614 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.981850 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 13 15:28:24 crc kubenswrapper[4786]: I0313 15:28:24.985891 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.029127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" (UID: "65e5ca7c-1c5e-4f9e-85df-a92feaeddb43"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066532 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-run-httpd\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-combined-ca-bundle\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066621 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-ceilometer-tls-certs\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066636 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-log-httpd\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066667 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-config-data\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066689 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-sg-core-conf-yaml\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066712 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-combined-ca-bundle\") pod \"31158646-2c0c-4098-bd3e-ea307fa78716\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066735 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xnb7\" (UniqueName: \"kubernetes.io/projected/31158646-2c0c-4098-bd3e-ea307fa78716-kube-api-access-9xnb7\") pod \"31158646-2c0c-4098-bd3e-ea307fa78716\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-scripts\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066787 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xvwd\" (UniqueName: \"kubernetes.io/projected/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-kube-api-access-7xvwd\") pod \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\" (UID: \"2dc45915-09df-4248-8cb8-c7b11d1e4a4c\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066809 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31158646-2c0c-4098-bd3e-ea307fa78716-logs\") pod \"31158646-2c0c-4098-bd3e-ea307fa78716\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066826 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data-custom\") pod \"31158646-2c0c-4098-bd3e-ea307fa78716\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066860 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data\") pod \"31158646-2c0c-4098-bd3e-ea307fa78716\" (UID: \"31158646-2c0c-4098-bd3e-ea307fa78716\") "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.066888 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.067162 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.067178 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.067187 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f459571-980e-439d-9dc2-72c0461a20c9-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.067196 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.067693 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31158646-2c0c-4098-bd3e-ea307fa78716-logs" (OuterVolumeSpecName: "logs") pod "31158646-2c0c-4098-bd3e-ea307fa78716" (UID: "31158646-2c0c-4098-bd3e-ea307fa78716"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.069066 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "log-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.069998 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-kube-api-access-7xvwd" (OuterVolumeSpecName: "kube-api-access-7xvwd") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "kube-api-access-7xvwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.070655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31158646-2c0c-4098-bd3e-ea307fa78716-kube-api-access-9xnb7" (OuterVolumeSpecName: "kube-api-access-9xnb7") pod "31158646-2c0c-4098-bd3e-ea307fa78716" (UID: "31158646-2c0c-4098-bd3e-ea307fa78716"). InnerVolumeSpecName "kube-api-access-9xnb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.074190 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-scripts" (OuterVolumeSpecName: "scripts") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.074211 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "31158646-2c0c-4098-bd3e-ea307fa78716" (UID: "31158646-2c0c-4098-bd3e-ea307fa78716"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.091421 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31158646-2c0c-4098-bd3e-ea307fa78716" (UID: "31158646-2c0c-4098-bd3e-ea307fa78716"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.094041 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.101873 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data" (OuterVolumeSpecName: "config-data") pod "31158646-2c0c-4098-bd3e-ea307fa78716" (UID: "31158646-2c0c-4098-bd3e-ea307fa78716"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.109830 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.121873 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.134532 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-config-data" (OuterVolumeSpecName: "config-data") pod "2dc45915-09df-4248-8cb8-c7b11d1e4a4c" (UID: "2dc45915-09df-4248-8cb8-c7b11d1e4a4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168437 4786 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168470 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168481 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168492 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 
15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168504 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168516 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xnb7\" (UniqueName: \"kubernetes.io/projected/31158646-2c0c-4098-bd3e-ea307fa78716-kube-api-access-9xnb7\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168528 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168537 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xvwd\" (UniqueName: \"kubernetes.io/projected/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-kube-api-access-7xvwd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168548 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31158646-2c0c-4098-bd3e-ea307fa78716-logs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168558 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168568 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31158646-2c0c-4098-bd3e-ea307fa78716-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.168578 4786 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc45915-09df-4248-8cb8-c7b11d1e4a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.421257 4786 generic.go:334] "Generic (PLEG): container finished" podID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerID="ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86" exitCode=0 Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.421343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74e7c2f9-5486-4c21-a0b7-07c81d85a24c","Type":"ContainerDied","Data":"ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.422816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd8dddbcb-wmbct" event={"ID":"9f459571-980e-439d-9dc2-72c0461a20c9","Type":"ContainerDied","Data":"cfa37fbb41c94d30c55519c33ef449409328ab30bb5ccdb5d0c677e069ab5d33"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.422840 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bd8dddbcb-wmbct" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.422861 4786 scope.go:117] "RemoveContainer" containerID="9ac0d8df2995d1804d0af60083262cacb5da80d9832a9709ff3c01990e8e9cea" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.437475 4786 generic.go:334] "Generic (PLEG): container finished" podID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerID="06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab" exitCode=0 Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.437537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerDied","Data":"06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.437563 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2dc45915-09df-4248-8cb8-c7b11d1e4a4c","Type":"ContainerDied","Data":"a1cf8b43df8175438aea1134c65a48b7c00f19499e90bd1a1e5b5367465b5c45"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.437655 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.441084 4786 generic.go:334] "Generic (PLEG): container finished" podID="31158646-2c0c-4098-bd3e-ea307fa78716" containerID="3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f" exitCode=0 Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.441126 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" event={"ID":"31158646-2c0c-4098-bd3e-ea307fa78716","Type":"ContainerDied","Data":"3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.441143 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" event={"ID":"31158646-2c0c-4098-bd3e-ea307fa78716","Type":"ContainerDied","Data":"1627c51a27d60b2c85430bf40f8807202c1ec2d2b8cc26ed99cdfaf0fb58beed"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.441193 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-57d8bd5bb-fsm9r" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.452015 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7bd8dddbcb-wmbct"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.457335 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7bd8dddbcb-wmbct"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.464088 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65e5ca7c-1c5e-4f9e-85df-a92feaeddb43","Type":"ContainerDied","Data":"54846a642fd59e535935d429d2a622eb8c50104653f9ccfb1950cc77bd37e558"} Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.464178 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.484403 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.494066 4786 scope.go:117] "RemoveContainer" containerID="c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.521628 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-57d8bd5bb-fsm9r"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.524230 4786 scope.go:117] "RemoveContainer" containerID="48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.532323 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.538417 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.542760 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.548954 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.551197 4786 scope.go:117] "RemoveContainer" containerID="06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.568736 4786 scope.go:117] "RemoveContainer" containerID="2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.603387 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.614147 4786 scope.go:117] "RemoveContainer" containerID="c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975" Mar 13 15:28:25 crc kubenswrapper[4786]: E0313 15:28:25.614634 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975\": container with ID starting with c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975 not found: ID does not exist" containerID="c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.614668 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975"} err="failed to get container status \"c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975\": rpc error: code = NotFound desc = could not find container \"c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975\": container with ID starting with c617cf40328d897823f3cda9b231f7f3e30d46af7071cfc70c25ecd9a1ad4975 not found: ID does not exist" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.614692 4786 scope.go:117] "RemoveContainer" containerID="48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb" Mar 13 15:28:25 crc kubenswrapper[4786]: E0313 15:28:25.615023 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb\": container with ID starting with 48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb not found: ID does not exist" containerID="48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 
15:28:25.615049 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb"} err="failed to get container status \"48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb\": rpc error: code = NotFound desc = could not find container \"48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb\": container with ID starting with 48922808679b0a223f8aacc2978e974d039c66edc9f62b1910b9c4de0f1f05cb not found: ID does not exist" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.615066 4786 scope.go:117] "RemoveContainer" containerID="06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab" Mar 13 15:28:25 crc kubenswrapper[4786]: E0313 15:28:25.615306 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab\": container with ID starting with 06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab not found: ID does not exist" containerID="06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.615331 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab"} err="failed to get container status \"06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab\": rpc error: code = NotFound desc = could not find container \"06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab\": container with ID starting with 06705329f1b83d42995a9e74bea384d6ebbdf7334e10052b1925de30cef789ab not found: ID does not exist" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.615345 4786 scope.go:117] "RemoveContainer" containerID="2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388" Mar 13 15:28:25 crc 
kubenswrapper[4786]: E0313 15:28:25.615542 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388\": container with ID starting with 2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388 not found: ID does not exist" containerID="2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.615558 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388"} err="failed to get container status \"2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388\": rpc error: code = NotFound desc = could not find container \"2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388\": container with ID starting with 2958930e78c38029c5abdc5da115f22287be625e06cbad990931b456fd2e2388 not found: ID does not exist" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.615567 4786 scope.go:117] "RemoveContainer" containerID="3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695370 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-default\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695425 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-operator-scripts\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 
15:28:25.695495 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-galera-tls-certs\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695515 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kolla-config\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695540 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lfzd\" (UniqueName: \"kubernetes.io/projected/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kube-api-access-5lfzd\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695613 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-generated\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695657 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-combined-ca-bundle\") pod \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.695681 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\" (UID: \"74e7c2f9-5486-4c21-a0b7-07c81d85a24c\") " Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.696398 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.696690 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.696749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.696771 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.699311 4786 scope.go:117] "RemoveContainer" containerID="fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.701590 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kube-api-access-5lfzd" (OuterVolumeSpecName: "kube-api-access-5lfzd") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "kube-api-access-5lfzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.706876 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.718542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.741368 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "74e7c2f9-5486-4c21-a0b7-07c81d85a24c" (UID: "74e7c2f9-5486-4c21-a0b7-07c81d85a24c"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.794392 4786 scope.go:117] "RemoveContainer" containerID="3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f"
Mar 13 15:28:25 crc kubenswrapper[4786]: E0313 15:28:25.795139 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f\": container with ID starting with 3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f not found: ID does not exist" containerID="3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.795183 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f"} err="failed to get container status \"3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f\": rpc error: code = NotFound desc = could not find container \"3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f\": container with ID starting with 3297e494b73163f06288f2cfd08fb16bd5dc333261826903758e08c71c0cee0f not found: ID does not exist"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.795207 4786 scope.go:117] "RemoveContainer" containerID="fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e"
Mar 13 15:28:25 crc kubenswrapper[4786]: E0313 15:28:25.796137 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e\": container with ID starting with fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e not found: ID does not exist" containerID="fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.796158 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e"} err="failed to get container status \"fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e\": rpc error: code = NotFound desc = could not find container \"fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e\": container with ID starting with fac111eb78ea95dcb6e84b90d8088b5c6ccf6445cdd0c2b0e1838af83a7f051e not found: ID does not exist"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.796171 4786 scope.go:117] "RemoveContainer" containerID="789307556dd54b21497583b90b06c0b5ce70e7eed63ba1acf314c8edc36e15af"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797630 4786 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797651 4786 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797661 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lfzd\" (UniqueName: \"kubernetes.io/projected/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-kube-api-access-5lfzd\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797672 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797679 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797706 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797714 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.797723 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74e7c2f9-5486-4c21-a0b7-07c81d85a24c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.810898 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.820968 4786 scope.go:117] "RemoveContainer" containerID="4c81517ba6f8b5efac24e0644f61c61845b3759183639397f2234090a8627707"
Mar 13 15:28:25 crc kubenswrapper[4786]: I0313 15:28:25.899674 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.476601 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"74e7c2f9-5486-4c21-a0b7-07c81d85a24c","Type":"ContainerDied","Data":"e4989570687deeb4e766b59c8004ec3ece9c7569eab2a71136649278bd41bb95"}
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.476668 4786 scope.go:117] "RemoveContainer" containerID="ef204212cb679a6a1f4f8cb961bcf40917c2172c42b199b0070296a420877d86"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.476620 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.513089 4786 scope.go:117] "RemoveContainer" containerID="4f360270643133e3341091a9416bbc2ba46f23a4951802ac45c190664297de77"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.525516 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.536365 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.572762 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" path="/var/lib/kubelet/pods/2dc45915-09df-4248-8cb8-c7b11d1e4a4c/volumes"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.573662 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" path="/var/lib/kubelet/pods/31158646-2c0c-4098-bd3e-ea307fa78716/volumes"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.575108 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" path="/var/lib/kubelet/pods/65e5ca7c-1c5e-4f9e-85df-a92feaeddb43/volumes"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.575942 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" path="/var/lib/kubelet/pods/74e7c2f9-5486-4c21-a0b7-07c81d85a24c/volumes"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.576570 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f459571-980e-439d-9dc2-72c0461a20c9" path="/var/lib/kubelet/pods/9f459571-980e-439d-9dc2-72c0461a20c9/volumes"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.677051 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54bc4948fd-47bbp" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:28:26 crc kubenswrapper[4786]: I0313 15:28:26.677103 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54bc4948fd-47bbp" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.770055 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.770202 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.770936 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.771367 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.771442 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server"
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.772292 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.773866 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:29 crc kubenswrapper[4786]: E0313 15:28:29.773933 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd"
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.759875 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.761564 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.762041 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.762465 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.762612 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server"
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.763160 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.764626 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:34 crc kubenswrapper[4786]: E0313 15:28:34.764665 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.554436 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd4f4c6c-zg28d"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.592148 4786 generic.go:334] "Generic (PLEG): container finished" podID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerID="92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060" exitCode=0
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.592199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd4f4c6c-zg28d" event={"ID":"21658ad3-b8e8-4743-b2c7-da4782850abc","Type":"ContainerDied","Data":"92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060"}
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.592227 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd4f4c6c-zg28d" event={"ID":"21658ad3-b8e8-4743-b2c7-da4782850abc","Type":"ContainerDied","Data":"f35831a82cadd9eced7b3e06ee89906804990645daeb4e203ee4130dad3126e1"}
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.592201 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd4f4c6c-zg28d"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.592247 4786 scope.go:117] "RemoveContainer" containerID="84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.624576 4786 scope.go:117] "RemoveContainer" containerID="92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.643450 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-public-tls-certs\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.643781 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-httpd-config\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.643856 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-combined-ca-bundle\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.643947 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-config\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.643976 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-ovndb-tls-certs\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.644015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdlvj\" (UniqueName: \"kubernetes.io/projected/21658ad3-b8e8-4743-b2c7-da4782850abc-kube-api-access-fdlvj\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.644037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-internal-tls-certs\") pod \"21658ad3-b8e8-4743-b2c7-da4782850abc\" (UID: \"21658ad3-b8e8-4743-b2c7-da4782850abc\") "
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.646081 4786 scope.go:117] "RemoveContainer" containerID="84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0"
Mar 13 15:28:35 crc kubenswrapper[4786]: E0313 15:28:35.647138 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0\": container with ID starting with 84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0 not found: ID does not exist" containerID="84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.647181 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0"} err="failed to get container status \"84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0\": rpc error: code = NotFound desc = could not find container \"84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0\": container with ID starting with 84aa180931454fb86cd5e3fe68270014c00e81f933dd960c42a2426faf3e7df0 not found: ID does not exist"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.647207 4786 scope.go:117] "RemoveContainer" containerID="92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060"
Mar 13 15:28:35 crc kubenswrapper[4786]: E0313 15:28:35.649183 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060\": container with ID starting with 92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060 not found: ID does not exist" containerID="92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.649221 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060"} err="failed to get container status \"92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060\": rpc error: code = NotFound desc = could not find container \"92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060\": container with ID starting with 92fe4c1789bea3645fd66ef657722eccbdbb48bfa5fd0034b392f5191f60c060 not found: ID does not exist"
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.649526 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.661070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21658ad3-b8e8-4743-b2c7-da4782850abc-kube-api-access-fdlvj" (OuterVolumeSpecName: "kube-api-access-fdlvj") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "kube-api-access-fdlvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.680129 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.681879 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.682186 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-config" (OuterVolumeSpecName: "config") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.697181 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.699631 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "21658ad3-b8e8-4743-b2c7-da4782850abc" (UID: "21658ad3-b8e8-4743-b2c7-da4782850abc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746103 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746153 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdlvj\" (UniqueName: \"kubernetes.io/projected/21658ad3-b8e8-4743-b2c7-da4782850abc-kube-api-access-fdlvj\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746167 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746183 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746196 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746207 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.746219 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/21658ad3-b8e8-4743-b2c7-da4782850abc-config\") on node \"crc\" DevicePath \"\""
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.927092 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd4f4c6c-zg28d"]
Mar 13 15:28:35 crc kubenswrapper[4786]: I0313 15:28:35.931780 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bd4f4c6c-zg28d"]
Mar 13 15:28:36 crc kubenswrapper[4786]: I0313 15:28:36.561620 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" path="/var/lib/kubelet/pods/21658ad3-b8e8-4743-b2c7-da4782850abc/volumes"
Mar 13 15:28:37 crc kubenswrapper[4786]: I0313 15:28:37.868602 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:28:37 crc kubenswrapper[4786]: I0313 15:28:37.869063 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:28:37 crc kubenswrapper[4786]: I0313 15:28:37.869166 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:28:37 crc kubenswrapper[4786]: I0313 15:28:37.870555 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5ef85bd6fdd298e112745a45310d0fedfe424a8161d80698615752d010dc319"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:28:37 crc kubenswrapper[4786]: I0313 15:28:37.870693 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://a5ef85bd6fdd298e112745a45310d0fedfe424a8161d80698615752d010dc319" gracePeriod=600
Mar 13 15:28:38 crc kubenswrapper[4786]: I0313 15:28:38.635646 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="a5ef85bd6fdd298e112745a45310d0fedfe424a8161d80698615752d010dc319" exitCode=0
Mar 13 15:28:38 crc kubenswrapper[4786]: I0313 15:28:38.635702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"a5ef85bd6fdd298e112745a45310d0fedfe424a8161d80698615752d010dc319"}
Mar 13 15:28:38 crc kubenswrapper[4786]: I0313 15:28:38.637281 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"}
Mar 13 15:28:38 crc kubenswrapper[4786]: I0313 15:28:38.637414 4786 scope.go:117] "RemoveContainer" containerID="cedf575f572a8d2fa7d4acff7bfb9c6086d44a2d58bd68733b103bae3b833d49"
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.763224 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.764204 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.764729 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.764687 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.764791 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server"
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.767260 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.769163 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:39 crc kubenswrapper[4786]: E0313 15:28:39.769249 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd"
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.759668 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.761741 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.762915 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.763417 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.763464 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.763543 4786 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server"
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.766307 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 13 15:28:44 crc kubenswrapper[4786]: E0313 15:28:44.766454 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2hb98" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd"
Mar 13 15:28:46 crc kubenswrapper[4786]: I0313 15:28:46.757225 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2hb98_ee4ee4a6-b86a-454b-8952-6a0f16ce6353/ovs-vswitchd/0.log"
Mar 13 15:28:46 crc kubenswrapper[4786]: I0313 15:28:46.758505 4786 generic.go:334] "Generic (PLEG): container finished" podID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" exitCode=137
Mar 13 15:28:46 crc kubenswrapper[4786]: I0313 15:28:46.758554 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerDied","Data":"5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5"}
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.015838 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2hb98_ee4ee4a6-b86a-454b-8952-6a0f16ce6353/ovs-vswitchd/0.log"
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.016765 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2hb98"
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.128933 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-log\") pod \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") "
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129022 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-etc-ovs\") pod \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") "
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129220 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "ee4ee4a6-b86a-454b-8952-6a0f16ce6353" (UID: "ee4ee4a6-b86a-454b-8952-6a0f16ce6353"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129263 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-log" (OuterVolumeSpecName: "var-log") pod "ee4ee4a6-b86a-454b-8952-6a0f16ce6353" (UID: "ee4ee4a6-b86a-454b-8952-6a0f16ce6353").
InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129289 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-lib\") pod \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129334 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l4d7\" (UniqueName: \"kubernetes.io/projected/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-kube-api-access-9l4d7\") pod \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129356 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-run\") pod \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-scripts\") pod \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\" (UID: \"ee4ee4a6-b86a-454b-8952-6a0f16ce6353\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129796 4786 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.129822 4786 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 
crc kubenswrapper[4786]: I0313 15:28:47.131044 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-scripts" (OuterVolumeSpecName: "scripts") pod "ee4ee4a6-b86a-454b-8952-6a0f16ce6353" (UID: "ee4ee4a6-b86a-454b-8952-6a0f16ce6353"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.131086 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-lib" (OuterVolumeSpecName: "var-lib") pod "ee4ee4a6-b86a-454b-8952-6a0f16ce6353" (UID: "ee4ee4a6-b86a-454b-8952-6a0f16ce6353"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.131453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-run" (OuterVolumeSpecName: "var-run") pod "ee4ee4a6-b86a-454b-8952-6a0f16ce6353" (UID: "ee4ee4a6-b86a-454b-8952-6a0f16ce6353"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.141425 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-kube-api-access-9l4d7" (OuterVolumeSpecName: "kube-api-access-9l4d7") pod "ee4ee4a6-b86a-454b-8952-6a0f16ce6353" (UID: "ee4ee4a6-b86a-454b-8952-6a0f16ce6353"). InnerVolumeSpecName "kube-api-access-9l4d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.225433 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.232216 4786 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-lib\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.232272 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l4d7\" (UniqueName: \"kubernetes.io/projected/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-kube-api-access-9l4d7\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.232284 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.232292 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee4ee4a6-b86a-454b-8952-6a0f16ce6353-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.333332 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e503bc45-db60-4bc8-bb97-3472d2456fdb\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.333477 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e503bc45-db60-4bc8-bb97-3472d2456fdb-combined-ca-bundle\") pod \"e503bc45-db60-4bc8-bb97-3472d2456fdb\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.333567 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvbr8\" (UniqueName: 
\"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-kube-api-access-jvbr8\") pod \"e503bc45-db60-4bc8-bb97-3472d2456fdb\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.333735 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-lock\") pod \"e503bc45-db60-4bc8-bb97-3472d2456fdb\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.334338 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-lock" (OuterVolumeSpecName: "lock") pod "e503bc45-db60-4bc8-bb97-3472d2456fdb" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.334591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-cache\") pod \"e503bc45-db60-4bc8-bb97-3472d2456fdb\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.335098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") pod \"e503bc45-db60-4bc8-bb97-3472d2456fdb\" (UID: \"e503bc45-db60-4bc8-bb97-3472d2456fdb\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.335037 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-cache" (OuterVolumeSpecName: "cache") pod "e503bc45-db60-4bc8-bb97-3472d2456fdb" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.335907 4786 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-lock\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.335994 4786 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e503bc45-db60-4bc8-bb97-3472d2456fdb-cache\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.337500 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-kube-api-access-jvbr8" (OuterVolumeSpecName: "kube-api-access-jvbr8") pod "e503bc45-db60-4bc8-bb97-3472d2456fdb" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb"). InnerVolumeSpecName "kube-api-access-jvbr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.337670 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "e503bc45-db60-4bc8-bb97-3472d2456fdb" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.339061 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e503bc45-db60-4bc8-bb97-3472d2456fdb" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.437088 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.437127 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvbr8\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-kube-api-access-jvbr8\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.437139 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e503bc45-db60-4bc8-bb97-3472d2456fdb-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.451751 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.554410 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.663297 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e503bc45-db60-4bc8-bb97-3472d2456fdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e503bc45-db60-4bc8-bb97-3472d2456fdb" (UID: "e503bc45-db60-4bc8-bb97-3472d2456fdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.677183 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-scripts\") pod \"82f2e6fd-58ee-4002-b167-096b3b715233\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758476 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4vh6\" (UniqueName: \"kubernetes.io/projected/82f2e6fd-58ee-4002-b167-096b3b715233-kube-api-access-n4vh6\") pod \"82f2e6fd-58ee-4002-b167-096b3b715233\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758528 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82f2e6fd-58ee-4002-b167-096b3b715233-etc-machine-id\") pod \"82f2e6fd-58ee-4002-b167-096b3b715233\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data-custom\") pod \"82f2e6fd-58ee-4002-b167-096b3b715233\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758780 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82f2e6fd-58ee-4002-b167-096b3b715233-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "82f2e6fd-58ee-4002-b167-096b3b715233" (UID: "82f2e6fd-58ee-4002-b167-096b3b715233"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758794 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-combined-ca-bundle\") pod \"82f2e6fd-58ee-4002-b167-096b3b715233\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.758917 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data\") pod \"82f2e6fd-58ee-4002-b167-096b3b715233\" (UID: \"82f2e6fd-58ee-4002-b167-096b3b715233\") " Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.759315 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82f2e6fd-58ee-4002-b167-096b3b715233-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.759332 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e503bc45-db60-4bc8-bb97-3472d2456fdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.761273 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-scripts" (OuterVolumeSpecName: "scripts") pod "82f2e6fd-58ee-4002-b167-096b3b715233" (UID: "82f2e6fd-58ee-4002-b167-096b3b715233"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.761439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f2e6fd-58ee-4002-b167-096b3b715233-kube-api-access-n4vh6" (OuterVolumeSpecName: "kube-api-access-n4vh6") pod "82f2e6fd-58ee-4002-b167-096b3b715233" (UID: "82f2e6fd-58ee-4002-b167-096b3b715233"). InnerVolumeSpecName "kube-api-access-n4vh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.761804 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82f2e6fd-58ee-4002-b167-096b3b715233" (UID: "82f2e6fd-58ee-4002-b167-096b3b715233"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.773332 4786 generic.go:334] "Generic (PLEG): container finished" podID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerID="a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db" exitCode=137 Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.773372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db"} Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.773687 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e503bc45-db60-4bc8-bb97-3472d2456fdb","Type":"ContainerDied","Data":"69b0fc969286be40ea887307f460354e7cbd0ad76bfe68859e2fcdfa3007227f"} Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.773779 4786 scope.go:117] "RemoveContainer" containerID="a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db" 
Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.773499 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.775628 4786 generic.go:334] "Generic (PLEG): container finished" podID="82f2e6fd-58ee-4002-b167-096b3b715233" containerID="d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7" exitCode=137 Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.775661 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.775687 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82f2e6fd-58ee-4002-b167-096b3b715233","Type":"ContainerDied","Data":"d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7"} Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.775705 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82f2e6fd-58ee-4002-b167-096b3b715233","Type":"ContainerDied","Data":"5a6d50ecfb812efe49c91ee04726931e914a525e77cb44e45e64c17eff233e52"} Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.778635 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2hb98_ee4ee4a6-b86a-454b-8952-6a0f16ce6353/ovs-vswitchd/0.log" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.779491 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2hb98" event={"ID":"ee4ee4a6-b86a-454b-8952-6a0f16ce6353","Type":"ContainerDied","Data":"64c90c0674809177b1a8b5034d10a6b47f26b6a5e4e41edfb0306c72cbc6ba84"} Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.779550 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2hb98" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.797447 4786 scope.go:117] "RemoveContainer" containerID="779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.821527 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82f2e6fd-58ee-4002-b167-096b3b715233" (UID: "82f2e6fd-58ee-4002-b167-096b3b715233"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.822993 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.829895 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.836239 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data" (OuterVolumeSpecName: "config-data") pod "82f2e6fd-58ee-4002-b167-096b3b715233" (UID: "82f2e6fd-58ee-4002-b167-096b3b715233"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.840583 4786 scope.go:117] "RemoveContainer" containerID="020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.841043 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-2hb98"] Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.847201 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-2hb98"] Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.858583 4786 scope.go:117] "RemoveContainer" containerID="6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.860143 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4vh6\" (UniqueName: \"kubernetes.io/projected/82f2e6fd-58ee-4002-b167-096b3b715233-kube-api-access-n4vh6\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.860169 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.860181 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.860192 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.860203 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/82f2e6fd-58ee-4002-b167-096b3b715233-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.876208 4786 scope.go:117] "RemoveContainer" containerID="06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.892552 4786 scope.go:117] "RemoveContainer" containerID="328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.918969 4786 scope.go:117] "RemoveContainer" containerID="cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.939133 4786 scope.go:117] "RemoveContainer" containerID="ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.955249 4786 scope.go:117] "RemoveContainer" containerID="add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.971705 4786 scope.go:117] "RemoveContainer" containerID="e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4" Mar 13 15:28:47 crc kubenswrapper[4786]: I0313 15:28:47.988343 4786 scope.go:117] "RemoveContainer" containerID="209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.003354 4786 scope.go:117] "RemoveContainer" containerID="11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.030508 4786 scope.go:117] "RemoveContainer" containerID="03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.052344 4786 scope.go:117] "RemoveContainer" containerID="0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.083336 4786 scope.go:117] "RemoveContainer" 
containerID="d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.111949 4786 scope.go:117] "RemoveContainer" containerID="a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.113152 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db\": container with ID starting with a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db not found: ID does not exist" containerID="a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.113208 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db"} err="failed to get container status \"a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db\": rpc error: code = NotFound desc = could not find container \"a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db\": container with ID starting with a5e245db1bbe122155b6e24a80c43b0e56d8d3cbd6c359e9fca65846e54a04db not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.113248 4786 scope.go:117] "RemoveContainer" containerID="779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.113630 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8\": container with ID starting with 779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8 not found: ID does not exist" containerID="779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8" Mar 13 15:28:48 crc 
kubenswrapper[4786]: I0313 15:28:48.113674 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8"} err="failed to get container status \"779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8\": rpc error: code = NotFound desc = could not find container \"779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8\": container with ID starting with 779e613cd6ab273f9b8293b98e90a265825ef9fe71c13d2f3b5b3bac292481e8 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.113701 4786 scope.go:117] "RemoveContainer" containerID="020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.113959 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d\": container with ID starting with 020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d not found: ID does not exist" containerID="020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.113985 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d"} err="failed to get container status \"020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d\": rpc error: code = NotFound desc = could not find container \"020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d\": container with ID starting with 020d69f392816e50774f69d55582486f48455c1d8f891f5449b94fe92e5ae02d not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.114002 4786 scope.go:117] "RemoveContainer" containerID="6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972" Mar 13 
15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.114494 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972\": container with ID starting with 6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972 not found: ID does not exist" containerID="6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.114511 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972"} err="failed to get container status \"6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972\": rpc error: code = NotFound desc = could not find container \"6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972\": container with ID starting with 6a9aadd35c4074822b5d39976beaa422d4853a2710ae5507d0a661e228760972 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.114525 4786 scope.go:117] "RemoveContainer" containerID="06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.115111 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d\": container with ID starting with 06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d not found: ID does not exist" containerID="06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.115146 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d"} err="failed to get container status 
\"06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d\": rpc error: code = NotFound desc = could not find container \"06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d\": container with ID starting with 06a4ab4416764d7a6a2c489dff6c915e552212e7e43f237c58de64315995ff4d not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.115173 4786 scope.go:117] "RemoveContainer" containerID="328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.115649 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c\": container with ID starting with 328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c not found: ID does not exist" containerID="328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.115705 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c"} err="failed to get container status \"328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c\": rpc error: code = NotFound desc = could not find container \"328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c\": container with ID starting with 328ed265f23c9319d7c327cb63245e44502ec884ffd603217c87e622daf90a1c not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.115732 4786 scope.go:117] "RemoveContainer" containerID="cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.116042 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57\": container with ID starting with cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57 not found: ID does not exist" containerID="cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.116098 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57"} err="failed to get container status \"cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57\": rpc error: code = NotFound desc = could not find container \"cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57\": container with ID starting with cdda139c3ee87570f378237ce6f819405c7145d4ed3f8f964fd34d27d6579d57 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.116118 4786 scope.go:117] "RemoveContainer" containerID="ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.116519 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072\": container with ID starting with ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072 not found: ID does not exist" containerID="ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.116547 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072"} err="failed to get container status \"ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072\": rpc error: code = NotFound desc = could not find container \"ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072\": container with ID 
starting with ae395ee49072bcdbf5f891da5f40b64938b79f4d65ffc387c878a76877703072 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.116565 4786 scope.go:117] "RemoveContainer" containerID="add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.116966 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79\": container with ID starting with add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79 not found: ID does not exist" containerID="add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.116998 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79"} err="failed to get container status \"add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79\": rpc error: code = NotFound desc = could not find container \"add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79\": container with ID starting with add2a401c0d09867e0eb5b1fa74cfe76ae3796e72896690edbb833379e7d5f79 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.117026 4786 scope.go:117] "RemoveContainer" containerID="e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.117326 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.117397 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4\": container with ID starting with 
e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4 not found: ID does not exist" containerID="e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.117414 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4"} err="failed to get container status \"e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4\": rpc error: code = NotFound desc = could not find container \"e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4\": container with ID starting with e3de8a471d894b5fa6e789b887c7c7ec44384e54e3ff5cc9df96634f6f07f3c4 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.117426 4786 scope.go:117] "RemoveContainer" containerID="209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.117759 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508\": container with ID starting with 209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508 not found: ID does not exist" containerID="209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.117822 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508"} err="failed to get container status \"209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508\": rpc error: code = NotFound desc = could not find container \"209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508\": container with ID starting with 209be32cad0691a63bb91b720108c602b034a7e38b486292c126d29a66575508 not found: ID does not 
exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.117844 4786 scope.go:117] "RemoveContainer" containerID="11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.118236 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364\": container with ID starting with 11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364 not found: ID does not exist" containerID="11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.118257 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364"} err="failed to get container status \"11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364\": rpc error: code = NotFound desc = could not find container \"11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364\": container with ID starting with 11f732adb4fa0cbb30459e80c6208e95040b054029df72a3bed05b575c994364 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.118271 4786 scope.go:117] "RemoveContainer" containerID="03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.118603 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d\": container with ID starting with 03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d not found: ID does not exist" containerID="03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.118645 4786 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d"} err="failed to get container status \"03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d\": rpc error: code = NotFound desc = could not find container \"03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d\": container with ID starting with 03ed427f81905daee8927fa53e3ea21f256cb20c29dd7859bdeeffcd73ea223d not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.118675 4786 scope.go:117] "RemoveContainer" containerID="0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.119049 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205\": container with ID starting with 0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205 not found: ID does not exist" containerID="0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.119069 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205"} err="failed to get container status \"0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205\": rpc error: code = NotFound desc = could not find container \"0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205\": container with ID starting with 0e2c9ec48fc764e2f376a1f33986b75bd3acae5dd20820582c550c38dd9c0205 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.119082 4786 scope.go:117] "RemoveContainer" containerID="d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.119367 4786 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a\": container with ID starting with d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a not found: ID does not exist" containerID="d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.119385 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a"} err="failed to get container status \"d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a\": rpc error: code = NotFound desc = could not find container \"d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a\": container with ID starting with d6c157dc465498f133bf96a1057c215b42e70c43c76732690c166a9d76658a7a not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.119399 4786 scope.go:117] "RemoveContainer" containerID="c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.129700 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.148206 4786 scope.go:117] "RemoveContainer" containerID="d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.173882 4786 scope.go:117] "RemoveContainer" containerID="c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.174374 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57\": container with ID starting with 
c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57 not found: ID does not exist" containerID="c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.174431 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57"} err="failed to get container status \"c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57\": rpc error: code = NotFound desc = could not find container \"c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57\": container with ID starting with c853d45c7c965c932fb5c6f663d4cb1680bada56253ce41a8af3743364d9ce57 not found: ID does not exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.174466 4786 scope.go:117] "RemoveContainer" containerID="d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7" Mar 13 15:28:48 crc kubenswrapper[4786]: E0313 15:28:48.174964 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7\": container with ID starting with d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7 not found: ID does not exist" containerID="d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.175015 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7"} err="failed to get container status \"d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7\": rpc error: code = NotFound desc = could not find container \"d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7\": container with ID starting with d3a4f98ae69486e87bdf41c85286faf04838833f521b770774b2bb5ed818d1f7 not found: ID does not 
exist" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.175049 4786 scope.go:117] "RemoveContainer" containerID="5e0c1f275341c97221bafceb6edfd50694e546dad3c9f4e5cc2907ea4e57d0f5" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.193031 4786 scope.go:117] "RemoveContainer" containerID="679c36eb1485a7711f44dae4f89ecedc7e304e4c3c3b1dc87a27225c45aab716" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.213097 4786 scope.go:117] "RemoveContainer" containerID="2ca2b9e0d44338cc2d8dc4cd9960b1316db2789e01d7d069963e6c479bc4ee2d" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.567133 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" path="/var/lib/kubelet/pods/82f2e6fd-58ee-4002-b167-096b3b715233/volumes" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.568547 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" path="/var/lib/kubelet/pods/e503bc45-db60-4bc8-bb97-3472d2456fdb/volumes" Mar 13 15:28:48 crc kubenswrapper[4786]: I0313 15:28:48.571172 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" path="/var/lib/kubelet/pods/ee4ee4a6-b86a-454b-8952-6a0f16ce6353/volumes" Mar 13 15:28:52 crc kubenswrapper[4786]: I0313 15:28:52.129770 4786 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6297c29d-09ec-49e7-ae22-6b20962603a7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6297c29d-09ec-49e7-ae22-6b20962603a7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6297c29d_09ec_49e7_ae22_6b20962603a7.slice" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.433601 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmmd"] Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434495 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434509 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434526 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434533 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434543 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="setup-container" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434551 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="setup-container" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434561 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434569 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434585 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="rabbitmq" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434592 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="rabbitmq" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434608 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434617 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434627 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434635 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434647 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-notification-agent" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434654 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-notification-agent" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434666 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434673 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434685 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="probe" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434692 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="probe" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434702 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server-init" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434709 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server-init" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434720 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434727 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434738 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434745 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434758 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerName="galera" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434766 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerName="galera" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434779 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="cinder-scheduler" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434786 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="cinder-scheduler" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434795 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434802 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434816 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-expirer" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434822 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-expirer" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434833 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434839 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434847 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="swift-recon-cron" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434872 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="swift-recon-cron" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434885 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434892 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434904 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="393ef3eb-1c5f-4a06-a815-fe394d372ee6" containerName="memcached" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434911 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="393ef3eb-1c5f-4a06-a815-fe394d372ee6" containerName="memcached" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434923 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434932 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-server" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434940 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434947 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434957 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434964 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-server" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434972 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-updater" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434979 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-updater" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.434990 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-metadata" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.434997 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-metadata" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435010 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435016 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435029 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435036 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-api" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435048 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435055 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-api" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435068 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435075 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435085 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435092 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435100 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="rsync" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435107 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="rsync" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435114 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435122 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435130 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435138 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435147 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f459571-980e-439d-9dc2-72c0461a20c9" containerName="keystone-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435154 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f459571-980e-439d-9dc2-72c0461a20c9" containerName="keystone-api" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435166 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" 
containerName="mysql-bootstrap" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435173 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerName="mysql-bootstrap" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435181 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c66255a-19d5-4417-bf43-f7f5bfff892a" containerName="nova-cell0-conductor-conductor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435188 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c66255a-19d5-4417-bf43-f7f5bfff892a" containerName="nova-cell0-conductor-conductor" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435200 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435206 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435215 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14fdb1f3-fd7f-4b31-ac34-42438c44720a" containerName="kube-state-metrics" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435222 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="14fdb1f3-fd7f-4b31-ac34-42438c44720a" containerName="kube-state-metrics" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435232 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435240 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435252 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="proxy-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435259 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="proxy-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435267 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" containerName="nova-cell1-conductor-conductor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435275 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" containerName="nova-cell1-conductor-conductor" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435284 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-updater" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435291 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-updater" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435304 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="rabbitmq" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435312 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="rabbitmq" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435324 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435331 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435340 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="sg-core" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435346 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="sg-core" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435353 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435359 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-server" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435368 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435375 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435387 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435394 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435403 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-reaper" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435410 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-reaper" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435422 4786 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435429 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435441 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-central-agent" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435449 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-central-agent" Mar 13 15:29:19 crc kubenswrapper[4786]: E0313 15:29:19.435459 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="setup-container" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435466 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="setup-container" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435627 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="rsync" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435644 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435657 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435672 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e313e1cc-ed94-4e28-84f8-d053dcffb16a" containerName="nova-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435682 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435694 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="393ef3eb-1c5f-4a06-a815-fe394d372ee6" containerName="memcached" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435702 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435714 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="sg-core" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435725 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435735 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435745 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-updater" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435757 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435764 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-auditor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435772 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-notification-agent" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435778 
4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovs-vswitchd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435787 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-expirer" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435795 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="31158646-2c0c-4098-bd3e-ea307fa78716" containerName="barbican-keystone-listener-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435805 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="object-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435816 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435827 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b8537d-23ab-4c8d-9ca7-b307562baad8" containerName="nova-cell1-conductor-conductor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435839 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435847 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435876 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="21658ad3-b8e8-4743-b2c7-da4782850abc" containerName="neutron-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435893 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api" Mar 13 
15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435901 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="cinder-scheduler" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435911 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="swift-recon-cron" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435922 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435932 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8609052d-1ba2-4888-b973-05c8e4663632" containerName="glance-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435942 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435951 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8402e30a-1517-41be-b468-1959c4b7621b" containerName="nova-metadata-metadata" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435958 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f459571-980e-439d-9dc2-72c0461a20c9" containerName="keystone-api" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435968 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c66255a-19d5-4417-bf43-f7f5bfff892a" containerName="nova-cell0-conductor-conductor" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435976 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435988 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="54a96a07-f63f-47d9-9191-0548996f01a7" containerName="cinder-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.435999 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee1d973-a40c-4db0-8cc7-1c64ece074ac" containerName="glance-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436011 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-updater" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436018 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="70dc1403-e7e9-4200-9a87-e3538a17c350" containerName="barbican-worker" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436027 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="14fdb1f3-fd7f-4b31-ac34-42438c44720a" containerName="kube-state-metrics" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436034 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f964a2e6-aad3-42c0-8290-c3aa52d99e5b" containerName="rabbitmq" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436046 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4ee4a6-b86a-454b-8952-6a0f16ce6353" containerName="ovsdb-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436056 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e5ca7c-1c5e-4f9e-85df-a92feaeddb43" containerName="rabbitmq" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436069 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="ceilometer-central-agent" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436079 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-replicator" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436086 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="82f2e6fd-58ee-4002-b167-096b3b715233" containerName="probe" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436097 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc45915-09df-4248-8cb8-c7b11d1e4a4c" containerName="proxy-httpd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436109 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e7c2f9-5486-4c21-a0b7-07c81d85a24c" containerName="galera" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436118 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="account-reaper" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436126 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e503bc45-db60-4bc8-bb97-3472d2456fdb" containerName="container-server" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.436135 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c381c8-c97e-4159-9bb4-3ede8f12d6e0" containerName="barbican-api-log" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.437252 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.456727 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmmd"] Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.473643 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-catalog-content\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.473794 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-utilities\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.473899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhmh\" (UniqueName: \"kubernetes.io/projected/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-kube-api-access-kkhmh\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.574773 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-utilities\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.574841 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kkhmh\" (UniqueName: \"kubernetes.io/projected/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-kube-api-access-kkhmh\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.574883 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-catalog-content\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.575334 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-utilities\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.575345 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-catalog-content\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.603022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhmh\" (UniqueName: \"kubernetes.io/projected/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-kube-api-access-kkhmh\") pod \"redhat-marketplace-mxmmd\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:19 crc kubenswrapper[4786]: I0313 15:29:19.755414 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:20 crc kubenswrapper[4786]: I0313 15:29:20.235013 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmmd"] Mar 13 15:29:21 crc kubenswrapper[4786]: I0313 15:29:21.065496 4786 generic.go:334] "Generic (PLEG): container finished" podID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerID="49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da" exitCode=0 Mar 13 15:29:21 crc kubenswrapper[4786]: I0313 15:29:21.065829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmmd" event={"ID":"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7","Type":"ContainerDied","Data":"49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da"} Mar 13 15:29:21 crc kubenswrapper[4786]: I0313 15:29:21.065903 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmmd" event={"ID":"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7","Type":"ContainerStarted","Data":"ef442b3b38b735317146106b99104a6973f812f02602cb8c9104bfd44e981065"} Mar 13 15:29:23 crc kubenswrapper[4786]: I0313 15:29:23.085714 4786 generic.go:334] "Generic (PLEG): container finished" podID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerID="0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670" exitCode=0 Mar 13 15:29:23 crc kubenswrapper[4786]: I0313 15:29:23.085852 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmmd" event={"ID":"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7","Type":"ContainerDied","Data":"0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670"} Mar 13 15:29:26 crc kubenswrapper[4786]: I0313 15:29:26.121431 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmmd" 
event={"ID":"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7","Type":"ContainerStarted","Data":"010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53"} Mar 13 15:29:26 crc kubenswrapper[4786]: I0313 15:29:26.139262 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mxmmd" podStartSLOduration=3.082806447 podStartE2EDuration="7.13923943s" podCreationTimestamp="2026-03-13 15:29:19 +0000 UTC" firstStartedPulling="2026-03-13 15:29:21.067567008 +0000 UTC m=+1591.230778819" lastFinishedPulling="2026-03-13 15:29:25.124000001 +0000 UTC m=+1595.287211802" observedRunningTime="2026-03-13 15:29:26.139015995 +0000 UTC m=+1596.302227826" watchObservedRunningTime="2026-03-13 15:29:26.13923943 +0000 UTC m=+1596.302451241" Mar 13 15:29:29 crc kubenswrapper[4786]: I0313 15:29:29.757254 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:29 crc kubenswrapper[4786]: I0313 15:29:29.757302 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:29 crc kubenswrapper[4786]: I0313 15:29:29.814024 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:30 crc kubenswrapper[4786]: I0313 15:29:30.197164 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:30 crc kubenswrapper[4786]: I0313 15:29:30.244277 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmmd"] Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.169761 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mxmmd" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="registry-server" 
containerID="cri-o://010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53" gracePeriod=2 Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.577269 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.750551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-catalog-content\") pod \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.750744 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-utilities\") pod \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.750771 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkhmh\" (UniqueName: \"kubernetes.io/projected/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-kube-api-access-kkhmh\") pod \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\" (UID: \"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7\") " Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.751683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-utilities" (OuterVolumeSpecName: "utilities") pod "23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" (UID: "23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.759175 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-kube-api-access-kkhmh" (OuterVolumeSpecName: "kube-api-access-kkhmh") pod "23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" (UID: "23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7"). InnerVolumeSpecName "kube-api-access-kkhmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.786653 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" (UID: "23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.852527 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkhmh\" (UniqueName: \"kubernetes.io/projected/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-kube-api-access-kkhmh\") on node \"crc\" DevicePath \"\"" Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.852568 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:29:32 crc kubenswrapper[4786]: I0313 15:29:32.852578 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.181437 4786 generic.go:334] "Generic (PLEG): container finished" podID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" 
containerID="010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53" exitCode=0 Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.181524 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxmmd" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.181547 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmmd" event={"ID":"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7","Type":"ContainerDied","Data":"010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53"} Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.182890 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxmmd" event={"ID":"23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7","Type":"ContainerDied","Data":"ef442b3b38b735317146106b99104a6973f812f02602cb8c9104bfd44e981065"} Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.182914 4786 scope.go:117] "RemoveContainer" containerID="010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.241886 4786 scope.go:117] "RemoveContainer" containerID="0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.270980 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmmd"] Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.278327 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxmmd"] Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.344016 4786 scope.go:117] "RemoveContainer" containerID="49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.390701 4786 scope.go:117] "RemoveContainer" containerID="010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53" Mar 13 
15:29:33 crc kubenswrapper[4786]: E0313 15:29:33.394290 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53\": container with ID starting with 010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53 not found: ID does not exist" containerID="010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.394324 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53"} err="failed to get container status \"010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53\": rpc error: code = NotFound desc = could not find container \"010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53\": container with ID starting with 010a8079d0bb88df1e0d7e6cec5b1392612afe8cdac3bfcb599141bd6faa0e53 not found: ID does not exist" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.394346 4786 scope.go:117] "RemoveContainer" containerID="0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670" Mar 13 15:29:33 crc kubenswrapper[4786]: E0313 15:29:33.394689 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670\": container with ID starting with 0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670 not found: ID does not exist" containerID="0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.394738 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670"} err="failed to get container status 
\"0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670\": rpc error: code = NotFound desc = could not find container \"0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670\": container with ID starting with 0dc795c4ef5555b86c36ab6e4e2ad6fa4fb2c62ebf42215eb4fbaac5fbb10670 not found: ID does not exist" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.394766 4786 scope.go:117] "RemoveContainer" containerID="49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da" Mar 13 15:29:33 crc kubenswrapper[4786]: E0313 15:29:33.395121 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da\": container with ID starting with 49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da not found: ID does not exist" containerID="49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da" Mar 13 15:29:33 crc kubenswrapper[4786]: I0313 15:29:33.395146 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da"} err="failed to get container status \"49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da\": rpc error: code = NotFound desc = could not find container \"49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da\": container with ID starting with 49dd0404c4818debedd30e141faba3a6df9c323f80c78808df2b1070751cd4da not found: ID does not exist" Mar 13 15:29:34 crc kubenswrapper[4786]: I0313 15:29:34.559991 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" path="/var/lib/kubelet/pods/23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7/volumes" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.161530 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556930-6f5qf"] Mar 13 15:30:00 
crc kubenswrapper[4786]: E0313 15:30:00.162366 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.162378 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4786]: E0313 15:30:00.162396 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="extract-utilities" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.162403 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="extract-utilities" Mar 13 15:30:00 crc kubenswrapper[4786]: E0313 15:30:00.162420 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="extract-content" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.162426 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="extract-content" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.162650 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e2bc48-ebc7-4ecc-b93b-aeea1072b4a7" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.163273 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.167634 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b"] Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.168568 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.170614 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.170835 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.171363 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.171612 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.171766 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.182804 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-6f5qf"] Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.212603 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b"] Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.339074 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25ccc7b5-7b80-4239-ae66-964942300583-config-volume\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.339309 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-zq75b\" (UniqueName: \"kubernetes.io/projected/25ccc7b5-7b80-4239-ae66-964942300583-kube-api-access-zq75b\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.339541 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsdd\" (UniqueName: \"kubernetes.io/projected/8be7ee84-5588-4a0c-a24e-51cd4720b224-kube-api-access-glsdd\") pod \"auto-csr-approver-29556930-6f5qf\" (UID: \"8be7ee84-5588-4a0c-a24e-51cd4720b224\") " pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.339608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25ccc7b5-7b80-4239-ae66-964942300583-secret-volume\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.440343 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsdd\" (UniqueName: \"kubernetes.io/projected/8be7ee84-5588-4a0c-a24e-51cd4720b224-kube-api-access-glsdd\") pod \"auto-csr-approver-29556930-6f5qf\" (UID: \"8be7ee84-5588-4a0c-a24e-51cd4720b224\") " pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.440403 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25ccc7b5-7b80-4239-ae66-964942300583-secret-volume\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.440450 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25ccc7b5-7b80-4239-ae66-964942300583-config-volume\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.440516 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq75b\" (UniqueName: \"kubernetes.io/projected/25ccc7b5-7b80-4239-ae66-964942300583-kube-api-access-zq75b\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.441385 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25ccc7b5-7b80-4239-ae66-964942300583-config-volume\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.446027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25ccc7b5-7b80-4239-ae66-964942300583-secret-volume\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.456047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsdd\" (UniqueName: 
\"kubernetes.io/projected/8be7ee84-5588-4a0c-a24e-51cd4720b224-kube-api-access-glsdd\") pod \"auto-csr-approver-29556930-6f5qf\" (UID: \"8be7ee84-5588-4a0c-a24e-51cd4720b224\") " pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.457916 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq75b\" (UniqueName: \"kubernetes.io/projected/25ccc7b5-7b80-4239-ae66-964942300583-kube-api-access-zq75b\") pod \"collect-profiles-29556930-kl66b\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.485757 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.497791 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.919112 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b"] Mar 13 15:30:00 crc kubenswrapper[4786]: I0313 15:30:00.981830 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-6f5qf"] Mar 13 15:30:00 crc kubenswrapper[4786]: W0313 15:30:00.984312 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be7ee84_5588_4a0c_a24e_51cd4720b224.slice/crio-7e929fb38c0a9a3137dc8fb942ea286d093ca2d0d6c5269b8a5b944e449c2806 WatchSource:0}: Error finding container 7e929fb38c0a9a3137dc8fb942ea286d093ca2d0d6c5269b8a5b944e449c2806: Status 404 returned error can't find the container with id 7e929fb38c0a9a3137dc8fb942ea286d093ca2d0d6c5269b8a5b944e449c2806 Mar 13 15:30:01 crc 
kubenswrapper[4786]: I0313 15:30:01.427349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" event={"ID":"8be7ee84-5588-4a0c-a24e-51cd4720b224","Type":"ContainerStarted","Data":"7e929fb38c0a9a3137dc8fb942ea286d093ca2d0d6c5269b8a5b944e449c2806"} Mar 13 15:30:01 crc kubenswrapper[4786]: I0313 15:30:01.429646 4786 generic.go:334] "Generic (PLEG): container finished" podID="25ccc7b5-7b80-4239-ae66-964942300583" containerID="a75c043559eb5f33d9b5551bd16525ea870f9c328e8a6e4b94f066427762e8ab" exitCode=0 Mar 13 15:30:01 crc kubenswrapper[4786]: I0313 15:30:01.429687 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" event={"ID":"25ccc7b5-7b80-4239-ae66-964942300583","Type":"ContainerDied","Data":"a75c043559eb5f33d9b5551bd16525ea870f9c328e8a6e4b94f066427762e8ab"} Mar 13 15:30:01 crc kubenswrapper[4786]: I0313 15:30:01.429709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" event={"ID":"25ccc7b5-7b80-4239-ae66-964942300583","Type":"ContainerStarted","Data":"1c1c2492d5b944c19fdc9df0a0192dabd76c30b110bff0df9d5a7dcd945c21a4"} Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.780696 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.969997 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq75b\" (UniqueName: \"kubernetes.io/projected/25ccc7b5-7b80-4239-ae66-964942300583-kube-api-access-zq75b\") pod \"25ccc7b5-7b80-4239-ae66-964942300583\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.970292 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25ccc7b5-7b80-4239-ae66-964942300583-config-volume\") pod \"25ccc7b5-7b80-4239-ae66-964942300583\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.970430 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25ccc7b5-7b80-4239-ae66-964942300583-secret-volume\") pod \"25ccc7b5-7b80-4239-ae66-964942300583\" (UID: \"25ccc7b5-7b80-4239-ae66-964942300583\") " Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.971031 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ccc7b5-7b80-4239-ae66-964942300583-config-volume" (OuterVolumeSpecName: "config-volume") pod "25ccc7b5-7b80-4239-ae66-964942300583" (UID: "25ccc7b5-7b80-4239-ae66-964942300583"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.976476 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ccc7b5-7b80-4239-ae66-964942300583-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25ccc7b5-7b80-4239-ae66-964942300583" (UID: "25ccc7b5-7b80-4239-ae66-964942300583"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:30:02 crc kubenswrapper[4786]: I0313 15:30:02.977009 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ccc7b5-7b80-4239-ae66-964942300583-kube-api-access-zq75b" (OuterVolumeSpecName: "kube-api-access-zq75b") pod "25ccc7b5-7b80-4239-ae66-964942300583" (UID: "25ccc7b5-7b80-4239-ae66-964942300583"). InnerVolumeSpecName "kube-api-access-zq75b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.072203 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25ccc7b5-7b80-4239-ae66-964942300583-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.072249 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25ccc7b5-7b80-4239-ae66-964942300583-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.072264 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq75b\" (UniqueName: \"kubernetes.io/projected/25ccc7b5-7b80-4239-ae66-964942300583-kube-api-access-zq75b\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.446910 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" event={"ID":"25ccc7b5-7b80-4239-ae66-964942300583","Type":"ContainerDied","Data":"1c1c2492d5b944c19fdc9df0a0192dabd76c30b110bff0df9d5a7dcd945c21a4"} Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.447219 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1c2492d5b944c19fdc9df0a0192dabd76c30b110bff0df9d5a7dcd945c21a4" Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.446954 4786 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b" Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.449302 4786 generic.go:334] "Generic (PLEG): container finished" podID="8be7ee84-5588-4a0c-a24e-51cd4720b224" containerID="49a8d2fb95f7eabbe42eddaf38be0e77002493823be033535cb22fd0955d85a7" exitCode=0 Mar 13 15:30:03 crc kubenswrapper[4786]: I0313 15:30:03.449337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" event={"ID":"8be7ee84-5588-4a0c-a24e-51cd4720b224","Type":"ContainerDied","Data":"49a8d2fb95f7eabbe42eddaf38be0e77002493823be033535cb22fd0955d85a7"} Mar 13 15:30:04 crc kubenswrapper[4786]: I0313 15:30:04.765268 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:04 crc kubenswrapper[4786]: I0313 15:30:04.895666 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glsdd\" (UniqueName: \"kubernetes.io/projected/8be7ee84-5588-4a0c-a24e-51cd4720b224-kube-api-access-glsdd\") pod \"8be7ee84-5588-4a0c-a24e-51cd4720b224\" (UID: \"8be7ee84-5588-4a0c-a24e-51cd4720b224\") " Mar 13 15:30:04 crc kubenswrapper[4786]: I0313 15:30:04.902597 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be7ee84-5588-4a0c-a24e-51cd4720b224-kube-api-access-glsdd" (OuterVolumeSpecName: "kube-api-access-glsdd") pod "8be7ee84-5588-4a0c-a24e-51cd4720b224" (UID: "8be7ee84-5588-4a0c-a24e-51cd4720b224"). InnerVolumeSpecName "kube-api-access-glsdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:30:04 crc kubenswrapper[4786]: I0313 15:30:04.997739 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glsdd\" (UniqueName: \"kubernetes.io/projected/8be7ee84-5588-4a0c-a24e-51cd4720b224-kube-api-access-glsdd\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:05 crc kubenswrapper[4786]: I0313 15:30:05.467534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" event={"ID":"8be7ee84-5588-4a0c-a24e-51cd4720b224","Type":"ContainerDied","Data":"7e929fb38c0a9a3137dc8fb942ea286d093ca2d0d6c5269b8a5b944e449c2806"} Mar 13 15:30:05 crc kubenswrapper[4786]: I0313 15:30:05.467589 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e929fb38c0a9a3137dc8fb942ea286d093ca2d0d6c5269b8a5b944e449c2806" Mar 13 15:30:05 crc kubenswrapper[4786]: I0313 15:30:05.467602 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-6f5qf" Mar 13 15:30:05 crc kubenswrapper[4786]: I0313 15:30:05.824936 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-czchf"] Mar 13 15:30:05 crc kubenswrapper[4786]: I0313 15:30:05.832096 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-czchf"] Mar 13 15:30:06 crc kubenswrapper[4786]: I0313 15:30:06.567576 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3" path="/var/lib/kubelet/pods/48dd6149-3efc-4fa6-91b9-b6b32ed8a2e3/volumes" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.226973 4786 scope.go:117] "RemoveContainer" containerID="4e8b97f387f4fc2e4f6ac5319be8d84c34c6b50c92e4e873bda672174a618a08" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.252544 4786 scope.go:117] "RemoveContainer" 
containerID="cfade56165e64c5564e292137f43c0b63c9976a2de22095e5f66321f4a9d1220" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.299546 4786 scope.go:117] "RemoveContainer" containerID="b6d3b5c82813854a4d29327c7298f2fd417cb26077a9b2211e9e9e9f3281a1be" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.357739 4786 scope.go:117] "RemoveContainer" containerID="218185ba1cec541cc991701b18e7933dcc28c3b9c8ba80b603fd652e3058fda8" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.384891 4786 scope.go:117] "RemoveContainer" containerID="c3378ad848becf43c536e9376944ca88cfa4805b8b8bfd9f3012155f1ebc4f1c" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.415236 4786 scope.go:117] "RemoveContainer" containerID="ecc2577d6b2f2afdad59b20d9b01f674dd6de6f89593f69c27e47f63b8c4aa05" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.434152 4786 scope.go:117] "RemoveContainer" containerID="1205a8ab420b0ae6beac6025bfc70200476d490c756137e4c5d99d2fb21dcf74" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.463618 4786 scope.go:117] "RemoveContainer" containerID="5c01e07921c1f0119e885c5682397eb301c0509f4611ce41740be7679e8068a6" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.481116 4786 scope.go:117] "RemoveContainer" containerID="917b6a1970b845be8ffb05e14e44dd41a32d8494185628afdab72b15710941e5" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.510727 4786 scope.go:117] "RemoveContainer" containerID="4cb869cd4456dd89c676022d5997f15243c54fe9debbee32cf20df121c9b0bf8" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.528021 4786 scope.go:117] "RemoveContainer" containerID="0067f13081ec184761dd763d31be896d35e176fc5080d9886ecc604d62beab50" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.544032 4786 scope.go:117] "RemoveContainer" containerID="80be103a5b34c19bf82c7bb43c94d36cfe4c07dd80d0861ee4b562ee8a5e495d" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.561622 4786 scope.go:117] "RemoveContainer" 
containerID="dc737315c5feeb30b31f19b951dc0b3012f4726fea5f38b613fa8c0ef34159c8" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.579719 4786 scope.go:117] "RemoveContainer" containerID="321bb7aa80b43c241267fd5f31403d05a9a52408255cd6c724b195c5bb6ef716" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.607095 4786 scope.go:117] "RemoveContainer" containerID="267d7ab68da6aba0afd06d3f284ba02e6f436062867044b1a8025ccca64f775f" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.625139 4786 scope.go:117] "RemoveContainer" containerID="c5a558242d9db6738be911c2f2ead53cbee4619ab0b123e3cc7d2deb9efe88b0" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.653275 4786 scope.go:117] "RemoveContainer" containerID="c808c6542f268f29e39e9d5e67ec3597a53f8eb001ef4a8af43f8bcd4c925919" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.676343 4786 scope.go:117] "RemoveContainer" containerID="e8453d45bc476d4ec12b3b5bae95abb36c77cd13025a4bf085679749a3c0f337" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.694775 4786 scope.go:117] "RemoveContainer" containerID="0cf605181558cf4d31fe583a542de7db9acfb6209a581890ce5a50ea1ba5c372" Mar 13 15:30:08 crc kubenswrapper[4786]: I0313 15:30:08.720533 4786 scope.go:117] "RemoveContainer" containerID="02c8962944741a5e21d0094c41f35cebfff6ed06de1fa2ee62e1f8c35e344d05" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.365052 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xpxhf"] Mar 13 15:30:21 crc kubenswrapper[4786]: E0313 15:30:21.366249 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be7ee84-5588-4a0c-a24e-51cd4720b224" containerName="oc" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.366269 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be7ee84-5588-4a0c-a24e-51cd4720b224" containerName="oc" Mar 13 15:30:21 crc kubenswrapper[4786]: E0313 15:30:21.366321 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="25ccc7b5-7b80-4239-ae66-964942300583" containerName="collect-profiles" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.366333 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ccc7b5-7b80-4239-ae66-964942300583" containerName="collect-profiles" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.366559 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be7ee84-5588-4a0c-a24e-51cd4720b224" containerName="oc" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.366586 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ccc7b5-7b80-4239-ae66-964942300583" containerName="collect-profiles" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.367990 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.388138 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xpxhf"] Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.461264 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-utilities\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.461346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-catalog-content\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.461385 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-j7jwj\" (UniqueName: \"kubernetes.io/projected/f28fdd18-7c14-41da-a289-c72262a7c457-kube-api-access-j7jwj\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.562270 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-catalog-content\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.562355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jwj\" (UniqueName: \"kubernetes.io/projected/f28fdd18-7c14-41da-a289-c72262a7c457-kube-api-access-j7jwj\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.562436 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-utilities\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.563003 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-catalog-content\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.563015 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-utilities\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.593027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jwj\" (UniqueName: \"kubernetes.io/projected/f28fdd18-7c14-41da-a289-c72262a7c457-kube-api-access-j7jwj\") pod \"community-operators-xpxhf\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:21 crc kubenswrapper[4786]: I0313 15:30:21.740831 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:22 crc kubenswrapper[4786]: I0313 15:30:22.265084 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xpxhf"] Mar 13 15:30:22 crc kubenswrapper[4786]: I0313 15:30:22.639248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpxhf" event={"ID":"f28fdd18-7c14-41da-a289-c72262a7c457","Type":"ContainerStarted","Data":"eff9251f960da4cee31ad5847e845536ca4827b32683b230a905d686f0141f53"} Mar 13 15:30:23 crc kubenswrapper[4786]: I0313 15:30:23.654530 4786 generic.go:334] "Generic (PLEG): container finished" podID="f28fdd18-7c14-41da-a289-c72262a7c457" containerID="0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad" exitCode=0 Mar 13 15:30:23 crc kubenswrapper[4786]: I0313 15:30:23.654615 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpxhf" event={"ID":"f28fdd18-7c14-41da-a289-c72262a7c457","Type":"ContainerDied","Data":"0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad"} Mar 13 15:30:25 crc kubenswrapper[4786]: I0313 15:30:25.678366 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="f28fdd18-7c14-41da-a289-c72262a7c457" containerID="15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91" exitCode=0 Mar 13 15:30:25 crc kubenswrapper[4786]: I0313 15:30:25.678448 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpxhf" event={"ID":"f28fdd18-7c14-41da-a289-c72262a7c457","Type":"ContainerDied","Data":"15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91"} Mar 13 15:30:26 crc kubenswrapper[4786]: I0313 15:30:26.690722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpxhf" event={"ID":"f28fdd18-7c14-41da-a289-c72262a7c457","Type":"ContainerStarted","Data":"081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5"} Mar 13 15:30:26 crc kubenswrapper[4786]: I0313 15:30:26.714110 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xpxhf" podStartSLOduration=3.255198596 podStartE2EDuration="5.714091522s" podCreationTimestamp="2026-03-13 15:30:21 +0000 UTC" firstStartedPulling="2026-03-13 15:30:23.658406661 +0000 UTC m=+1653.821618462" lastFinishedPulling="2026-03-13 15:30:26.117299557 +0000 UTC m=+1656.280511388" observedRunningTime="2026-03-13 15:30:26.711808964 +0000 UTC m=+1656.875020775" watchObservedRunningTime="2026-03-13 15:30:26.714091522 +0000 UTC m=+1656.877303333" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.754651 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ntjm5"] Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.756838 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.767202 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-catalog-content\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.767271 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j2bz\" (UniqueName: \"kubernetes.io/projected/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-kube-api-access-4j2bz\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.767320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-utilities\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.776047 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntjm5"] Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.868352 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-catalog-content\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.868418 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4j2bz\" (UniqueName: \"kubernetes.io/projected/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-kube-api-access-4j2bz\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.868467 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-utilities\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.868913 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-catalog-content\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.869326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-utilities\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:27 crc kubenswrapper[4786]: I0313 15:30:27.890582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j2bz\" (UniqueName: \"kubernetes.io/projected/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-kube-api-access-4j2bz\") pod \"certified-operators-ntjm5\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:28 crc kubenswrapper[4786]: I0313 15:30:28.086195 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:28 crc kubenswrapper[4786]: I0313 15:30:28.532415 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntjm5"] Mar 13 15:30:28 crc kubenswrapper[4786]: I0313 15:30:28.712312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntjm5" event={"ID":"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20","Type":"ContainerStarted","Data":"d0b98b18039093aac55edf0c5727ebc1ebc608598310bcef855110a75e7b7500"} Mar 13 15:30:29 crc kubenswrapper[4786]: I0313 15:30:29.725704 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerID="457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6" exitCode=0 Mar 13 15:30:29 crc kubenswrapper[4786]: I0313 15:30:29.725765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntjm5" event={"ID":"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20","Type":"ContainerDied","Data":"457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6"} Mar 13 15:30:31 crc kubenswrapper[4786]: I0313 15:30:31.741203 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:31 crc kubenswrapper[4786]: I0313 15:30:31.741574 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:31 crc kubenswrapper[4786]: I0313 15:30:31.747144 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerID="0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e" exitCode=0 Mar 13 15:30:31 crc kubenswrapper[4786]: I0313 15:30:31.747188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntjm5" 
event={"ID":"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20","Type":"ContainerDied","Data":"0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e"} Mar 13 15:30:31 crc kubenswrapper[4786]: I0313 15:30:31.822841 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:32 crc kubenswrapper[4786]: I0313 15:30:32.758622 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntjm5" event={"ID":"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20","Type":"ContainerStarted","Data":"db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587"} Mar 13 15:30:32 crc kubenswrapper[4786]: I0313 15:30:32.780801 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ntjm5" podStartSLOduration=3.335114264 podStartE2EDuration="5.780777916s" podCreationTimestamp="2026-03-13 15:30:27 +0000 UTC" firstStartedPulling="2026-03-13 15:30:29.727786574 +0000 UTC m=+1659.890998425" lastFinishedPulling="2026-03-13 15:30:32.173450246 +0000 UTC m=+1662.336662077" observedRunningTime="2026-03-13 15:30:32.776635362 +0000 UTC m=+1662.939847173" watchObservedRunningTime="2026-03-13 15:30:32.780777916 +0000 UTC m=+1662.943989737" Mar 13 15:30:32 crc kubenswrapper[4786]: I0313 15:30:32.821425 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:34 crc kubenswrapper[4786]: I0313 15:30:34.925787 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xpxhf"] Mar 13 15:30:34 crc kubenswrapper[4786]: I0313 15:30:34.926296 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xpxhf" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="registry-server" 
containerID="cri-o://081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5" gracePeriod=2 Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.319112 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.472818 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jwj\" (UniqueName: \"kubernetes.io/projected/f28fdd18-7c14-41da-a289-c72262a7c457-kube-api-access-j7jwj\") pod \"f28fdd18-7c14-41da-a289-c72262a7c457\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.473306 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-catalog-content\") pod \"f28fdd18-7c14-41da-a289-c72262a7c457\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.473375 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-utilities\") pod \"f28fdd18-7c14-41da-a289-c72262a7c457\" (UID: \"f28fdd18-7c14-41da-a289-c72262a7c457\") " Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.474604 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-utilities" (OuterVolumeSpecName: "utilities") pod "f28fdd18-7c14-41da-a289-c72262a7c457" (UID: "f28fdd18-7c14-41da-a289-c72262a7c457"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.479079 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28fdd18-7c14-41da-a289-c72262a7c457-kube-api-access-j7jwj" (OuterVolumeSpecName: "kube-api-access-j7jwj") pod "f28fdd18-7c14-41da-a289-c72262a7c457" (UID: "f28fdd18-7c14-41da-a289-c72262a7c457"). InnerVolumeSpecName "kube-api-access-j7jwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.576652 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jwj\" (UniqueName: \"kubernetes.io/projected/f28fdd18-7c14-41da-a289-c72262a7c457-kube-api-access-j7jwj\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.576690 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.619603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f28fdd18-7c14-41da-a289-c72262a7c457" (UID: "f28fdd18-7c14-41da-a289-c72262a7c457"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.677392 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f28fdd18-7c14-41da-a289-c72262a7c457-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.795298 4786 generic.go:334] "Generic (PLEG): container finished" podID="f28fdd18-7c14-41da-a289-c72262a7c457" containerID="081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5" exitCode=0 Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.795351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpxhf" event={"ID":"f28fdd18-7c14-41da-a289-c72262a7c457","Type":"ContainerDied","Data":"081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5"} Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.795370 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xpxhf" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.795396 4786 scope.go:117] "RemoveContainer" containerID="081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.795383 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpxhf" event={"ID":"f28fdd18-7c14-41da-a289-c72262a7c457","Type":"ContainerDied","Data":"eff9251f960da4cee31ad5847e845536ca4827b32683b230a905d686f0141f53"} Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.827466 4786 scope.go:117] "RemoveContainer" containerID="15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.844726 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xpxhf"] Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.851988 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xpxhf"] Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.886071 4786 scope.go:117] "RemoveContainer" containerID="0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.907078 4786 scope.go:117] "RemoveContainer" containerID="081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5" Mar 13 15:30:35 crc kubenswrapper[4786]: E0313 15:30:35.907439 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5\": container with ID starting with 081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5 not found: ID does not exist" containerID="081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.907477 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5"} err="failed to get container status \"081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5\": rpc error: code = NotFound desc = could not find container \"081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5\": container with ID starting with 081c2e470761f0fa84950593ebde9d3d329997b102a4ee4f31fd3c0f5757b8b5 not found: ID does not exist" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.907505 4786 scope.go:117] "RemoveContainer" containerID="15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91" Mar 13 15:30:35 crc kubenswrapper[4786]: E0313 15:30:35.907992 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91\": container with ID starting with 15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91 not found: ID does not exist" containerID="15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.908025 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91"} err="failed to get container status \"15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91\": rpc error: code = NotFound desc = could not find container \"15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91\": container with ID starting with 15008b89b372c66924ec5ecbdd33d66d3b8780f4235e51d2a57639ca42ba8d91 not found: ID does not exist" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.908043 4786 scope.go:117] "RemoveContainer" containerID="0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad" Mar 13 15:30:35 crc kubenswrapper[4786]: E0313 
15:30:35.908554 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad\": container with ID starting with 0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad not found: ID does not exist" containerID="0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad" Mar 13 15:30:35 crc kubenswrapper[4786]: I0313 15:30:35.908853 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad"} err="failed to get container status \"0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad\": rpc error: code = NotFound desc = could not find container \"0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad\": container with ID starting with 0498d5c3d77bb393ef7c9ae538acb3c4518214f7a3ce10ea3c390ae7334a8aad not found: ID does not exist" Mar 13 15:30:36 crc kubenswrapper[4786]: I0313 15:30:36.567611 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" path="/var/lib/kubelet/pods/f28fdd18-7c14-41da-a289-c72262a7c457/volumes" Mar 13 15:30:38 crc kubenswrapper[4786]: I0313 15:30:38.086564 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:38 crc kubenswrapper[4786]: I0313 15:30:38.086620 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:38 crc kubenswrapper[4786]: I0313 15:30:38.152217 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:38 crc kubenswrapper[4786]: I0313 15:30:38.880000 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:39 crc kubenswrapper[4786]: I0313 15:30:39.131804 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntjm5"] Mar 13 15:30:40 crc kubenswrapper[4786]: I0313 15:30:40.843158 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ntjm5" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="registry-server" containerID="cri-o://db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587" gracePeriod=2 Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.253097 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.359737 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-utilities\") pod \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.359823 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-catalog-content\") pod \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.359922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j2bz\" (UniqueName: \"kubernetes.io/projected/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-kube-api-access-4j2bz\") pod \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\" (UID: \"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20\") " Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.361491 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-utilities" (OuterVolumeSpecName: "utilities") pod "0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" (UID: "0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.370983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-kube-api-access-4j2bz" (OuterVolumeSpecName: "kube-api-access-4j2bz") pod "0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" (UID: "0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20"). InnerVolumeSpecName "kube-api-access-4j2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.449920 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" (UID: "0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.461641 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.461680 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.461691 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j2bz\" (UniqueName: \"kubernetes.io/projected/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20-kube-api-access-4j2bz\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.854746 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerID="db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587" exitCode=0 Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.854811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntjm5" event={"ID":"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20","Type":"ContainerDied","Data":"db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587"} Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.854905 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntjm5" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.855231 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntjm5" event={"ID":"0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20","Type":"ContainerDied","Data":"d0b98b18039093aac55edf0c5727ebc1ebc608598310bcef855110a75e7b7500"} Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.855419 4786 scope.go:117] "RemoveContainer" containerID="db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.889294 4786 scope.go:117] "RemoveContainer" containerID="0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.911834 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntjm5"] Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.925203 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ntjm5"] Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.937936 4786 scope.go:117] "RemoveContainer" containerID="457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.963720 4786 scope.go:117] "RemoveContainer" containerID="db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587" Mar 13 15:30:41 crc kubenswrapper[4786]: E0313 15:30:41.964640 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587\": container with ID starting with db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587 not found: ID does not exist" containerID="db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.964677 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587"} err="failed to get container status \"db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587\": rpc error: code = NotFound desc = could not find container \"db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587\": container with ID starting with db7244667e730dc99390881091d3ca492c8b636f03de4f062f6d3f351dd8f587 not found: ID does not exist" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.964708 4786 scope.go:117] "RemoveContainer" containerID="0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e" Mar 13 15:30:41 crc kubenswrapper[4786]: E0313 15:30:41.965276 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e\": container with ID starting with 0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e not found: ID does not exist" containerID="0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.965357 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e"} err="failed to get container status \"0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e\": rpc error: code = NotFound desc = could not find container \"0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e\": container with ID starting with 0ec2b13dae876fda932540578ebb52b1e8786794b54e3173dc1bf29e9ef4321e not found: ID does not exist" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.965408 4786 scope.go:117] "RemoveContainer" containerID="457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6" Mar 13 15:30:41 crc kubenswrapper[4786]: E0313 
15:30:41.966159 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6\": container with ID starting with 457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6 not found: ID does not exist" containerID="457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6" Mar 13 15:30:41 crc kubenswrapper[4786]: I0313 15:30:41.966199 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6"} err="failed to get container status \"457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6\": rpc error: code = NotFound desc = could not find container \"457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6\": container with ID starting with 457ff83a52a65bd4821b2d802cd887ec3d6cff1b9feb0a4a8783b8ab6740f1d6 not found: ID does not exist" Mar 13 15:30:42 crc kubenswrapper[4786]: I0313 15:30:42.566979 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" path="/var/lib/kubelet/pods/0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20/volumes" Mar 13 15:31:07 crc kubenswrapper[4786]: I0313 15:31:07.868684 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:31:07 crc kubenswrapper[4786]: I0313 15:31:07.869452 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.034580 4786 scope.go:117] "RemoveContainer" containerID="093d9f552c30bdec922a9a62149d9ced8460871a61712767fba2e030b08a33c1" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.069423 4786 scope.go:117] "RemoveContainer" containerID="c56b988be8539efcb5b56a2c8c1a56c0540f0cc6b1d6619fa49940db16f80c4e" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.096621 4786 scope.go:117] "RemoveContainer" containerID="df580d52b6d0ead148d878f686b085d8186767fa428863bb7af2712c22cea3ab" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.117958 4786 scope.go:117] "RemoveContainer" containerID="1a635506d88c5049e300f4b6bd956300aa9a7758cc6f9b567b6c714e5f08cbd1" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.140164 4786 scope.go:117] "RemoveContainer" containerID="99d73e1c554b9b1b11403cbffe4ce8b73fd05d82195bf4be2a43decef93be656" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.194047 4786 scope.go:117] "RemoveContainer" containerID="2825a787b40c80d07e8d382d6f8fdf4a59b0f22bd3cbc70aa4ef72572e2f218d" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.212342 4786 scope.go:117] "RemoveContainer" containerID="127103e9eac9cb6727d347c51ea0538ede6898b83fbb9348b73c4b1c3cb88cb9" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.322447 4786 scope.go:117] "RemoveContainer" containerID="531d2a78c9146820b5a2df89d9ec1e84eb160d49878916f2704fc6e9e8fd25f6" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.373148 4786 scope.go:117] "RemoveContainer" containerID="8bbd1bdba96d406cca8c0659cce4f0f05a674440ee5ba65657adc3d6eb97a621" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.430266 4786 scope.go:117] "RemoveContainer" containerID="6eb31899fcb44ea87d84b174734072a3750561c394aa977ab3b130b0cdf09a59" Mar 13 15:31:09 crc kubenswrapper[4786]: I0313 15:31:09.445068 4786 scope.go:117] "RemoveContainer" containerID="149ae29761fcea916eb11613dc8da76bbebea576e3bc59e445903d136869a7db" Mar 13 15:31:09 crc 
kubenswrapper[4786]: I0313 15:31:09.462871 4786 scope.go:117] "RemoveContainer" containerID="b288097f55569918eb38ccdfc6bf25211a56db02df55dbab27d3c330c057a809" Mar 13 15:31:37 crc kubenswrapper[4786]: I0313 15:31:37.869278 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:31:37 crc kubenswrapper[4786]: I0313 15:31:37.870105 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.146923 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556932-97z5t"] Mar 13 15:32:00 crc kubenswrapper[4786]: E0313 15:32:00.147922 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="registry-server" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.147945 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="registry-server" Mar 13 15:32:00 crc kubenswrapper[4786]: E0313 15:32:00.147979 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="extract-content" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.147993 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="extract-content" Mar 13 15:32:00 crc kubenswrapper[4786]: E0313 15:32:00.148018 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="registry-server" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.148031 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="registry-server" Mar 13 15:32:00 crc kubenswrapper[4786]: E0313 15:32:00.148048 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="extract-content" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.148061 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="extract-content" Mar 13 15:32:00 crc kubenswrapper[4786]: E0313 15:32:00.148076 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="extract-utilities" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.148089 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="extract-utilities" Mar 13 15:32:00 crc kubenswrapper[4786]: E0313 15:32:00.148124 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="extract-utilities" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.148139 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="extract-utilities" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.148450 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8c0bcc-0897-4e8b-8317-d46e6fdb9a20" containerName="registry-server" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.148480 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28fdd18-7c14-41da-a289-c72262a7c457" containerName="registry-server" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.149186 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.153613 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.153805 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.154040 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.159270 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-97z5t"] Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.305490 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbbw\" (UniqueName: \"kubernetes.io/projected/44dcb8ac-ebda-413e-b1ea-2a118379f8f7-kube-api-access-khbbw\") pod \"auto-csr-approver-29556932-97z5t\" (UID: \"44dcb8ac-ebda-413e-b1ea-2a118379f8f7\") " pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.407686 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbbw\" (UniqueName: \"kubernetes.io/projected/44dcb8ac-ebda-413e-b1ea-2a118379f8f7-kube-api-access-khbbw\") pod \"auto-csr-approver-29556932-97z5t\" (UID: \"44dcb8ac-ebda-413e-b1ea-2a118379f8f7\") " pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.427478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbbw\" (UniqueName: \"kubernetes.io/projected/44dcb8ac-ebda-413e-b1ea-2a118379f8f7-kube-api-access-khbbw\") pod \"auto-csr-approver-29556932-97z5t\" (UID: \"44dcb8ac-ebda-413e-b1ea-2a118379f8f7\") " 
pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.491084 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:00 crc kubenswrapper[4786]: I0313 15:32:00.957213 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-97z5t"] Mar 13 15:32:01 crc kubenswrapper[4786]: I0313 15:32:01.543418 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-97z5t" event={"ID":"44dcb8ac-ebda-413e-b1ea-2a118379f8f7","Type":"ContainerStarted","Data":"2210967b2da7dd498efdb654bc9893a189b41fb24a69c1dac801a9858d696808"} Mar 13 15:32:02 crc kubenswrapper[4786]: I0313 15:32:02.560647 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-97z5t" event={"ID":"44dcb8ac-ebda-413e-b1ea-2a118379f8f7","Type":"ContainerStarted","Data":"970c749725ea3a162d26b0176de43baa644794ad8505dfabc493bf34b9e2809a"} Mar 13 15:32:02 crc kubenswrapper[4786]: I0313 15:32:02.584431 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556932-97z5t" podStartSLOduration=1.4445961170000001 podStartE2EDuration="2.584408163s" podCreationTimestamp="2026-03-13 15:32:00 +0000 UTC" firstStartedPulling="2026-03-13 15:32:00.973691679 +0000 UTC m=+1751.136903490" lastFinishedPulling="2026-03-13 15:32:02.113503725 +0000 UTC m=+1752.276715536" observedRunningTime="2026-03-13 15:32:02.57832076 +0000 UTC m=+1752.741532571" watchObservedRunningTime="2026-03-13 15:32:02.584408163 +0000 UTC m=+1752.747619974" Mar 13 15:32:03 crc kubenswrapper[4786]: I0313 15:32:03.570277 4786 generic.go:334] "Generic (PLEG): container finished" podID="44dcb8ac-ebda-413e-b1ea-2a118379f8f7" containerID="970c749725ea3a162d26b0176de43baa644794ad8505dfabc493bf34b9e2809a" exitCode=0 Mar 13 15:32:03 crc 
kubenswrapper[4786]: I0313 15:32:03.570420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-97z5t" event={"ID":"44dcb8ac-ebda-413e-b1ea-2a118379f8f7","Type":"ContainerDied","Data":"970c749725ea3a162d26b0176de43baa644794ad8505dfabc493bf34b9e2809a"} Mar 13 15:32:04 crc kubenswrapper[4786]: I0313 15:32:04.870495 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:04 crc kubenswrapper[4786]: I0313 15:32:04.976267 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khbbw\" (UniqueName: \"kubernetes.io/projected/44dcb8ac-ebda-413e-b1ea-2a118379f8f7-kube-api-access-khbbw\") pod \"44dcb8ac-ebda-413e-b1ea-2a118379f8f7\" (UID: \"44dcb8ac-ebda-413e-b1ea-2a118379f8f7\") " Mar 13 15:32:04 crc kubenswrapper[4786]: I0313 15:32:04.980802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44dcb8ac-ebda-413e-b1ea-2a118379f8f7-kube-api-access-khbbw" (OuterVolumeSpecName: "kube-api-access-khbbw") pod "44dcb8ac-ebda-413e-b1ea-2a118379f8f7" (UID: "44dcb8ac-ebda-413e-b1ea-2a118379f8f7"). InnerVolumeSpecName "kube-api-access-khbbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:32:05 crc kubenswrapper[4786]: I0313 15:32:05.078515 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khbbw\" (UniqueName: \"kubernetes.io/projected/44dcb8ac-ebda-413e-b1ea-2a118379f8f7-kube-api-access-khbbw\") on node \"crc\" DevicePath \"\"" Mar 13 15:32:05 crc kubenswrapper[4786]: I0313 15:32:05.599558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-97z5t" event={"ID":"44dcb8ac-ebda-413e-b1ea-2a118379f8f7","Type":"ContainerDied","Data":"2210967b2da7dd498efdb654bc9893a189b41fb24a69c1dac801a9858d696808"} Mar 13 15:32:05 crc kubenswrapper[4786]: I0313 15:32:05.599599 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2210967b2da7dd498efdb654bc9893a189b41fb24a69c1dac801a9858d696808" Mar 13 15:32:05 crc kubenswrapper[4786]: I0313 15:32:05.599637 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-97z5t" Mar 13 15:32:05 crc kubenswrapper[4786]: I0313 15:32:05.655573 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-wmpbn"] Mar 13 15:32:05 crc kubenswrapper[4786]: I0313 15:32:05.661784 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-wmpbn"] Mar 13 15:32:06 crc kubenswrapper[4786]: I0313 15:32:06.570086 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7" path="/var/lib/kubelet/pods/be3ea1f3-d0e3-43d8-a99b-8ef3d473bee7/volumes" Mar 13 15:32:07 crc kubenswrapper[4786]: I0313 15:32:07.869206 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:32:07 crc kubenswrapper[4786]: I0313 15:32:07.869291 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:32:07 crc kubenswrapper[4786]: I0313 15:32:07.869362 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:32:07 crc kubenswrapper[4786]: I0313 15:32:07.870092 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:32:07 crc kubenswrapper[4786]: I0313 15:32:07.870186 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" gracePeriod=600 Mar 13 15:32:08 crc kubenswrapper[4786]: E0313 15:32:08.006605 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:32:08 crc kubenswrapper[4786]: 
I0313 15:32:08.637561 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" exitCode=0 Mar 13 15:32:08 crc kubenswrapper[4786]: I0313 15:32:08.637626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"} Mar 13 15:32:08 crc kubenswrapper[4786]: I0313 15:32:08.637682 4786 scope.go:117] "RemoveContainer" containerID="a5ef85bd6fdd298e112745a45310d0fedfe424a8161d80698615752d010dc319" Mar 13 15:32:08 crc kubenswrapper[4786]: I0313 15:32:08.638556 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:32:08 crc kubenswrapper[4786]: E0313 15:32:08.639054 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.669094 4786 scope.go:117] "RemoveContainer" containerID="9a6f3ca38fa64af3b952f62a4de858e3db59344ce667e5e2776aba24da0b0bf9" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.722673 4786 scope.go:117] "RemoveContainer" containerID="56fd56f4a759002c22dddb32f7cdac80f70005c5ac86cfce40fbb6bb3ffc6e45" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.749587 4786 scope.go:117] "RemoveContainer" containerID="4c92acc295fd57ca23b0e288711fb9a3af052bd9a754638cb9ef30ab57e5b073" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 
15:32:09.770829 4786 scope.go:117] "RemoveContainer" containerID="6fee30319ab254dc362b9cb5404359a46216a536ddf834f8c5f2549a88b37dcc" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.792609 4786 scope.go:117] "RemoveContainer" containerID="7391bf2bfd5a11f51ed0fde91d5f31bb0b8d96280948cba5e98ffa0e3356bece" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.815527 4786 scope.go:117] "RemoveContainer" containerID="bc46a0f61c16d0854539483ecafa210f744c7a82ed6f721a3c996b3f7c16d0ee" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.834640 4786 scope.go:117] "RemoveContainer" containerID="f8fb520920d825dd25aaead12b8048c225a995e2f744b27b8f73261839e24997" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.858821 4786 scope.go:117] "RemoveContainer" containerID="0dab3fd432281d71bbfe474a789a6904e1521fb26d710ab2e196030b217b7c51" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.882403 4786 scope.go:117] "RemoveContainer" containerID="ef2bbe6ec8f3f59479b1f5d89c5a9963a5edef3c6766386ec321d8bd3e16cf0d" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.919591 4786 scope.go:117] "RemoveContainer" containerID="da85c0b8ba7100a5a7dd0b1919a32c21a917bfb0e41b3a25bb5709d1b826d78c" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.935975 4786 scope.go:117] "RemoveContainer" containerID="13d8ceed72e49b7cd2a9089e37e1daad7c70d0e13d73a4feb0a79e5b2664f7d2" Mar 13 15:32:09 crc kubenswrapper[4786]: I0313 15:32:09.955978 4786 scope.go:117] "RemoveContainer" containerID="1f32639a5f98b5405972431389f642ad6d670e08baa6cbd9a62adbd7a1df9754" Mar 13 15:32:19 crc kubenswrapper[4786]: I0313 15:32:19.552160 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:32:19 crc kubenswrapper[4786]: E0313 15:32:19.553052 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:32:34 crc kubenswrapper[4786]: I0313 15:32:34.552464 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:32:34 crc kubenswrapper[4786]: E0313 15:32:34.554513 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:32:47 crc kubenswrapper[4786]: I0313 15:32:47.552632 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:32:47 crc kubenswrapper[4786]: E0313 15:32:47.553610 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:33:01 crc kubenswrapper[4786]: I0313 15:33:01.551583 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:33:01 crc kubenswrapper[4786]: E0313 15:33:01.552408 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:33:10 crc kubenswrapper[4786]: I0313 15:33:10.119177 4786 scope.go:117] "RemoveContainer" containerID="aa58163b1811bf4e08e92bda5164b74d3c5c5e36ddf0d80825b4497a55b219de" Mar 13 15:33:10 crc kubenswrapper[4786]: I0313 15:33:10.167409 4786 scope.go:117] "RemoveContainer" containerID="b88c276bd3280d84dd7d091ac5a691295599342583e25d8e6753ca7442b48a94" Mar 13 15:33:10 crc kubenswrapper[4786]: I0313 15:33:10.199318 4786 scope.go:117] "RemoveContainer" containerID="97169bbe009e1a8cd5540385dcb980d4b982bb33d4a67d5a28a9e3329c1751a0" Mar 13 15:33:10 crc kubenswrapper[4786]: I0313 15:33:10.241869 4786 scope.go:117] "RemoveContainer" containerID="06c5f43e5b49a883415f1dcb2802322135aa1d4ac4bf33e0a364acafc89c04b4" Mar 13 15:33:10 crc kubenswrapper[4786]: I0313 15:33:10.260997 4786 scope.go:117] "RemoveContainer" containerID="dc20faaf8cc18fcc57c6cd26a3dd4b742eebc5e5fa114e21cff65868c6520d96" Mar 13 15:33:10 crc kubenswrapper[4786]: I0313 15:33:10.327887 4786 scope.go:117] "RemoveContainer" containerID="e6f71e613f469c2bac0f3799e031a811cb3db82f35c00b0f461a98ffb7b7c85e" Mar 13 15:33:15 crc kubenswrapper[4786]: I0313 15:33:15.552080 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:33:15 crc kubenswrapper[4786]: E0313 15:33:15.552838 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:33:28 crc kubenswrapper[4786]: I0313 15:33:28.553507 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:33:28 crc kubenswrapper[4786]: E0313 15:33:28.554392 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:33:41 crc kubenswrapper[4786]: I0313 15:33:41.555545 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:33:41 crc kubenswrapper[4786]: E0313 15:33:41.556281 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:33:53 crc kubenswrapper[4786]: I0313 15:33:53.552468 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:33:53 crc kubenswrapper[4786]: E0313 15:33:53.554209 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.415286 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zh9mk"] Mar 13 15:33:55 crc kubenswrapper[4786]: E0313 15:33:55.415762 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44dcb8ac-ebda-413e-b1ea-2a118379f8f7" containerName="oc" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.415786 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="44dcb8ac-ebda-413e-b1ea-2a118379f8f7" containerName="oc" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.416055 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="44dcb8ac-ebda-413e-b1ea-2a118379f8f7" containerName="oc" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.417657 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.433416 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh9mk"] Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.574621 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-catalog-content\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.574694 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-utilities\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " 
pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.574998 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rbg\" (UniqueName: \"kubernetes.io/projected/fb825343-cab4-4755-9c90-1deb9d43ae1a-kube-api-access-t9rbg\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.675809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rbg\" (UniqueName: \"kubernetes.io/projected/fb825343-cab4-4755-9c90-1deb9d43ae1a-kube-api-access-t9rbg\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.675904 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-catalog-content\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.675959 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-utilities\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.676494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-catalog-content\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " 
pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.677160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-utilities\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.715238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rbg\" (UniqueName: \"kubernetes.io/projected/fb825343-cab4-4755-9c90-1deb9d43ae1a-kube-api-access-t9rbg\") pod \"redhat-operators-zh9mk\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") " pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:55 crc kubenswrapper[4786]: I0313 15:33:55.736824 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh9mk" Mar 13 15:33:56 crc kubenswrapper[4786]: I0313 15:33:56.008897 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zh9mk"] Mar 13 15:33:56 crc kubenswrapper[4786]: I0313 15:33:56.553260 4786 generic.go:334] "Generic (PLEG): container finished" podID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerID="bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9" exitCode=0 Mar 13 15:33:56 crc kubenswrapper[4786]: I0313 15:33:56.554586 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:33:56 crc kubenswrapper[4786]: I0313 15:33:56.560345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerDied","Data":"bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9"} Mar 13 15:33:56 crc kubenswrapper[4786]: I0313 15:33:56.560382 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerStarted","Data":"2ef00a77833f2287112b742e043b984a2543aff5b6078aa54f26b31b2505e9d4"} Mar 13 15:33:57 crc kubenswrapper[4786]: I0313 15:33:57.561000 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerStarted","Data":"a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e"} Mar 13 15:33:58 crc kubenswrapper[4786]: I0313 15:33:58.568274 4786 generic.go:334] "Generic (PLEG): container finished" podID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerID="a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e" exitCode=0 Mar 13 15:33:58 crc kubenswrapper[4786]: I0313 15:33:58.568312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerDied","Data":"a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e"} Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.140337 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556934-dvm7v"] Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.141661 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.144608 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.145416 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.146409 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.147962 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556934-dvm7v"]
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.236315 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwtxh\" (UniqueName: \"kubernetes.io/projected/0b5c259e-2813-4051-a355-302bf97cd85e-kube-api-access-pwtxh\") pod \"auto-csr-approver-29556934-dvm7v\" (UID: \"0b5c259e-2813-4051-a355-302bf97cd85e\") " pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.338046 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtxh\" (UniqueName: \"kubernetes.io/projected/0b5c259e-2813-4051-a355-302bf97cd85e-kube-api-access-pwtxh\") pod \"auto-csr-approver-29556934-dvm7v\" (UID: \"0b5c259e-2813-4051-a355-302bf97cd85e\") " pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.368747 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtxh\" (UniqueName: \"kubernetes.io/projected/0b5c259e-2813-4051-a355-302bf97cd85e-kube-api-access-pwtxh\") pod \"auto-csr-approver-29556934-dvm7v\" (UID: \"0b5c259e-2813-4051-a355-302bf97cd85e\") " pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.459447 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.607669 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerStarted","Data":"d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b"}
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.634057 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zh9mk" podStartSLOduration=2.6998813 podStartE2EDuration="5.634039142s" podCreationTimestamp="2026-03-13 15:33:55 +0000 UTC" firstStartedPulling="2026-03-13 15:33:56.554299767 +0000 UTC m=+1866.717511588" lastFinishedPulling="2026-03-13 15:33:59.488457619 +0000 UTC m=+1869.651669430" observedRunningTime="2026-03-13 15:34:00.632574665 +0000 UTC m=+1870.795786476" watchObservedRunningTime="2026-03-13 15:34:00.634039142 +0000 UTC m=+1870.797250963"
Mar 13 15:34:00 crc kubenswrapper[4786]: I0313 15:34:00.763691 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556934-dvm7v"]
Mar 13 15:34:01 crc kubenswrapper[4786]: I0313 15:34:01.618597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556934-dvm7v" event={"ID":"0b5c259e-2813-4051-a355-302bf97cd85e","Type":"ContainerStarted","Data":"0ce0b726407ab7dd0462971c645c4ae3c5aa36b6feac976cf06e766e909c7ba8"}
Mar 13 15:34:02 crc kubenswrapper[4786]: I0313 15:34:02.626038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556934-dvm7v" event={"ID":"0b5c259e-2813-4051-a355-302bf97cd85e","Type":"ContainerStarted","Data":"e9ed7ee176bad5639be759b39583ca2ee6075a120163ba1f1d177e6dbe551b45"}
Mar 13 15:34:02 crc kubenswrapper[4786]: I0313 15:34:02.644964 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556934-dvm7v" podStartSLOduration=1.233553505 podStartE2EDuration="2.644936985s" podCreationTimestamp="2026-03-13 15:34:00 +0000 UTC" firstStartedPulling="2026-03-13 15:34:00.77865922 +0000 UTC m=+1870.941871021" lastFinishedPulling="2026-03-13 15:34:02.19004268 +0000 UTC m=+1872.353254501" observedRunningTime="2026-03-13 15:34:02.637653602 +0000 UTC m=+1872.800865413" watchObservedRunningTime="2026-03-13 15:34:02.644936985 +0000 UTC m=+1872.808148796"
Mar 13 15:34:03 crc kubenswrapper[4786]: I0313 15:34:03.634489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556934-dvm7v" event={"ID":"0b5c259e-2813-4051-a355-302bf97cd85e","Type":"ContainerDied","Data":"e9ed7ee176bad5639be759b39583ca2ee6075a120163ba1f1d177e6dbe551b45"}
Mar 13 15:34:03 crc kubenswrapper[4786]: I0313 15:34:03.634427 4786 generic.go:334] "Generic (PLEG): container finished" podID="0b5c259e-2813-4051-a355-302bf97cd85e" containerID="e9ed7ee176bad5639be759b39583ca2ee6075a120163ba1f1d177e6dbe551b45" exitCode=0
Mar 13 15:34:04 crc kubenswrapper[4786]: I0313 15:34:04.552334 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:34:04 crc kubenswrapper[4786]: E0313 15:34:04.552554 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:34:04 crc kubenswrapper[4786]: I0313 15:34:04.934842 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.011995 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwtxh\" (UniqueName: \"kubernetes.io/projected/0b5c259e-2813-4051-a355-302bf97cd85e-kube-api-access-pwtxh\") pod \"0b5c259e-2813-4051-a355-302bf97cd85e\" (UID: \"0b5c259e-2813-4051-a355-302bf97cd85e\") "
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.019763 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5c259e-2813-4051-a355-302bf97cd85e-kube-api-access-pwtxh" (OuterVolumeSpecName: "kube-api-access-pwtxh") pod "0b5c259e-2813-4051-a355-302bf97cd85e" (UID: "0b5c259e-2813-4051-a355-302bf97cd85e"). InnerVolumeSpecName "kube-api-access-pwtxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.113575 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwtxh\" (UniqueName: \"kubernetes.io/projected/0b5c259e-2813-4051-a355-302bf97cd85e-kube-api-access-pwtxh\") on node \"crc\" DevicePath \"\""
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.653265 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556934-dvm7v" event={"ID":"0b5c259e-2813-4051-a355-302bf97cd85e","Type":"ContainerDied","Data":"0ce0b726407ab7dd0462971c645c4ae3c5aa36b6feac976cf06e766e909c7ba8"}
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.653312 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556934-dvm7v"
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.653325 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ce0b726407ab7dd0462971c645c4ae3c5aa36b6feac976cf06e766e909c7ba8"
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.716411 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-vqdx2"]
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.723552 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-vqdx2"]
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.737269 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zh9mk"
Mar 13 15:34:05 crc kubenswrapper[4786]: I0313 15:34:05.737333 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zh9mk"
Mar 13 15:34:06 crc kubenswrapper[4786]: I0313 15:34:06.560049 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acee06e2-3e35-4f56-8e16-f9dfbac79a59" path="/var/lib/kubelet/pods/acee06e2-3e35-4f56-8e16-f9dfbac79a59/volumes"
Mar 13 15:34:06 crc kubenswrapper[4786]: I0313 15:34:06.782166 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zh9mk" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:34:06 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:34:06 crc kubenswrapper[4786]: >
Mar 13 15:34:10 crc kubenswrapper[4786]: I0313 15:34:10.421874 4786 scope.go:117] "RemoveContainer" containerID="5bd5e837985b7d27d3ae9656f3b436ea8a4e18eb2f4dc859342ba05567ee81f9"
Mar 13 15:34:10 crc kubenswrapper[4786]: I0313 15:34:10.471882 4786 scope.go:117] "RemoveContainer" containerID="ab2535b8e1de5140eefc200ad3169b4b8cccfd02a55308f9e31396c3a4208a9d"
Mar 13 15:34:15 crc kubenswrapper[4786]: I0313 15:34:15.782174 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zh9mk"
Mar 13 15:34:15 crc kubenswrapper[4786]: I0313 15:34:15.826602 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zh9mk"
Mar 13 15:34:16 crc kubenswrapper[4786]: I0313 15:34:16.021224 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zh9mk"]
Mar 13 15:34:17 crc kubenswrapper[4786]: I0313 15:34:17.552021 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:34:17 crc kubenswrapper[4786]: E0313 15:34:17.552667 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:34:17 crc kubenswrapper[4786]: I0313 15:34:17.759631 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zh9mk" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="registry-server" containerID="cri-o://d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b" gracePeriod=2
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.219414 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh9mk"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.323011 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-utilities\") pod \"fb825343-cab4-4755-9c90-1deb9d43ae1a\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") "
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.323099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-catalog-content\") pod \"fb825343-cab4-4755-9c90-1deb9d43ae1a\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") "
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.323170 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rbg\" (UniqueName: \"kubernetes.io/projected/fb825343-cab4-4755-9c90-1deb9d43ae1a-kube-api-access-t9rbg\") pod \"fb825343-cab4-4755-9c90-1deb9d43ae1a\" (UID: \"fb825343-cab4-4755-9c90-1deb9d43ae1a\") "
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.323758 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-utilities" (OuterVolumeSpecName: "utilities") pod "fb825343-cab4-4755-9c90-1deb9d43ae1a" (UID: "fb825343-cab4-4755-9c90-1deb9d43ae1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.327919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb825343-cab4-4755-9c90-1deb9d43ae1a-kube-api-access-t9rbg" (OuterVolumeSpecName: "kube-api-access-t9rbg") pod "fb825343-cab4-4755-9c90-1deb9d43ae1a" (UID: "fb825343-cab4-4755-9c90-1deb9d43ae1a"). InnerVolumeSpecName "kube-api-access-t9rbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.424618 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.424662 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9rbg\" (UniqueName: \"kubernetes.io/projected/fb825343-cab4-4755-9c90-1deb9d43ae1a-kube-api-access-t9rbg\") on node \"crc\" DevicePath \"\""
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.483468 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb825343-cab4-4755-9c90-1deb9d43ae1a" (UID: "fb825343-cab4-4755-9c90-1deb9d43ae1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.525591 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb825343-cab4-4755-9c90-1deb9d43ae1a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.777252 4786 generic.go:334] "Generic (PLEG): container finished" podID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerID="d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b" exitCode=0
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.777308 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerDied","Data":"d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b"}
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.777337 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zh9mk"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.777372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zh9mk" event={"ID":"fb825343-cab4-4755-9c90-1deb9d43ae1a","Type":"ContainerDied","Data":"2ef00a77833f2287112b742e043b984a2543aff5b6078aa54f26b31b2505e9d4"}
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.777406 4786 scope.go:117] "RemoveContainer" containerID="d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.801509 4786 scope.go:117] "RemoveContainer" containerID="a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.802888 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zh9mk"]
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.826392 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zh9mk"]
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.834131 4786 scope.go:117] "RemoveContainer" containerID="bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.852428 4786 scope.go:117] "RemoveContainer" containerID="d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b"
Mar 13 15:34:18 crc kubenswrapper[4786]: E0313 15:34:18.853239 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b\": container with ID starting with d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b not found: ID does not exist" containerID="d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.853270 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b"} err="failed to get container status \"d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b\": rpc error: code = NotFound desc = could not find container \"d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b\": container with ID starting with d874142a8a5586900f5ebd317f21577b7ee52834bb6da3a8ecb798319bec710b not found: ID does not exist"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.853290 4786 scope.go:117] "RemoveContainer" containerID="a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e"
Mar 13 15:34:18 crc kubenswrapper[4786]: E0313 15:34:18.853629 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e\": container with ID starting with a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e not found: ID does not exist" containerID="a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.853661 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e"} err="failed to get container status \"a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e\": rpc error: code = NotFound desc = could not find container \"a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e\": container with ID starting with a5bba610108c59dee2b15005d86a77272ea3c313437b6940017ba67f3b13428e not found: ID does not exist"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.853681 4786 scope.go:117] "RemoveContainer" containerID="bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9"
Mar 13 15:34:18 crc kubenswrapper[4786]: E0313 15:34:18.854021 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9\": container with ID starting with bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9 not found: ID does not exist" containerID="bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9"
Mar 13 15:34:18 crc kubenswrapper[4786]: I0313 15:34:18.854047 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9"} err="failed to get container status \"bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9\": rpc error: code = NotFound desc = could not find container \"bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9\": container with ID starting with bb3504646a1ae2ef2502c3180c87b7aecb11c268a1d2a27509ce69c3b51531f9 not found: ID does not exist"
Mar 13 15:34:20 crc kubenswrapper[4786]: I0313 15:34:20.565838 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" path="/var/lib/kubelet/pods/fb825343-cab4-4755-9c90-1deb9d43ae1a/volumes"
Mar 13 15:34:32 crc kubenswrapper[4786]: I0313 15:34:32.552511 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:34:32 crc kubenswrapper[4786]: E0313 15:34:32.553151 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:34:47 crc kubenswrapper[4786]: I0313 15:34:47.551887 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:34:47 crc kubenswrapper[4786]: E0313 15:34:47.552666 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:34:59 crc kubenswrapper[4786]: I0313 15:34:59.552780 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:34:59 crc kubenswrapper[4786]: E0313 15:34:59.553660 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:35:13 crc kubenswrapper[4786]: I0313 15:35:13.552195 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:35:13 crc kubenswrapper[4786]: E0313 15:35:13.553135 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:35:25 crc kubenswrapper[4786]: I0313 15:35:25.552360 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:35:25 crc kubenswrapper[4786]: E0313 15:35:25.553141 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:35:39 crc kubenswrapper[4786]: I0313 15:35:39.552565 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:35:39 crc kubenswrapper[4786]: E0313 15:35:39.555068 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:35:50 crc kubenswrapper[4786]: I0313 15:35:50.558453 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:35:50 crc kubenswrapper[4786]: E0313 15:35:50.559078 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.141559 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556936-qrj2v"]
Mar 13 15:36:00 crc kubenswrapper[4786]: E0313 15:36:00.142455 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="extract-utilities"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.142470 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="extract-utilities"
Mar 13 15:36:00 crc kubenswrapper[4786]: E0313 15:36:00.142486 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5c259e-2813-4051-a355-302bf97cd85e" containerName="oc"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.142492 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5c259e-2813-4051-a355-302bf97cd85e" containerName="oc"
Mar 13 15:36:00 crc kubenswrapper[4786]: E0313 15:36:00.142505 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="registry-server"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.142513 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="registry-server"
Mar 13 15:36:00 crc kubenswrapper[4786]: E0313 15:36:00.142524 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="extract-content"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.142533 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="extract-content"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.142693 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb825343-cab4-4755-9c90-1deb9d43ae1a" containerName="registry-server"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.142715 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5c259e-2813-4051-a355-302bf97cd85e" containerName="oc"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.143276 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.145019 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.145056 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.145743 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.157015 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556936-qrj2v"]
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.285951 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlszf\" (UniqueName: \"kubernetes.io/projected/ba89a406-7f00-4dce-8542-970bfa870701-kube-api-access-xlszf\") pod \"auto-csr-approver-29556936-qrj2v\" (UID: \"ba89a406-7f00-4dce-8542-970bfa870701\") " pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.387065 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlszf\" (UniqueName: \"kubernetes.io/projected/ba89a406-7f00-4dce-8542-970bfa870701-kube-api-access-xlszf\") pod \"auto-csr-approver-29556936-qrj2v\" (UID: \"ba89a406-7f00-4dce-8542-970bfa870701\") " pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.416599 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlszf\" (UniqueName: \"kubernetes.io/projected/ba89a406-7f00-4dce-8542-970bfa870701-kube-api-access-xlszf\") pod \"auto-csr-approver-29556936-qrj2v\" (UID: \"ba89a406-7f00-4dce-8542-970bfa870701\") " pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.527464 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:00 crc kubenswrapper[4786]: I0313 15:36:00.996251 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556936-qrj2v"]
Mar 13 15:36:01 crc kubenswrapper[4786]: I0313 15:36:01.551696 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:36:01 crc kubenswrapper[4786]: E0313 15:36:01.552191 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:36:01 crc kubenswrapper[4786]: I0313 15:36:01.884673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556936-qrj2v" event={"ID":"ba89a406-7f00-4dce-8542-970bfa870701","Type":"ContainerStarted","Data":"c56040f194f99d61d0b8aee6edc44241073e581a94fe6c52d38a33163ceed8b8"}
Mar 13 15:36:02 crc kubenswrapper[4786]: I0313 15:36:02.895389 4786 generic.go:334] "Generic (PLEG): container finished" podID="ba89a406-7f00-4dce-8542-970bfa870701" containerID="a4128c4f820bf74e1439f3398dd72e967cb858a0f8f97292645d1ea8198b1d67" exitCode=0
Mar 13 15:36:02 crc kubenswrapper[4786]: I0313 15:36:02.895553 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556936-qrj2v" event={"ID":"ba89a406-7f00-4dce-8542-970bfa870701","Type":"ContainerDied","Data":"a4128c4f820bf74e1439f3398dd72e967cb858a0f8f97292645d1ea8198b1d67"}
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.171324 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.243554 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlszf\" (UniqueName: \"kubernetes.io/projected/ba89a406-7f00-4dce-8542-970bfa870701-kube-api-access-xlszf\") pod \"ba89a406-7f00-4dce-8542-970bfa870701\" (UID: \"ba89a406-7f00-4dce-8542-970bfa870701\") "
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.252413 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba89a406-7f00-4dce-8542-970bfa870701-kube-api-access-xlszf" (OuterVolumeSpecName: "kube-api-access-xlszf") pod "ba89a406-7f00-4dce-8542-970bfa870701" (UID: "ba89a406-7f00-4dce-8542-970bfa870701"). InnerVolumeSpecName "kube-api-access-xlszf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.345797 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlszf\" (UniqueName: \"kubernetes.io/projected/ba89a406-7f00-4dce-8542-970bfa870701-kube-api-access-xlszf\") on node \"crc\" DevicePath \"\""
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.922285 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556936-qrj2v" event={"ID":"ba89a406-7f00-4dce-8542-970bfa870701","Type":"ContainerDied","Data":"c56040f194f99d61d0b8aee6edc44241073e581a94fe6c52d38a33163ceed8b8"}
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.922320 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56040f194f99d61d0b8aee6edc44241073e581a94fe6c52d38a33163ceed8b8"
Mar 13 15:36:04 crc kubenswrapper[4786]: I0313 15:36:04.922352 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556936-qrj2v"
Mar 13 15:36:05 crc kubenswrapper[4786]: I0313 15:36:05.241813 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-6f5qf"]
Mar 13 15:36:05 crc kubenswrapper[4786]: I0313 15:36:05.248670 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-6f5qf"]
Mar 13 15:36:06 crc kubenswrapper[4786]: I0313 15:36:06.563425 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be7ee84-5588-4a0c-a24e-51cd4720b224" path="/var/lib/kubelet/pods/8be7ee84-5588-4a0c-a24e-51cd4720b224/volumes"
Mar 13 15:36:10 crc kubenswrapper[4786]: I0313 15:36:10.579120 4786 scope.go:117] "RemoveContainer" containerID="49a8d2fb95f7eabbe42eddaf38be0e77002493823be033535cb22fd0955d85a7"
Mar 13 15:36:15 crc kubenswrapper[4786]: I0313 15:36:15.552472 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:36:15 crc kubenswrapper[4786]: E0313 15:36:15.553471 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:36:29 crc kubenswrapper[4786]: I0313 15:36:29.552634 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:36:29 crc kubenswrapper[4786]: E0313 15:36:29.554154 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:36:42 crc kubenswrapper[4786]: I0313 15:36:42.552063 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:36:42 crc kubenswrapper[4786]: E0313 15:36:42.553302 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:36:54 crc kubenswrapper[4786]: I0313 15:36:54.555162 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:36:54 crc kubenswrapper[4786]: E0313 15:36:54.556396 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:37:08 crc kubenswrapper[4786]: I0313 15:37:08.552892 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a"
Mar 13 15:37:09 crc kubenswrapper[4786]: I0313 15:37:09.478314 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"4dcb2153a0a137b6fa9bb9053f208afcffda7d4a027ef34039a1c015892bfc5a"}
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.170468 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556938-5gb89"]
Mar 13 15:38:00 crc kubenswrapper[4786]: E0313 15:38:00.171581 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba89a406-7f00-4dce-8542-970bfa870701" containerName="oc"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.171605 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba89a406-7f00-4dce-8542-970bfa870701" containerName="oc"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.171925 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba89a406-7f00-4dce-8542-970bfa870701" containerName="oc"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.172901 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556938-5gb89"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.175933 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.176621 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.177380 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.180749 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556938-5gb89"]
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.222837 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4xt\" (UniqueName: \"kubernetes.io/projected/59c1fc31-e84a-4e8b-b648-6fd8a29d484a-kube-api-access-nn4xt\") pod \"auto-csr-approver-29556938-5gb89\" (UID: \"59c1fc31-e84a-4e8b-b648-6fd8a29d484a\") " pod="openshift-infra/auto-csr-approver-29556938-5gb89"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.324437 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4xt\" (UniqueName: \"kubernetes.io/projected/59c1fc31-e84a-4e8b-b648-6fd8a29d484a-kube-api-access-nn4xt\") pod \"auto-csr-approver-29556938-5gb89\" (UID: \"59c1fc31-e84a-4e8b-b648-6fd8a29d484a\") " pod="openshift-infra/auto-csr-approver-29556938-5gb89"
Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.347128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4xt\" (UniqueName: \"kubernetes.io/projected/59c1fc31-e84a-4e8b-b648-6fd8a29d484a-kube-api-access-nn4xt\") pod \"auto-csr-approver-29556938-5gb89\" (UID: \"59c1fc31-e84a-4e8b-b648-6fd8a29d484a\") " 
pod="openshift-infra/auto-csr-approver-29556938-5gb89" Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.516297 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556938-5gb89" Mar 13 15:38:00 crc kubenswrapper[4786]: I0313 15:38:00.952492 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556938-5gb89"] Mar 13 15:38:01 crc kubenswrapper[4786]: I0313 15:38:01.911504 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556938-5gb89" event={"ID":"59c1fc31-e84a-4e8b-b648-6fd8a29d484a","Type":"ContainerStarted","Data":"934d9a00b73c8d41f5a486cf11a3a69a4afbb6671bc74c23d9d5cd2f81808ffe"} Mar 13 15:38:02 crc kubenswrapper[4786]: I0313 15:38:02.921556 4786 generic.go:334] "Generic (PLEG): container finished" podID="59c1fc31-e84a-4e8b-b648-6fd8a29d484a" containerID="e878ba75f14e4ce661440f103954367cee974526e554cf20874c5e95e1488473" exitCode=0 Mar 13 15:38:02 crc kubenswrapper[4786]: I0313 15:38:02.921634 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556938-5gb89" event={"ID":"59c1fc31-e84a-4e8b-b648-6fd8a29d484a","Type":"ContainerDied","Data":"e878ba75f14e4ce661440f103954367cee974526e554cf20874c5e95e1488473"} Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.239924 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556938-5gb89" Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.332833 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4xt\" (UniqueName: \"kubernetes.io/projected/59c1fc31-e84a-4e8b-b648-6fd8a29d484a-kube-api-access-nn4xt\") pod \"59c1fc31-e84a-4e8b-b648-6fd8a29d484a\" (UID: \"59c1fc31-e84a-4e8b-b648-6fd8a29d484a\") " Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.349096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c1fc31-e84a-4e8b-b648-6fd8a29d484a-kube-api-access-nn4xt" (OuterVolumeSpecName: "kube-api-access-nn4xt") pod "59c1fc31-e84a-4e8b-b648-6fd8a29d484a" (UID: "59c1fc31-e84a-4e8b-b648-6fd8a29d484a"). InnerVolumeSpecName "kube-api-access-nn4xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.434546 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn4xt\" (UniqueName: \"kubernetes.io/projected/59c1fc31-e84a-4e8b-b648-6fd8a29d484a-kube-api-access-nn4xt\") on node \"crc\" DevicePath \"\"" Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.942432 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556938-5gb89" event={"ID":"59c1fc31-e84a-4e8b-b648-6fd8a29d484a","Type":"ContainerDied","Data":"934d9a00b73c8d41f5a486cf11a3a69a4afbb6671bc74c23d9d5cd2f81808ffe"} Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.942491 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556938-5gb89" Mar 13 15:38:04 crc kubenswrapper[4786]: I0313 15:38:04.942517 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934d9a00b73c8d41f5a486cf11a3a69a4afbb6671bc74c23d9d5cd2f81808ffe" Mar 13 15:38:05 crc kubenswrapper[4786]: I0313 15:38:05.301004 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-97z5t"] Mar 13 15:38:05 crc kubenswrapper[4786]: I0313 15:38:05.305827 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-97z5t"] Mar 13 15:38:06 crc kubenswrapper[4786]: I0313 15:38:06.563609 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44dcb8ac-ebda-413e-b1ea-2a118379f8f7" path="/var/lib/kubelet/pods/44dcb8ac-ebda-413e-b1ea-2a118379f8f7/volumes" Mar 13 15:38:10 crc kubenswrapper[4786]: I0313 15:38:10.666892 4786 scope.go:117] "RemoveContainer" containerID="970c749725ea3a162d26b0176de43baa644794ad8505dfabc493bf34b9e2809a" Mar 13 15:39:37 crc kubenswrapper[4786]: I0313 15:39:37.869036 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:39:37 crc kubenswrapper[4786]: I0313 15:39:37.869795 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.164436 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556940-j5txx"] Mar 13 
15:40:00 crc kubenswrapper[4786]: E0313 15:40:00.165380 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c1fc31-e84a-4e8b-b648-6fd8a29d484a" containerName="oc" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.165396 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c1fc31-e84a-4e8b-b648-6fd8a29d484a" containerName="oc" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.165541 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c1fc31-e84a-4e8b-b648-6fd8a29d484a" containerName="oc" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.166108 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.168705 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.168769 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.169323 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.200395 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556940-j5txx"] Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.267611 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5gvv\" (UniqueName: \"kubernetes.io/projected/85aea1da-1d6b-4f0f-acca-87584b9df30f-kube-api-access-q5gvv\") pod \"auto-csr-approver-29556940-j5txx\" (UID: \"85aea1da-1d6b-4f0f-acca-87584b9df30f\") " pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.369351 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q5gvv\" (UniqueName: \"kubernetes.io/projected/85aea1da-1d6b-4f0f-acca-87584b9df30f-kube-api-access-q5gvv\") pod \"auto-csr-approver-29556940-j5txx\" (UID: \"85aea1da-1d6b-4f0f-acca-87584b9df30f\") " pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.394463 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5gvv\" (UniqueName: \"kubernetes.io/projected/85aea1da-1d6b-4f0f-acca-87584b9df30f-kube-api-access-q5gvv\") pod \"auto-csr-approver-29556940-j5txx\" (UID: \"85aea1da-1d6b-4f0f-acca-87584b9df30f\") " pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.483106 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:00 crc kubenswrapper[4786]: W0313 15:40:00.932610 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85aea1da_1d6b_4f0f_acca_87584b9df30f.slice/crio-15aaf537274af8c383472da2887ee115a45608c50ff8a20df73c3cdb7b1ab1e5 WatchSource:0}: Error finding container 15aaf537274af8c383472da2887ee115a45608c50ff8a20df73c3cdb7b1ab1e5: Status 404 returned error can't find the container with id 15aaf537274af8c383472da2887ee115a45608c50ff8a20df73c3cdb7b1ab1e5 Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.936271 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556940-j5txx"] Mar 13 15:40:00 crc kubenswrapper[4786]: I0313 15:40:00.939775 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:40:01 crc kubenswrapper[4786]: I0313 15:40:01.877034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556940-j5txx" 
event={"ID":"85aea1da-1d6b-4f0f-acca-87584b9df30f","Type":"ContainerStarted","Data":"15aaf537274af8c383472da2887ee115a45608c50ff8a20df73c3cdb7b1ab1e5"} Mar 13 15:40:02 crc kubenswrapper[4786]: I0313 15:40:02.887281 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556940-j5txx" event={"ID":"85aea1da-1d6b-4f0f-acca-87584b9df30f","Type":"ContainerStarted","Data":"ed0f43c7bc55f3a087e49e488a09b7271eeda17debd76bf265e2de3a73ebd9e8"} Mar 13 15:40:02 crc kubenswrapper[4786]: I0313 15:40:02.906849 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556940-j5txx" podStartSLOduration=1.464828714 podStartE2EDuration="2.906826018s" podCreationTimestamp="2026-03-13 15:40:00 +0000 UTC" firstStartedPulling="2026-03-13 15:40:00.93903441 +0000 UTC m=+2231.102246221" lastFinishedPulling="2026-03-13 15:40:02.381031714 +0000 UTC m=+2232.544243525" observedRunningTime="2026-03-13 15:40:02.904295575 +0000 UTC m=+2233.067507416" watchObservedRunningTime="2026-03-13 15:40:02.906826018 +0000 UTC m=+2233.070037829" Mar 13 15:40:03 crc kubenswrapper[4786]: I0313 15:40:03.896796 4786 generic.go:334] "Generic (PLEG): container finished" podID="85aea1da-1d6b-4f0f-acca-87584b9df30f" containerID="ed0f43c7bc55f3a087e49e488a09b7271eeda17debd76bf265e2de3a73ebd9e8" exitCode=0 Mar 13 15:40:03 crc kubenswrapper[4786]: I0313 15:40:03.897172 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556940-j5txx" event={"ID":"85aea1da-1d6b-4f0f-acca-87584b9df30f","Type":"ContainerDied","Data":"ed0f43c7bc55f3a087e49e488a09b7271eeda17debd76bf265e2de3a73ebd9e8"} Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.252968 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.361762 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5gvv\" (UniqueName: \"kubernetes.io/projected/85aea1da-1d6b-4f0f-acca-87584b9df30f-kube-api-access-q5gvv\") pod \"85aea1da-1d6b-4f0f-acca-87584b9df30f\" (UID: \"85aea1da-1d6b-4f0f-acca-87584b9df30f\") " Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.370048 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85aea1da-1d6b-4f0f-acca-87584b9df30f-kube-api-access-q5gvv" (OuterVolumeSpecName: "kube-api-access-q5gvv") pod "85aea1da-1d6b-4f0f-acca-87584b9df30f" (UID: "85aea1da-1d6b-4f0f-acca-87584b9df30f"). InnerVolumeSpecName "kube-api-access-q5gvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.463557 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5gvv\" (UniqueName: \"kubernetes.io/projected/85aea1da-1d6b-4f0f-acca-87584b9df30f-kube-api-access-q5gvv\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.915593 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556940-j5txx" Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.915524 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556940-j5txx" event={"ID":"85aea1da-1d6b-4f0f-acca-87584b9df30f","Type":"ContainerDied","Data":"15aaf537274af8c383472da2887ee115a45608c50ff8a20df73c3cdb7b1ab1e5"} Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.915728 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15aaf537274af8c383472da2887ee115a45608c50ff8a20df73c3cdb7b1ab1e5" Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.986727 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556934-dvm7v"] Mar 13 15:40:05 crc kubenswrapper[4786]: I0313 15:40:05.992378 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556934-dvm7v"] Mar 13 15:40:06 crc kubenswrapper[4786]: I0313 15:40:06.563689 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5c259e-2813-4051-a355-302bf97cd85e" path="/var/lib/kubelet/pods/0b5c259e-2813-4051-a355-302bf97cd85e/volumes" Mar 13 15:40:07 crc kubenswrapper[4786]: I0313 15:40:07.869170 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:40:07 crc kubenswrapper[4786]: I0313 15:40:07.869650 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:40:10 crc 
kubenswrapper[4786]: I0313 15:40:10.754160 4786 scope.go:117] "RemoveContainer" containerID="e9ed7ee176bad5639be759b39583ca2ee6075a120163ba1f1d177e6dbe551b45" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.396817 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vzpc"] Mar 13 15:40:26 crc kubenswrapper[4786]: E0313 15:40:26.398218 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85aea1da-1d6b-4f0f-acca-87584b9df30f" containerName="oc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.398274 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="85aea1da-1d6b-4f0f-acca-87584b9df30f" containerName="oc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.398627 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="85aea1da-1d6b-4f0f-acca-87584b9df30f" containerName="oc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.402372 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.412209 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vzpc"] Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.485243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-utilities\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.485318 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqwv\" (UniqueName: \"kubernetes.io/projected/fd7c4468-cb26-4a4b-9c50-59c542d3774f-kube-api-access-rcqwv\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.485376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-catalog-content\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.586812 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-catalog-content\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.586943 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-utilities\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.586988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqwv\" (UniqueName: \"kubernetes.io/projected/fd7c4468-cb26-4a4b-9c50-59c542d3774f-kube-api-access-rcqwv\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.587381 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-catalog-content\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.587582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-utilities\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.614575 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqwv\" (UniqueName: \"kubernetes.io/projected/fd7c4468-cb26-4a4b-9c50-59c542d3774f-kube-api-access-rcqwv\") pod \"community-operators-5vzpc\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:26 crc kubenswrapper[4786]: I0313 15:40:26.734054 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.292097 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vzpc"] Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.382266 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4w4dn"] Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.383883 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.396424 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4w4dn"] Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.497981 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-catalog-content\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.498104 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-utilities\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.498144 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjlz\" (UniqueName: \"kubernetes.io/projected/39367cf3-5f72-4242-9fda-53a2cc9ba630-kube-api-access-jkjlz\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " 
pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.599977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-utilities\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.600033 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjlz\" (UniqueName: \"kubernetes.io/projected/39367cf3-5f72-4242-9fda-53a2cc9ba630-kube-api-access-jkjlz\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.600077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-catalog-content\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.600479 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-catalog-content\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.600558 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-utilities\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" 
Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.627805 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjlz\" (UniqueName: \"kubernetes.io/projected/39367cf3-5f72-4242-9fda-53a2cc9ba630-kube-api-access-jkjlz\") pod \"redhat-marketplace-4w4dn\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:27 crc kubenswrapper[4786]: I0313 15:40:27.723514 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:28 crc kubenswrapper[4786]: I0313 15:40:28.096140 4786 generic.go:334] "Generic (PLEG): container finished" podID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerID="7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40" exitCode=0 Mar 13 15:40:28 crc kubenswrapper[4786]: I0313 15:40:28.096200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vzpc" event={"ID":"fd7c4468-cb26-4a4b-9c50-59c542d3774f","Type":"ContainerDied","Data":"7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40"} Mar 13 15:40:28 crc kubenswrapper[4786]: I0313 15:40:28.096439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vzpc" event={"ID":"fd7c4468-cb26-4a4b-9c50-59c542d3774f","Type":"ContainerStarted","Data":"3af4985820355d8aa234e492e6b2a7716e41171cc6f2fccfa3f02c5755c18e4e"} Mar 13 15:40:28 crc kubenswrapper[4786]: W0313 15:40:28.213432 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39367cf3_5f72_4242_9fda_53a2cc9ba630.slice/crio-bb97ef5f93d22e119f8b316ba76ec835de03a61d74717d580191a9133923bd69 WatchSource:0}: Error finding container bb97ef5f93d22e119f8b316ba76ec835de03a61d74717d580191a9133923bd69: Status 404 returned error can't find the container with id 
bb97ef5f93d22e119f8b316ba76ec835de03a61d74717d580191a9133923bd69 Mar 13 15:40:28 crc kubenswrapper[4786]: I0313 15:40:28.219637 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4w4dn"] Mar 13 15:40:29 crc kubenswrapper[4786]: I0313 15:40:29.111283 4786 generic.go:334] "Generic (PLEG): container finished" podID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerID="70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7" exitCode=0 Mar 13 15:40:29 crc kubenswrapper[4786]: I0313 15:40:29.111326 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerDied","Data":"70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7"} Mar 13 15:40:29 crc kubenswrapper[4786]: I0313 15:40:29.111675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerStarted","Data":"bb97ef5f93d22e119f8b316ba76ec835de03a61d74717d580191a9133923bd69"} Mar 13 15:40:29 crc kubenswrapper[4786]: E0313 15:40:29.504190 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/ef/ef9efff70532a11721d59df306a3f837de2bdbe08020b569b374744e739489a0?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T154028Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b1ae7751db8840a74082c35e0d7894be997c5f9f5d8936af5e258ccb52710d5e&region=us-east-1&namespace=redhat&username=redhat+registry_proxy&repo_name=redhat----community-operator-index&akamai_signature=exp=1773417328~hmac=0c7cf137ed8a43a8775dd578a3310fc132cfb51c3fe1838b839dc78c91328cad\": remote error: tls: internal error" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 15:40:29 crc kubenswrapper[4786]: E0313 15:40:29.504338 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rcqwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5vzpc_openshift-marketplace(fd7c4468-cb26-4a4b-9c50-59c542d3774f): ErrImagePull: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/ef/ef9efff70532a11721d59df306a3f837de2bdbe08020b569b374744e739489a0?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T154028Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b1ae7751db8840a74082c35e0d7894be997c5f9f5d8936af5e258ccb52710d5e&region=us-east-1&namespace=redhat&username=redhat+registry_proxy&repo_name=redhat----community-operator-index&akamai_signature=exp=1773417328~hmac=0c7cf137ed8a43a8775dd578a3310fc132cfb51c3fe1838b839dc78c91328cad\": remote error: tls: internal error" logger="UnhandledError" Mar 13 15:40:29 crc kubenswrapper[4786]: E0313 15:40:29.505567 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/ef/ef9efff70532a11721d59df306a3f837de2bdbe08020b569b374744e739489a0?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T154028Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b1ae7751db8840a74082c35e0d7894be997c5f9f5d8936af5e258ccb52710d5e&region=us-east-1&namespace=redhat&username=redhat+registry_proxy&repo_name=redhat----community-operator-index&akamai_signature=exp=1773417328~hmac=0c7cf137ed8a43a8775dd578a3310fc132cfb51c3fe1838b839dc78c91328cad\\\": remote error: tls: internal error\"" pod="openshift-marketplace/community-operators-5vzpc" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" Mar 13 15:40:30 crc kubenswrapper[4786]: I0313 15:40:30.125736 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerStarted","Data":"c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196"} Mar 13 
15:40:30 crc kubenswrapper[4786]: E0313 15:40:30.127766 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5vzpc" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" Mar 13 15:40:31 crc kubenswrapper[4786]: I0313 15:40:31.136296 4786 generic.go:334] "Generic (PLEG): container finished" podID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerID="c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196" exitCode=0 Mar 13 15:40:31 crc kubenswrapper[4786]: I0313 15:40:31.136337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerDied","Data":"c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196"} Mar 13 15:40:32 crc kubenswrapper[4786]: I0313 15:40:32.146742 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerStarted","Data":"ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5"} Mar 13 15:40:32 crc kubenswrapper[4786]: I0313 15:40:32.174664 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4w4dn" podStartSLOduration=2.698760729 podStartE2EDuration="5.174639897s" podCreationTimestamp="2026-03-13 15:40:27 +0000 UTC" firstStartedPulling="2026-03-13 15:40:29.113813266 +0000 UTC m=+2259.277025117" lastFinishedPulling="2026-03-13 15:40:31.589692434 +0000 UTC m=+2261.752904285" observedRunningTime="2026-03-13 15:40:32.165493517 +0000 UTC m=+2262.328705328" watchObservedRunningTime="2026-03-13 15:40:32.174639897 +0000 UTC m=+2262.337851708" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.724311 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.725217 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.779010 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.868266 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.868354 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.868419 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.869322 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dcb2153a0a137b6fa9bb9053f208afcffda7d4a027ef34039a1c015892bfc5a"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:40:37 crc kubenswrapper[4786]: I0313 15:40:37.869432 4786 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://4dcb2153a0a137b6fa9bb9053f208afcffda7d4a027ef34039a1c015892bfc5a" gracePeriod=600 Mar 13 15:40:38 crc kubenswrapper[4786]: I0313 15:40:38.207422 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="4dcb2153a0a137b6fa9bb9053f208afcffda7d4a027ef34039a1c015892bfc5a" exitCode=0 Mar 13 15:40:38 crc kubenswrapper[4786]: I0313 15:40:38.207488 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"4dcb2153a0a137b6fa9bb9053f208afcffda7d4a027ef34039a1c015892bfc5a"} Mar 13 15:40:38 crc kubenswrapper[4786]: I0313 15:40:38.207816 4786 scope.go:117] "RemoveContainer" containerID="3f2bc051cd6d795f2bc3eaa3f0293e0bb8529619bf5f4dc44025170a2befe51a" Mar 13 15:40:38 crc kubenswrapper[4786]: I0313 15:40:38.260562 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:38 crc kubenswrapper[4786]: I0313 15:40:38.301702 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4w4dn"] Mar 13 15:40:39 crc kubenswrapper[4786]: I0313 15:40:39.221268 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec"} Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.230913 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4w4dn" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" 
containerName="registry-server" containerID="cri-o://ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5" gracePeriod=2 Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.687603 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.836886 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-utilities\") pod \"39367cf3-5f72-4242-9fda-53a2cc9ba630\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.837065 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-catalog-content\") pod \"39367cf3-5f72-4242-9fda-53a2cc9ba630\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.837104 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkjlz\" (UniqueName: \"kubernetes.io/projected/39367cf3-5f72-4242-9fda-53a2cc9ba630-kube-api-access-jkjlz\") pod \"39367cf3-5f72-4242-9fda-53a2cc9ba630\" (UID: \"39367cf3-5f72-4242-9fda-53a2cc9ba630\") " Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.838303 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-utilities" (OuterVolumeSpecName: "utilities") pod "39367cf3-5f72-4242-9fda-53a2cc9ba630" (UID: "39367cf3-5f72-4242-9fda-53a2cc9ba630"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.843544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39367cf3-5f72-4242-9fda-53a2cc9ba630-kube-api-access-jkjlz" (OuterVolumeSpecName: "kube-api-access-jkjlz") pod "39367cf3-5f72-4242-9fda-53a2cc9ba630" (UID: "39367cf3-5f72-4242-9fda-53a2cc9ba630"). InnerVolumeSpecName "kube-api-access-jkjlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.871490 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39367cf3-5f72-4242-9fda-53a2cc9ba630" (UID: "39367cf3-5f72-4242-9fda-53a2cc9ba630"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.938506 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.938572 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39367cf3-5f72-4242-9fda-53a2cc9ba630-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:40 crc kubenswrapper[4786]: I0313 15:40:40.938587 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkjlz\" (UniqueName: \"kubernetes.io/projected/39367cf3-5f72-4242-9fda-53a2cc9ba630-kube-api-access-jkjlz\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.243711 4786 generic.go:334] "Generic (PLEG): container finished" podID="39367cf3-5f72-4242-9fda-53a2cc9ba630" 
containerID="ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5" exitCode=0 Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.243775 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4w4dn" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.243798 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerDied","Data":"ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5"} Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.244382 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4w4dn" event={"ID":"39367cf3-5f72-4242-9fda-53a2cc9ba630","Type":"ContainerDied","Data":"bb97ef5f93d22e119f8b316ba76ec835de03a61d74717d580191a9133923bd69"} Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.244421 4786 scope.go:117] "RemoveContainer" containerID="ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.295753 4786 scope.go:117] "RemoveContainer" containerID="c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.301926 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4w4dn"] Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.310524 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4w4dn"] Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.322304 4786 scope.go:117] "RemoveContainer" containerID="70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.359131 4786 scope.go:117] "RemoveContainer" containerID="ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5" Mar 13 
15:40:41 crc kubenswrapper[4786]: E0313 15:40:41.359831 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5\": container with ID starting with ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5 not found: ID does not exist" containerID="ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.359905 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5"} err="failed to get container status \"ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5\": rpc error: code = NotFound desc = could not find container \"ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5\": container with ID starting with ee59b633bf6155a1cf452bfcac5d3bb5df84a8fedbed7b1c55a17540f3ffc8c5 not found: ID does not exist" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.359938 4786 scope.go:117] "RemoveContainer" containerID="c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196" Mar 13 15:40:41 crc kubenswrapper[4786]: E0313 15:40:41.360655 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196\": container with ID starting with c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196 not found: ID does not exist" containerID="c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.360702 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196"} err="failed to get container status 
\"c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196\": rpc error: code = NotFound desc = could not find container \"c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196\": container with ID starting with c11de80d2b8b7b36697df87cd5d3306f416e8ca3e1a05ed9805d84df5851c196 not found: ID does not exist" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.360735 4786 scope.go:117] "RemoveContainer" containerID="70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7" Mar 13 15:40:41 crc kubenswrapper[4786]: E0313 15:40:41.361108 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7\": container with ID starting with 70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7 not found: ID does not exist" containerID="70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7" Mar 13 15:40:41 crc kubenswrapper[4786]: I0313 15:40:41.361147 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7"} err="failed to get container status \"70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7\": rpc error: code = NotFound desc = could not find container \"70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7\": container with ID starting with 70bfea75b834333261309fe7850f42f87da6f242a622276605dee4c1bdfcc9f7 not found: ID does not exist" Mar 13 15:40:42 crc kubenswrapper[4786]: I0313 15:40:42.561098 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" path="/var/lib/kubelet/pods/39367cf3-5f72-4242-9fda-53a2cc9ba630/volumes" Mar 13 15:40:45 crc kubenswrapper[4786]: I0313 15:40:45.281869 4786 generic.go:334] "Generic (PLEG): container finished" podID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" 
containerID="2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8" exitCode=0 Mar 13 15:40:45 crc kubenswrapper[4786]: I0313 15:40:45.281882 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vzpc" event={"ID":"fd7c4468-cb26-4a4b-9c50-59c542d3774f","Type":"ContainerDied","Data":"2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8"} Mar 13 15:40:46 crc kubenswrapper[4786]: I0313 15:40:46.299491 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vzpc" event={"ID":"fd7c4468-cb26-4a4b-9c50-59c542d3774f","Type":"ContainerStarted","Data":"f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c"} Mar 13 15:40:46 crc kubenswrapper[4786]: I0313 15:40:46.338162 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5vzpc" podStartSLOduration=2.373934123 podStartE2EDuration="20.338140961s" podCreationTimestamp="2026-03-13 15:40:26 +0000 UTC" firstStartedPulling="2026-03-13 15:40:28.098203163 +0000 UTC m=+2258.261414974" lastFinishedPulling="2026-03-13 15:40:46.062409961 +0000 UTC m=+2276.225621812" observedRunningTime="2026-03-13 15:40:46.335479844 +0000 UTC m=+2276.498691695" watchObservedRunningTime="2026-03-13 15:40:46.338140961 +0000 UTC m=+2276.501352812" Mar 13 15:40:46 crc kubenswrapper[4786]: I0313 15:40:46.734732 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:46 crc kubenswrapper[4786]: I0313 15:40:46.734794 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:47 crc kubenswrapper[4786]: I0313 15:40:47.791493 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5vzpc" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="registry-server" 
probeResult="failure" output=< Mar 13 15:40:47 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 15:40:47 crc kubenswrapper[4786]: > Mar 13 15:40:56 crc kubenswrapper[4786]: I0313 15:40:56.811444 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:56 crc kubenswrapper[4786]: I0313 15:40:56.867743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:57 crc kubenswrapper[4786]: I0313 15:40:57.800525 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vzpc"] Mar 13 15:40:58 crc kubenswrapper[4786]: I0313 15:40:58.403753 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vzpc" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="registry-server" containerID="cri-o://f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c" gracePeriod=2 Mar 13 15:40:58 crc kubenswrapper[4786]: I0313 15:40:58.896675 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.043327 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-utilities\") pod \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.043519 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-catalog-content\") pod \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.043569 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcqwv\" (UniqueName: \"kubernetes.io/projected/fd7c4468-cb26-4a4b-9c50-59c542d3774f-kube-api-access-rcqwv\") pod \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\" (UID: \"fd7c4468-cb26-4a4b-9c50-59c542d3774f\") " Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.044208 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-utilities" (OuterVolumeSpecName: "utilities") pod "fd7c4468-cb26-4a4b-9c50-59c542d3774f" (UID: "fd7c4468-cb26-4a4b-9c50-59c542d3774f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.050341 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7c4468-cb26-4a4b-9c50-59c542d3774f-kube-api-access-rcqwv" (OuterVolumeSpecName: "kube-api-access-rcqwv") pod "fd7c4468-cb26-4a4b-9c50-59c542d3774f" (UID: "fd7c4468-cb26-4a4b-9c50-59c542d3774f"). InnerVolumeSpecName "kube-api-access-rcqwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.121924 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd7c4468-cb26-4a4b-9c50-59c542d3774f" (UID: "fd7c4468-cb26-4a4b-9c50-59c542d3774f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.145496 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.145743 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7c4468-cb26-4a4b-9c50-59c542d3774f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.145875 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcqwv\" (UniqueName: \"kubernetes.io/projected/fd7c4468-cb26-4a4b-9c50-59c542d3774f-kube-api-access-rcqwv\") on node \"crc\" DevicePath \"\"" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.417383 4786 generic.go:334] "Generic (PLEG): container finished" podID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerID="f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c" exitCode=0 Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.417474 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vzpc" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.417514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vzpc" event={"ID":"fd7c4468-cb26-4a4b-9c50-59c542d3774f","Type":"ContainerDied","Data":"f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c"} Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.418260 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vzpc" event={"ID":"fd7c4468-cb26-4a4b-9c50-59c542d3774f","Type":"ContainerDied","Data":"3af4985820355d8aa234e492e6b2a7716e41171cc6f2fccfa3f02c5755c18e4e"} Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.418302 4786 scope.go:117] "RemoveContainer" containerID="f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.444987 4786 scope.go:117] "RemoveContainer" containerID="2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.475689 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vzpc"] Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.481897 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vzpc"] Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.496633 4786 scope.go:117] "RemoveContainer" containerID="7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.523171 4786 scope.go:117] "RemoveContainer" containerID="f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c" Mar 13 15:40:59 crc kubenswrapper[4786]: E0313 15:40:59.523650 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c\": container with ID starting with f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c not found: ID does not exist" containerID="f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.523690 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c"} err="failed to get container status \"f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c\": rpc error: code = NotFound desc = could not find container \"f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c\": container with ID starting with f1066c61525ed2df12caa97a8436f57a6f2cde591a0a3468163b10030597995c not found: ID does not exist" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.523715 4786 scope.go:117] "RemoveContainer" containerID="2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8" Mar 13 15:40:59 crc kubenswrapper[4786]: E0313 15:40:59.524076 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8\": container with ID starting with 2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8 not found: ID does not exist" containerID="2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.524101 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8"} err="failed to get container status \"2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8\": rpc error: code = NotFound desc = could not find container \"2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8\": container with ID 
starting with 2252f851b12496912f832ef81e79d93e75b42284c65516b0b4c3f3929122bed8 not found: ID does not exist" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.524118 4786 scope.go:117] "RemoveContainer" containerID="7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40" Mar 13 15:40:59 crc kubenswrapper[4786]: E0313 15:40:59.524362 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40\": container with ID starting with 7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40 not found: ID does not exist" containerID="7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40" Mar 13 15:40:59 crc kubenswrapper[4786]: I0313 15:40:59.524385 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40"} err="failed to get container status \"7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40\": rpc error: code = NotFound desc = could not find container \"7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40\": container with ID starting with 7f09b543bac8d315d607713fd375940170f2d807e859088c5f8fda12a5897b40 not found: ID does not exist" Mar 13 15:41:00 crc kubenswrapper[4786]: I0313 15:41:00.567893 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" path="/var/lib/kubelet/pods/fd7c4468-cb26-4a4b-9c50-59c542d3774f/volumes" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.677126 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vbd9w"] Mar 13 15:41:23 crc kubenswrapper[4786]: E0313 15:41:23.677958 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="registry-server" Mar 13 15:41:23 crc 
kubenswrapper[4786]: I0313 15:41:23.677971 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="registry-server" Mar 13 15:41:23 crc kubenswrapper[4786]: E0313 15:41:23.677987 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="extract-content" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.677993 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="extract-content" Mar 13 15:41:23 crc kubenswrapper[4786]: E0313 15:41:23.678001 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="extract-utilities" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.678009 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="extract-utilities" Mar 13 15:41:23 crc kubenswrapper[4786]: E0313 15:41:23.678024 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="registry-server" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.678030 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="registry-server" Mar 13 15:41:23 crc kubenswrapper[4786]: E0313 15:41:23.678042 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="extract-content" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.678048 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="extract-content" Mar 13 15:41:23 crc kubenswrapper[4786]: E0313 15:41:23.678057 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="extract-utilities" Mar 13 15:41:23 crc 
kubenswrapper[4786]: I0313 15:41:23.678062 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="extract-utilities" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.678210 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="39367cf3-5f72-4242-9fda-53a2cc9ba630" containerName="registry-server" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.678226 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7c4468-cb26-4a4b-9c50-59c542d3774f" containerName="registry-server" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.679793 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.691337 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vbd9w"] Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.867446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-utilities\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.867517 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-catalog-content\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.867539 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tctl\" (UniqueName: 
\"kubernetes.io/projected/bfaabc8e-5856-4beb-a3c9-08223326ef06-kube-api-access-9tctl\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.969119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-catalog-content\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.969225 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tctl\" (UniqueName: \"kubernetes.io/projected/bfaabc8e-5856-4beb-a3c9-08223326ef06-kube-api-access-9tctl\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.969383 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-utilities\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.969743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-catalog-content\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.969958 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-utilities\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:23 crc kubenswrapper[4786]: I0313 15:41:23.998162 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tctl\" (UniqueName: \"kubernetes.io/projected/bfaabc8e-5856-4beb-a3c9-08223326ef06-kube-api-access-9tctl\") pod \"certified-operators-vbd9w\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:24 crc kubenswrapper[4786]: I0313 15:41:24.001546 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:24 crc kubenswrapper[4786]: I0313 15:41:24.481611 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vbd9w"] Mar 13 15:41:24 crc kubenswrapper[4786]: W0313 15:41:24.505115 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfaabc8e_5856_4beb_a3c9_08223326ef06.slice/crio-aa4a81aea66a4529a033ba112985459bf9a0036743bc023f21adfa1487a1c873 WatchSource:0}: Error finding container aa4a81aea66a4529a033ba112985459bf9a0036743bc023f21adfa1487a1c873: Status 404 returned error can't find the container with id aa4a81aea66a4529a033ba112985459bf9a0036743bc023f21adfa1487a1c873 Mar 13 15:41:24 crc kubenswrapper[4786]: I0313 15:41:24.645537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerStarted","Data":"aa4a81aea66a4529a033ba112985459bf9a0036743bc023f21adfa1487a1c873"} Mar 13 15:41:25 crc kubenswrapper[4786]: I0313 15:41:25.654135 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerID="5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64" exitCode=0 Mar 13 15:41:25 crc kubenswrapper[4786]: I0313 15:41:25.654191 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerDied","Data":"5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64"} Mar 13 15:41:26 crc kubenswrapper[4786]: I0313 15:41:26.662323 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerStarted","Data":"fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a"} Mar 13 15:41:27 crc kubenswrapper[4786]: I0313 15:41:27.673655 4786 generic.go:334] "Generic (PLEG): container finished" podID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerID="fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a" exitCode=0 Mar 13 15:41:27 crc kubenswrapper[4786]: I0313 15:41:27.673713 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerDied","Data":"fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a"} Mar 13 15:41:28 crc kubenswrapper[4786]: I0313 15:41:28.686660 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerStarted","Data":"76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3"} Mar 13 15:41:28 crc kubenswrapper[4786]: I0313 15:41:28.730933 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vbd9w" podStartSLOduration=3.176712799 podStartE2EDuration="5.730906458s" podCreationTimestamp="2026-03-13 15:41:23 +0000 UTC" 
firstStartedPulling="2026-03-13 15:41:25.656258059 +0000 UTC m=+2315.819469860" lastFinishedPulling="2026-03-13 15:41:28.210451708 +0000 UTC m=+2318.373663519" observedRunningTime="2026-03-13 15:41:28.71549617 +0000 UTC m=+2318.878708041" watchObservedRunningTime="2026-03-13 15:41:28.730906458 +0000 UTC m=+2318.894118299" Mar 13 15:41:34 crc kubenswrapper[4786]: I0313 15:41:34.002500 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:34 crc kubenswrapper[4786]: I0313 15:41:34.003160 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:34 crc kubenswrapper[4786]: I0313 15:41:34.062460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:34 crc kubenswrapper[4786]: I0313 15:41:34.775766 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:34 crc kubenswrapper[4786]: I0313 15:41:34.831287 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vbd9w"] Mar 13 15:41:36 crc kubenswrapper[4786]: I0313 15:41:36.754705 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vbd9w" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="registry-server" containerID="cri-o://76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3" gracePeriod=2 Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.161963 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.287161 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-utilities\") pod \"bfaabc8e-5856-4beb-a3c9-08223326ef06\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.287233 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-catalog-content\") pod \"bfaabc8e-5856-4beb-a3c9-08223326ef06\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.287290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tctl\" (UniqueName: \"kubernetes.io/projected/bfaabc8e-5856-4beb-a3c9-08223326ef06-kube-api-access-9tctl\") pod \"bfaabc8e-5856-4beb-a3c9-08223326ef06\" (UID: \"bfaabc8e-5856-4beb-a3c9-08223326ef06\") " Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.288487 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-utilities" (OuterVolumeSpecName: "utilities") pod "bfaabc8e-5856-4beb-a3c9-08223326ef06" (UID: "bfaabc8e-5856-4beb-a3c9-08223326ef06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.292832 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfaabc8e-5856-4beb-a3c9-08223326ef06-kube-api-access-9tctl" (OuterVolumeSpecName: "kube-api-access-9tctl") pod "bfaabc8e-5856-4beb-a3c9-08223326ef06" (UID: "bfaabc8e-5856-4beb-a3c9-08223326ef06"). InnerVolumeSpecName "kube-api-access-9tctl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.354734 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfaabc8e-5856-4beb-a3c9-08223326ef06" (UID: "bfaabc8e-5856-4beb-a3c9-08223326ef06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.388624 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.388679 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaabc8e-5856-4beb-a3c9-08223326ef06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.388700 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tctl\" (UniqueName: \"kubernetes.io/projected/bfaabc8e-5856-4beb-a3c9-08223326ef06-kube-api-access-9tctl\") on node \"crc\" DevicePath \"\"" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.765710 4786 generic.go:334] "Generic (PLEG): container finished" podID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerID="76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3" exitCode=0 Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.765754 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerDied","Data":"76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3"} Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.765782 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vbd9w" event={"ID":"bfaabc8e-5856-4beb-a3c9-08223326ef06","Type":"ContainerDied","Data":"aa4a81aea66a4529a033ba112985459bf9a0036743bc023f21adfa1487a1c873"} Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.765800 4786 scope.go:117] "RemoveContainer" containerID="76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.765801 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vbd9w" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.799368 4786 scope.go:117] "RemoveContainer" containerID="fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.828771 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vbd9w"] Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.834717 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vbd9w"] Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.837726 4786 scope.go:117] "RemoveContainer" containerID="5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.864582 4786 scope.go:117] "RemoveContainer" containerID="76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3" Mar 13 15:41:37 crc kubenswrapper[4786]: E0313 15:41:37.864996 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3\": container with ID starting with 76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3 not found: ID does not exist" containerID="76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 
15:41:37.865035 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3"} err="failed to get container status \"76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3\": rpc error: code = NotFound desc = could not find container \"76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3\": container with ID starting with 76d31c603029fbe0a11c71a08eae4060c87cc6c955d0b3cedb9f3ce7dddb4cf3 not found: ID does not exist" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.865060 4786 scope.go:117] "RemoveContainer" containerID="fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a" Mar 13 15:41:37 crc kubenswrapper[4786]: E0313 15:41:37.865258 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a\": container with ID starting with fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a not found: ID does not exist" containerID="fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.865288 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a"} err="failed to get container status \"fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a\": rpc error: code = NotFound desc = could not find container \"fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a\": container with ID starting with fcf4552b91c480c02a8ebc793fb20b44a71b62019121bd7aa2bfa92c424db62a not found: ID does not exist" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.865306 4786 scope.go:117] "RemoveContainer" containerID="5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64" Mar 13 15:41:37 crc 
kubenswrapper[4786]: E0313 15:41:37.865541 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64\": container with ID starting with 5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64 not found: ID does not exist" containerID="5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64" Mar 13 15:41:37 crc kubenswrapper[4786]: I0313 15:41:37.865564 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64"} err="failed to get container status \"5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64\": rpc error: code = NotFound desc = could not find container \"5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64\": container with ID starting with 5670a5bb3ceb086925d33e099ffc91cd0c1ef90c0b64ddedbb90d719eedbce64 not found: ID does not exist" Mar 13 15:41:38 crc kubenswrapper[4786]: I0313 15:41:38.566722 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" path="/var/lib/kubelet/pods/bfaabc8e-5856-4beb-a3c9-08223326ef06/volumes" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.142388 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556942-x7f64"] Mar 13 15:42:00 crc kubenswrapper[4786]: E0313 15:42:00.143269 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="extract-content" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.143284 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="extract-content" Mar 13 15:42:00 crc kubenswrapper[4786]: E0313 15:42:00.143299 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="extract-utilities" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.143307 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="extract-utilities" Mar 13 15:42:00 crc kubenswrapper[4786]: E0313 15:42:00.143324 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="registry-server" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.143331 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="registry-server" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.143757 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfaabc8e-5856-4beb-a3c9-08223326ef06" containerName="registry-server" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.144209 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.151574 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556942-x7f64"] Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.153480 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.153526 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.153817 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.282313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7p4g\" (UniqueName: 
\"kubernetes.io/projected/7740665f-53bb-42ce-8553-3bf63bbefb9f-kube-api-access-p7p4g\") pod \"auto-csr-approver-29556942-x7f64\" (UID: \"7740665f-53bb-42ce-8553-3bf63bbefb9f\") " pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.383938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7p4g\" (UniqueName: \"kubernetes.io/projected/7740665f-53bb-42ce-8553-3bf63bbefb9f-kube-api-access-p7p4g\") pod \"auto-csr-approver-29556942-x7f64\" (UID: \"7740665f-53bb-42ce-8553-3bf63bbefb9f\") " pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.409777 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7p4g\" (UniqueName: \"kubernetes.io/projected/7740665f-53bb-42ce-8553-3bf63bbefb9f-kube-api-access-p7p4g\") pod \"auto-csr-approver-29556942-x7f64\" (UID: \"7740665f-53bb-42ce-8553-3bf63bbefb9f\") " pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.465822 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.733413 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556942-x7f64"] Mar 13 15:42:00 crc kubenswrapper[4786]: I0313 15:42:00.943085 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556942-x7f64" event={"ID":"7740665f-53bb-42ce-8553-3bf63bbefb9f","Type":"ContainerStarted","Data":"9c19b209225f10d19bac15edb42642c9abedf9cb7beb9cf5ba1f4ea332e83e24"} Mar 13 15:42:02 crc kubenswrapper[4786]: I0313 15:42:02.959939 4786 generic.go:334] "Generic (PLEG): container finished" podID="7740665f-53bb-42ce-8553-3bf63bbefb9f" containerID="a605770fb14bd7bb5acaa98c9163a7db5792166332b13aeb2f84291df6e4f746" exitCode=0 Mar 13 15:42:02 crc kubenswrapper[4786]: I0313 15:42:02.959996 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556942-x7f64" event={"ID":"7740665f-53bb-42ce-8553-3bf63bbefb9f","Type":"ContainerDied","Data":"a605770fb14bd7bb5acaa98c9163a7db5792166332b13aeb2f84291df6e4f746"} Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.252355 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.439435 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7p4g\" (UniqueName: \"kubernetes.io/projected/7740665f-53bb-42ce-8553-3bf63bbefb9f-kube-api-access-p7p4g\") pod \"7740665f-53bb-42ce-8553-3bf63bbefb9f\" (UID: \"7740665f-53bb-42ce-8553-3bf63bbefb9f\") " Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.445660 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7740665f-53bb-42ce-8553-3bf63bbefb9f-kube-api-access-p7p4g" (OuterVolumeSpecName: "kube-api-access-p7p4g") pod "7740665f-53bb-42ce-8553-3bf63bbefb9f" (UID: "7740665f-53bb-42ce-8553-3bf63bbefb9f"). InnerVolumeSpecName "kube-api-access-p7p4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.540613 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7p4g\" (UniqueName: \"kubernetes.io/projected/7740665f-53bb-42ce-8553-3bf63bbefb9f-kube-api-access-p7p4g\") on node \"crc\" DevicePath \"\"" Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.977505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556942-x7f64" event={"ID":"7740665f-53bb-42ce-8553-3bf63bbefb9f","Type":"ContainerDied","Data":"9c19b209225f10d19bac15edb42642c9abedf9cb7beb9cf5ba1f4ea332e83e24"} Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.977546 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c19b209225f10d19bac15edb42642c9abedf9cb7beb9cf5ba1f4ea332e83e24" Mar 13 15:42:04 crc kubenswrapper[4786]: I0313 15:42:04.977611 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556942-x7f64" Mar 13 15:42:05 crc kubenswrapper[4786]: I0313 15:42:05.323958 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556936-qrj2v"] Mar 13 15:42:05 crc kubenswrapper[4786]: I0313 15:42:05.330472 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556936-qrj2v"] Mar 13 15:42:06 crc kubenswrapper[4786]: I0313 15:42:06.567064 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba89a406-7f00-4dce-8542-970bfa870701" path="/var/lib/kubelet/pods/ba89a406-7f00-4dce-8542-970bfa870701/volumes" Mar 13 15:42:10 crc kubenswrapper[4786]: I0313 15:42:10.883254 4786 scope.go:117] "RemoveContainer" containerID="a4128c4f820bf74e1439f3398dd72e967cb858a0f8f97292645d1ea8198b1d67" Mar 13 15:43:07 crc kubenswrapper[4786]: I0313 15:43:07.868263 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:43:07 crc kubenswrapper[4786]: I0313 15:43:07.869126 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:43:37 crc kubenswrapper[4786]: I0313 15:43:37.868299 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:43:37 crc kubenswrapper[4786]: 
I0313 15:43:37.868954 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.147069 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556944-lwfjv"] Mar 13 15:44:00 crc kubenswrapper[4786]: E0313 15:44:00.147983 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7740665f-53bb-42ce-8553-3bf63bbefb9f" containerName="oc" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.148000 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7740665f-53bb-42ce-8553-3bf63bbefb9f" containerName="oc" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.148212 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7740665f-53bb-42ce-8553-3bf63bbefb9f" containerName="oc" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.148791 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.150828 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.151258 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.151306 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.155191 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556944-lwfjv"] Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.185599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ln6\" (UniqueName: \"kubernetes.io/projected/f44ccf4f-736c-49a3-9ff3-df85362a7d96-kube-api-access-44ln6\") pod \"auto-csr-approver-29556944-lwfjv\" (UID: \"f44ccf4f-736c-49a3-9ff3-df85362a7d96\") " pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.287379 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44ln6\" (UniqueName: \"kubernetes.io/projected/f44ccf4f-736c-49a3-9ff3-df85362a7d96-kube-api-access-44ln6\") pod \"auto-csr-approver-29556944-lwfjv\" (UID: \"f44ccf4f-736c-49a3-9ff3-df85362a7d96\") " pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.306395 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ln6\" (UniqueName: \"kubernetes.io/projected/f44ccf4f-736c-49a3-9ff3-df85362a7d96-kube-api-access-44ln6\") pod \"auto-csr-approver-29556944-lwfjv\" (UID: \"f44ccf4f-736c-49a3-9ff3-df85362a7d96\") " 
pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.465960 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.684237 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556944-lwfjv"] Mar 13 15:44:00 crc kubenswrapper[4786]: I0313 15:44:00.931273 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" event={"ID":"f44ccf4f-736c-49a3-9ff3-df85362a7d96","Type":"ContainerStarted","Data":"d4903653097fdc06d6e1766b60e54b9cba71dd71b1378b1521f18a4ac6f52965"} Mar 13 15:44:02 crc kubenswrapper[4786]: I0313 15:44:02.949788 4786 generic.go:334] "Generic (PLEG): container finished" podID="f44ccf4f-736c-49a3-9ff3-df85362a7d96" containerID="fbf9aa423b694ab6f20096f0ed2d0da3da1df9f18b447a544ce19ae9d09e7ba5" exitCode=0 Mar 13 15:44:02 crc kubenswrapper[4786]: I0313 15:44:02.950280 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" event={"ID":"f44ccf4f-736c-49a3-9ff3-df85362a7d96","Type":"ContainerDied","Data":"fbf9aa423b694ab6f20096f0ed2d0da3da1df9f18b447a544ce19ae9d09e7ba5"} Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.237271 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.347382 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44ln6\" (UniqueName: \"kubernetes.io/projected/f44ccf4f-736c-49a3-9ff3-df85362a7d96-kube-api-access-44ln6\") pod \"f44ccf4f-736c-49a3-9ff3-df85362a7d96\" (UID: \"f44ccf4f-736c-49a3-9ff3-df85362a7d96\") " Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.353893 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44ccf4f-736c-49a3-9ff3-df85362a7d96-kube-api-access-44ln6" (OuterVolumeSpecName: "kube-api-access-44ln6") pod "f44ccf4f-736c-49a3-9ff3-df85362a7d96" (UID: "f44ccf4f-736c-49a3-9ff3-df85362a7d96"). InnerVolumeSpecName "kube-api-access-44ln6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.449555 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44ln6\" (UniqueName: \"kubernetes.io/projected/f44ccf4f-736c-49a3-9ff3-df85362a7d96-kube-api-access-44ln6\") on node \"crc\" DevicePath \"\"" Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.971530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" event={"ID":"f44ccf4f-736c-49a3-9ff3-df85362a7d96","Type":"ContainerDied","Data":"d4903653097fdc06d6e1766b60e54b9cba71dd71b1378b1521f18a4ac6f52965"} Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.971808 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4903653097fdc06d6e1766b60e54b9cba71dd71b1378b1521f18a4ac6f52965" Mar 13 15:44:04 crc kubenswrapper[4786]: I0313 15:44:04.971633 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556944-lwfjv" Mar 13 15:44:05 crc kubenswrapper[4786]: I0313 15:44:05.312160 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556938-5gb89"] Mar 13 15:44:05 crc kubenswrapper[4786]: I0313 15:44:05.318549 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556938-5gb89"] Mar 13 15:44:06 crc kubenswrapper[4786]: I0313 15:44:06.561459 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c1fc31-e84a-4e8b-b648-6fd8a29d484a" path="/var/lib/kubelet/pods/59c1fc31-e84a-4e8b-b648-6fd8a29d484a/volumes" Mar 13 15:44:07 crc kubenswrapper[4786]: I0313 15:44:07.869209 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:44:07 crc kubenswrapper[4786]: I0313 15:44:07.869743 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:44:07 crc kubenswrapper[4786]: I0313 15:44:07.869832 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:44:07 crc kubenswrapper[4786]: I0313 15:44:07.870828 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:44:07 crc kubenswrapper[4786]: I0313 15:44:07.871048 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" gracePeriod=600 Mar 13 15:44:08 crc kubenswrapper[4786]: E0313 15:44:08.052887 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:44:09 crc kubenswrapper[4786]: I0313 15:44:09.005831 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" exitCode=0 Mar 13 15:44:09 crc kubenswrapper[4786]: I0313 15:44:09.006650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec"} Mar 13 15:44:09 crc kubenswrapper[4786]: I0313 15:44:09.006795 4786 scope.go:117] "RemoveContainer" containerID="4dcb2153a0a137b6fa9bb9053f208afcffda7d4a027ef34039a1c015892bfc5a" Mar 13 15:44:09 crc kubenswrapper[4786]: I0313 15:44:09.007480 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:44:09 crc kubenswrapper[4786]: E0313 15:44:09.007807 4786 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:44:10 crc kubenswrapper[4786]: I0313 15:44:10.974849 4786 scope.go:117] "RemoveContainer" containerID="e878ba75f14e4ce661440f103954367cee974526e554cf20874c5e95e1488473" Mar 13 15:44:20 crc kubenswrapper[4786]: I0313 15:44:20.555402 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:44:20 crc kubenswrapper[4786]: E0313 15:44:20.556127 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.657733 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bb7m2"] Mar 13 15:44:24 crc kubenswrapper[4786]: E0313 15:44:24.658983 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44ccf4f-736c-49a3-9ff3-df85362a7d96" containerName="oc" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.659007 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44ccf4f-736c-49a3-9ff3-df85362a7d96" containerName="oc" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.659166 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44ccf4f-736c-49a3-9ff3-df85362a7d96" 
containerName="oc" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.660338 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.668672 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bb7m2"] Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.744585 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-catalog-content\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.744693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-kube-api-access-wntmd\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.744733 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-utilities\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.846586 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-kube-api-access-wntmd\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " 
pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.846678 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-utilities\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.846747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-catalog-content\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.847311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-utilities\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.847341 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-catalog-content\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc kubenswrapper[4786]: I0313 15:44:24.875206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-kube-api-access-wntmd\") pod \"redhat-operators-bb7m2\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:24 crc 
kubenswrapper[4786]: I0313 15:44:24.982593 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:25 crc kubenswrapper[4786]: I0313 15:44:25.226747 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bb7m2"] Mar 13 15:44:26 crc kubenswrapper[4786]: I0313 15:44:26.157884 4786 generic.go:334] "Generic (PLEG): container finished" podID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerID="0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53" exitCode=0 Mar 13 15:44:26 crc kubenswrapper[4786]: I0313 15:44:26.158129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb7m2" event={"ID":"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8","Type":"ContainerDied","Data":"0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53"} Mar 13 15:44:26 crc kubenswrapper[4786]: I0313 15:44:26.158248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb7m2" event={"ID":"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8","Type":"ContainerStarted","Data":"6d547e426a4b7a8a7e3c0bad2a4b866ee7389d4fa7bccb339a13a6adc9bdcfac"} Mar 13 15:44:28 crc kubenswrapper[4786]: I0313 15:44:28.175552 4786 generic.go:334] "Generic (PLEG): container finished" podID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerID="5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0" exitCode=0 Mar 13 15:44:28 crc kubenswrapper[4786]: I0313 15:44:28.175654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb7m2" event={"ID":"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8","Type":"ContainerDied","Data":"5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0"} Mar 13 15:44:29 crc kubenswrapper[4786]: I0313 15:44:29.191616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb7m2" 
event={"ID":"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8","Type":"ContainerStarted","Data":"480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7"} Mar 13 15:44:32 crc kubenswrapper[4786]: I0313 15:44:32.552277 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:44:32 crc kubenswrapper[4786]: E0313 15:44:32.552806 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:44:34 crc kubenswrapper[4786]: I0313 15:44:34.983191 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:34 crc kubenswrapper[4786]: I0313 15:44:34.983258 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:36 crc kubenswrapper[4786]: I0313 15:44:36.019540 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bb7m2" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="registry-server" probeResult="failure" output=< Mar 13 15:44:36 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 15:44:36 crc kubenswrapper[4786]: > Mar 13 15:44:45 crc kubenswrapper[4786]: I0313 15:44:45.035645 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:45 crc kubenswrapper[4786]: I0313 15:44:45.058690 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bb7m2" 
podStartSLOduration=18.629843795 podStartE2EDuration="21.058671026s" podCreationTimestamp="2026-03-13 15:44:24 +0000 UTC" firstStartedPulling="2026-03-13 15:44:26.16035589 +0000 UTC m=+2496.323567721" lastFinishedPulling="2026-03-13 15:44:28.589183131 +0000 UTC m=+2498.752394952" observedRunningTime="2026-03-13 15:44:29.231123761 +0000 UTC m=+2499.394335632" watchObservedRunningTime="2026-03-13 15:44:45.058671026 +0000 UTC m=+2515.221882837" Mar 13 15:44:45 crc kubenswrapper[4786]: I0313 15:44:45.105279 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:45 crc kubenswrapper[4786]: I0313 15:44:45.272222 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bb7m2"] Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.324236 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bb7m2" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="registry-server" containerID="cri-o://480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7" gracePeriod=2 Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.551902 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:44:46 crc kubenswrapper[4786]: E0313 15:44:46.552458 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.744111 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.872405 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-kube-api-access-wntmd\") pod \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.872584 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-utilities\") pod \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.872667 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-catalog-content\") pod \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\" (UID: \"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8\") " Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.874206 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-utilities" (OuterVolumeSpecName: "utilities") pod "f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" (UID: "f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.879455 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-kube-api-access-wntmd" (OuterVolumeSpecName: "kube-api-access-wntmd") pod "f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" (UID: "f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8"). InnerVolumeSpecName "kube-api-access-wntmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.974682 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:44:46 crc kubenswrapper[4786]: I0313 15:44:46.974720 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-kube-api-access-wntmd\") on node \"crc\" DevicePath \"\"" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.030959 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" (UID: "f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.075768 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.346155 4786 generic.go:334] "Generic (PLEG): container finished" podID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerID="480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7" exitCode=0 Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.346237 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb7m2" event={"ID":"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8","Type":"ContainerDied","Data":"480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7"} Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.346250 4786 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bb7m2" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.346317 4786 scope.go:117] "RemoveContainer" containerID="480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.346292 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb7m2" event={"ID":"f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8","Type":"ContainerDied","Data":"6d547e426a4b7a8a7e3c0bad2a4b866ee7389d4fa7bccb339a13a6adc9bdcfac"} Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.388541 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bb7m2"] Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.392509 4786 scope.go:117] "RemoveContainer" containerID="5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.395112 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bb7m2"] Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.425145 4786 scope.go:117] "RemoveContainer" containerID="0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.442513 4786 scope.go:117] "RemoveContainer" containerID="480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7" Mar 13 15:44:47 crc kubenswrapper[4786]: E0313 15:44:47.443283 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7\": container with ID starting with 480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7 not found: ID does not exist" containerID="480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.443334 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7"} err="failed to get container status \"480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7\": rpc error: code = NotFound desc = could not find container \"480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7\": container with ID starting with 480320d513b18a7206ca1c4af36b3d85a9f5a886c40f29ff5ad5bda3a9538cd7 not found: ID does not exist" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.443366 4786 scope.go:117] "RemoveContainer" containerID="5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0" Mar 13 15:44:47 crc kubenswrapper[4786]: E0313 15:44:47.443893 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0\": container with ID starting with 5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0 not found: ID does not exist" containerID="5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.443952 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0"} err="failed to get container status \"5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0\": rpc error: code = NotFound desc = could not find container \"5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0\": container with ID starting with 5de2065edcad2c54c09eed104faccd4b8e6f11697285e654c622fd6cb40f01f0 not found: ID does not exist" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.443985 4786 scope.go:117] "RemoveContainer" containerID="0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53" Mar 13 15:44:47 crc kubenswrapper[4786]: E0313 
15:44:47.444443 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53\": container with ID starting with 0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53 not found: ID does not exist" containerID="0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53" Mar 13 15:44:47 crc kubenswrapper[4786]: I0313 15:44:47.444491 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53"} err="failed to get container status \"0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53\": rpc error: code = NotFound desc = could not find container \"0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53\": container with ID starting with 0f6faf4e5c06e894d5202181e40ba9a3799081d4137a995c22e8af2b18066b53 not found: ID does not exist" Mar 13 15:44:48 crc kubenswrapper[4786]: I0313 15:44:48.560462 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" path="/var/lib/kubelet/pods/f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8/volumes" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.159649 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg"] Mar 13 15:45:00 crc kubenswrapper[4786]: E0313 15:45:00.160693 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="extract-content" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.160710 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="extract-content" Mar 13 15:45:00 crc kubenswrapper[4786]: E0313 15:45:00.160732 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="registry-server" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.160740 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="registry-server" Mar 13 15:45:00 crc kubenswrapper[4786]: E0313 15:45:00.160755 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="extract-utilities" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.160763 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="extract-utilities" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.160981 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9276a0e-a4e8-42c4-8c12-3d9a21c1f2f8" containerName="registry-server" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.161675 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.164120 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg"] Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.165279 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.171432 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.215809 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aaccb485-07a2-43c1-b974-1268fcf7a5ee-secret-volume\") pod \"collect-profiles-29556945-gwfpg\" 
(UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.216316 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aaccb485-07a2-43c1-b974-1268fcf7a5ee-config-volume\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.216361 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j4qv\" (UniqueName: \"kubernetes.io/projected/aaccb485-07a2-43c1-b974-1268fcf7a5ee-kube-api-access-5j4qv\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.317531 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aaccb485-07a2-43c1-b974-1268fcf7a5ee-secret-volume\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.317617 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aaccb485-07a2-43c1-b974-1268fcf7a5ee-config-volume\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.317646 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5j4qv\" (UniqueName: \"kubernetes.io/projected/aaccb485-07a2-43c1-b974-1268fcf7a5ee-kube-api-access-5j4qv\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.318844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aaccb485-07a2-43c1-b974-1268fcf7a5ee-config-volume\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.329115 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aaccb485-07a2-43c1-b974-1268fcf7a5ee-secret-volume\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.336288 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j4qv\" (UniqueName: \"kubernetes.io/projected/aaccb485-07a2-43c1-b974-1268fcf7a5ee-kube-api-access-5j4qv\") pod \"collect-profiles-29556945-gwfpg\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.484322 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:00 crc kubenswrapper[4786]: I0313 15:45:00.712655 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg"] Mar 13 15:45:01 crc kubenswrapper[4786]: I0313 15:45:01.480036 4786 generic.go:334] "Generic (PLEG): container finished" podID="aaccb485-07a2-43c1-b974-1268fcf7a5ee" containerID="747ca890b9753a681878dde3cf8bc7a85a2b846d7fbc4a8c3b20d6a44c6d1010" exitCode=0 Mar 13 15:45:01 crc kubenswrapper[4786]: I0313 15:45:01.480151 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" event={"ID":"aaccb485-07a2-43c1-b974-1268fcf7a5ee","Type":"ContainerDied","Data":"747ca890b9753a681878dde3cf8bc7a85a2b846d7fbc4a8c3b20d6a44c6d1010"} Mar 13 15:45:01 crc kubenswrapper[4786]: I0313 15:45:01.483357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" event={"ID":"aaccb485-07a2-43c1-b974-1268fcf7a5ee","Type":"ContainerStarted","Data":"763ae850c048e72d6ce20ccbadfebb9b7a4bcf0145c4b0668ed3368f388fc20a"} Mar 13 15:45:01 crc kubenswrapper[4786]: I0313 15:45:01.552229 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:45:01 crc kubenswrapper[4786]: E0313 15:45:01.553240 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.805321 4786 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.853910 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aaccb485-07a2-43c1-b974-1268fcf7a5ee-config-volume\") pod \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.854058 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j4qv\" (UniqueName: \"kubernetes.io/projected/aaccb485-07a2-43c1-b974-1268fcf7a5ee-kube-api-access-5j4qv\") pod \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.854102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aaccb485-07a2-43c1-b974-1268fcf7a5ee-secret-volume\") pod \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\" (UID: \"aaccb485-07a2-43c1-b974-1268fcf7a5ee\") " Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.855603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaccb485-07a2-43c1-b974-1268fcf7a5ee-config-volume" (OuterVolumeSpecName: "config-volume") pod "aaccb485-07a2-43c1-b974-1268fcf7a5ee" (UID: "aaccb485-07a2-43c1-b974-1268fcf7a5ee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.859195 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaccb485-07a2-43c1-b974-1268fcf7a5ee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aaccb485-07a2-43c1-b974-1268fcf7a5ee" (UID: "aaccb485-07a2-43c1-b974-1268fcf7a5ee"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.859234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaccb485-07a2-43c1-b974-1268fcf7a5ee-kube-api-access-5j4qv" (OuterVolumeSpecName: "kube-api-access-5j4qv") pod "aaccb485-07a2-43c1-b974-1268fcf7a5ee" (UID: "aaccb485-07a2-43c1-b974-1268fcf7a5ee"). InnerVolumeSpecName "kube-api-access-5j4qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.955264 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j4qv\" (UniqueName: \"kubernetes.io/projected/aaccb485-07a2-43c1-b974-1268fcf7a5ee-kube-api-access-5j4qv\") on node \"crc\" DevicePath \"\"" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.955366 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aaccb485-07a2-43c1-b974-1268fcf7a5ee-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:45:02 crc kubenswrapper[4786]: I0313 15:45:02.955380 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aaccb485-07a2-43c1-b974-1268fcf7a5ee-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:45:03 crc kubenswrapper[4786]: I0313 15:45:03.500192 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" event={"ID":"aaccb485-07a2-43c1-b974-1268fcf7a5ee","Type":"ContainerDied","Data":"763ae850c048e72d6ce20ccbadfebb9b7a4bcf0145c4b0668ed3368f388fc20a"} Mar 13 15:45:03 crc kubenswrapper[4786]: I0313 15:45:03.500243 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="763ae850c048e72d6ce20ccbadfebb9b7a4bcf0145c4b0668ed3368f388fc20a" Mar 13 15:45:03 crc kubenswrapper[4786]: I0313 15:45:03.500637 4786 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg" Mar 13 15:45:03 crc kubenswrapper[4786]: I0313 15:45:03.909321 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv"] Mar 13 15:45:03 crc kubenswrapper[4786]: I0313 15:45:03.920423 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-d6kmv"] Mar 13 15:45:04 crc kubenswrapper[4786]: I0313 15:45:04.562921 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af" path="/var/lib/kubelet/pods/3b4a9ded-21a9-4f58-97c0-5e3e4ff1e1af/volumes" Mar 13 15:45:11 crc kubenswrapper[4786]: I0313 15:45:11.048365 4786 scope.go:117] "RemoveContainer" containerID="2cd44f730f0e36d056d8017b35bdaab621c0324faa0b9ec39d5d8fe4ae805e60" Mar 13 15:45:16 crc kubenswrapper[4786]: I0313 15:45:16.551915 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:45:16 crc kubenswrapper[4786]: E0313 15:45:16.552935 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:45:27 crc kubenswrapper[4786]: I0313 15:45:27.552192 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:45:27 crc kubenswrapper[4786]: E0313 15:45:27.553126 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:45:39 crc kubenswrapper[4786]: I0313 15:45:39.552205 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:45:39 crc kubenswrapper[4786]: E0313 15:45:39.553649 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:45:51 crc kubenswrapper[4786]: I0313 15:45:51.552637 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:45:51 crc kubenswrapper[4786]: E0313 15:45:51.554140 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.158061 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556946-8cmp5"] Mar 13 15:46:00 crc kubenswrapper[4786]: E0313 15:46:00.158772 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aaccb485-07a2-43c1-b974-1268fcf7a5ee" containerName="collect-profiles" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.158785 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaccb485-07a2-43c1-b974-1268fcf7a5ee" containerName="collect-profiles" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.158953 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaccb485-07a2-43c1-b974-1268fcf7a5ee" containerName="collect-profiles" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.159373 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.161790 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.162486 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.163220 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.180072 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556946-8cmp5"] Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.325515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4rj\" (UniqueName: \"kubernetes.io/projected/357f52d1-dea9-4ac5-92fc-d0b472c67a2e-kube-api-access-ls4rj\") pod \"auto-csr-approver-29556946-8cmp5\" (UID: \"357f52d1-dea9-4ac5-92fc-d0b472c67a2e\") " pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.427205 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls4rj\" (UniqueName: 
\"kubernetes.io/projected/357f52d1-dea9-4ac5-92fc-d0b472c67a2e-kube-api-access-ls4rj\") pod \"auto-csr-approver-29556946-8cmp5\" (UID: \"357f52d1-dea9-4ac5-92fc-d0b472c67a2e\") " pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.449639 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4rj\" (UniqueName: \"kubernetes.io/projected/357f52d1-dea9-4ac5-92fc-d0b472c67a2e-kube-api-access-ls4rj\") pod \"auto-csr-approver-29556946-8cmp5\" (UID: \"357f52d1-dea9-4ac5-92fc-d0b472c67a2e\") " pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.498458 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.696322 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556946-8cmp5"] Mar 13 15:46:00 crc kubenswrapper[4786]: W0313 15:46:00.701393 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod357f52d1_dea9_4ac5_92fc_d0b472c67a2e.slice/crio-60321e519261052fe0a79a5ae67be16ab97371917c9e0df151f1a10106805612 WatchSource:0}: Error finding container 60321e519261052fe0a79a5ae67be16ab97371917c9e0df151f1a10106805612: Status 404 returned error can't find the container with id 60321e519261052fe0a79a5ae67be16ab97371917c9e0df151f1a10106805612 Mar 13 15:46:00 crc kubenswrapper[4786]: I0313 15:46:00.703236 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:46:01 crc kubenswrapper[4786]: I0313 15:46:01.135957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" 
event={"ID":"357f52d1-dea9-4ac5-92fc-d0b472c67a2e","Type":"ContainerStarted","Data":"60321e519261052fe0a79a5ae67be16ab97371917c9e0df151f1a10106805612"} Mar 13 15:46:03 crc kubenswrapper[4786]: I0313 15:46:03.191976 4786 generic.go:334] "Generic (PLEG): container finished" podID="357f52d1-dea9-4ac5-92fc-d0b472c67a2e" containerID="165cc1d4d93cd5548350ef4dae0dfb68ec53e818b5fddac73e506cf50a66cef3" exitCode=0 Mar 13 15:46:03 crc kubenswrapper[4786]: I0313 15:46:03.192266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" event={"ID":"357f52d1-dea9-4ac5-92fc-d0b472c67a2e","Type":"ContainerDied","Data":"165cc1d4d93cd5548350ef4dae0dfb68ec53e818b5fddac73e506cf50a66cef3"} Mar 13 15:46:03 crc kubenswrapper[4786]: I0313 15:46:03.551739 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:46:03 crc kubenswrapper[4786]: E0313 15:46:03.552118 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:46:04 crc kubenswrapper[4786]: I0313 15:46:04.495090 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:04 crc kubenswrapper[4786]: I0313 15:46:04.688972 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls4rj\" (UniqueName: \"kubernetes.io/projected/357f52d1-dea9-4ac5-92fc-d0b472c67a2e-kube-api-access-ls4rj\") pod \"357f52d1-dea9-4ac5-92fc-d0b472c67a2e\" (UID: \"357f52d1-dea9-4ac5-92fc-d0b472c67a2e\") " Mar 13 15:46:04 crc kubenswrapper[4786]: I0313 15:46:04.696918 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357f52d1-dea9-4ac5-92fc-d0b472c67a2e-kube-api-access-ls4rj" (OuterVolumeSpecName: "kube-api-access-ls4rj") pod "357f52d1-dea9-4ac5-92fc-d0b472c67a2e" (UID: "357f52d1-dea9-4ac5-92fc-d0b472c67a2e"). InnerVolumeSpecName "kube-api-access-ls4rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:46:04 crc kubenswrapper[4786]: I0313 15:46:04.791014 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls4rj\" (UniqueName: \"kubernetes.io/projected/357f52d1-dea9-4ac5-92fc-d0b472c67a2e-kube-api-access-ls4rj\") on node \"crc\" DevicePath \"\"" Mar 13 15:46:05 crc kubenswrapper[4786]: I0313 15:46:05.208822 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" event={"ID":"357f52d1-dea9-4ac5-92fc-d0b472c67a2e","Type":"ContainerDied","Data":"60321e519261052fe0a79a5ae67be16ab97371917c9e0df151f1a10106805612"} Mar 13 15:46:05 crc kubenswrapper[4786]: I0313 15:46:05.209064 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60321e519261052fe0a79a5ae67be16ab97371917c9e0df151f1a10106805612" Mar 13 15:46:05 crc kubenswrapper[4786]: I0313 15:46:05.208913 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556946-8cmp5" Mar 13 15:46:05 crc kubenswrapper[4786]: I0313 15:46:05.567162 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556940-j5txx"] Mar 13 15:46:05 crc kubenswrapper[4786]: I0313 15:46:05.572562 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556940-j5txx"] Mar 13 15:46:06 crc kubenswrapper[4786]: I0313 15:46:06.565283 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85aea1da-1d6b-4f0f-acca-87584b9df30f" path="/var/lib/kubelet/pods/85aea1da-1d6b-4f0f-acca-87584b9df30f/volumes" Mar 13 15:46:11 crc kubenswrapper[4786]: I0313 15:46:11.128582 4786 scope.go:117] "RemoveContainer" containerID="ed0f43c7bc55f3a087e49e488a09b7271eeda17debd76bf265e2de3a73ebd9e8" Mar 13 15:46:14 crc kubenswrapper[4786]: I0313 15:46:14.552909 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:46:14 crc kubenswrapper[4786]: E0313 15:46:14.553608 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:46:28 crc kubenswrapper[4786]: I0313 15:46:28.552707 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:46:28 crc kubenswrapper[4786]: E0313 15:46:28.554382 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:46:40 crc kubenswrapper[4786]: I0313 15:46:40.560170 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:46:40 crc kubenswrapper[4786]: E0313 15:46:40.561231 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:46:51 crc kubenswrapper[4786]: I0313 15:46:51.552579 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:46:51 crc kubenswrapper[4786]: E0313 15:46:51.553696 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:47:05 crc kubenswrapper[4786]: I0313 15:47:05.553061 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:47:05 crc kubenswrapper[4786]: E0313 15:47:05.557170 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:47:19 crc kubenswrapper[4786]: I0313 15:47:19.552665 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:47:19 crc kubenswrapper[4786]: E0313 15:47:19.553917 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:47:33 crc kubenswrapper[4786]: I0313 15:47:33.552693 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:47:33 crc kubenswrapper[4786]: E0313 15:47:33.554011 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:47:48 crc kubenswrapper[4786]: I0313 15:47:48.552664 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:47:48 crc kubenswrapper[4786]: E0313 15:47:48.554470 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.159274 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556948-9j5fz"] Mar 13 15:48:00 crc kubenswrapper[4786]: E0313 15:48:00.160397 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357f52d1-dea9-4ac5-92fc-d0b472c67a2e" containerName="oc" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.160412 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="357f52d1-dea9-4ac5-92fc-d0b472c67a2e" containerName="oc" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.160659 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="357f52d1-dea9-4ac5-92fc-d0b472c67a2e" containerName="oc" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.161416 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.167602 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.167990 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.168158 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.173191 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556948-9j5fz"] Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.257775 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slbm2\" (UniqueName: \"kubernetes.io/projected/99335794-cabf-4d18-be56-302a98c40a5a-kube-api-access-slbm2\") pod \"auto-csr-approver-29556948-9j5fz\" (UID: \"99335794-cabf-4d18-be56-302a98c40a5a\") " pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.358701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slbm2\" (UniqueName: \"kubernetes.io/projected/99335794-cabf-4d18-be56-302a98c40a5a-kube-api-access-slbm2\") pod \"auto-csr-approver-29556948-9j5fz\" (UID: \"99335794-cabf-4d18-be56-302a98c40a5a\") " pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.379339 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slbm2\" (UniqueName: \"kubernetes.io/projected/99335794-cabf-4d18-be56-302a98c40a5a-kube-api-access-slbm2\") pod \"auto-csr-approver-29556948-9j5fz\" (UID: \"99335794-cabf-4d18-be56-302a98c40a5a\") " 
pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.498150 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:00 crc kubenswrapper[4786]: I0313 15:48:00.943796 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556948-9j5fz"] Mar 13 15:48:00 crc kubenswrapper[4786]: W0313 15:48:00.951685 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99335794_cabf_4d18_be56_302a98c40a5a.slice/crio-6203c76cb7199221dc0e59c81a43d8de8e72433f387b9e720a3d1f40f9708cc6 WatchSource:0}: Error finding container 6203c76cb7199221dc0e59c81a43d8de8e72433f387b9e720a3d1f40f9708cc6: Status 404 returned error can't find the container with id 6203c76cb7199221dc0e59c81a43d8de8e72433f387b9e720a3d1f40f9708cc6 Mar 13 15:48:01 crc kubenswrapper[4786]: I0313 15:48:01.164628 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" event={"ID":"99335794-cabf-4d18-be56-302a98c40a5a","Type":"ContainerStarted","Data":"6203c76cb7199221dc0e59c81a43d8de8e72433f387b9e720a3d1f40f9708cc6"} Mar 13 15:48:02 crc kubenswrapper[4786]: I0313 15:48:02.175495 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" event={"ID":"99335794-cabf-4d18-be56-302a98c40a5a","Type":"ContainerStarted","Data":"fb73e3bdb5a956b279c100e6ba90c771d55392c61026c14d06cf5014942cbd61"} Mar 13 15:48:02 crc kubenswrapper[4786]: I0313 15:48:02.198701 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" podStartSLOduration=1.272155546 podStartE2EDuration="2.198674903s" podCreationTimestamp="2026-03-13 15:48:00 +0000 UTC" firstStartedPulling="2026-03-13 15:48:00.954828074 +0000 UTC 
m=+2711.118039895" lastFinishedPulling="2026-03-13 15:48:01.881347431 +0000 UTC m=+2712.044559252" observedRunningTime="2026-03-13 15:48:02.189542873 +0000 UTC m=+2712.352754704" watchObservedRunningTime="2026-03-13 15:48:02.198674903 +0000 UTC m=+2712.361886744" Mar 13 15:48:03 crc kubenswrapper[4786]: I0313 15:48:03.188591 4786 generic.go:334] "Generic (PLEG): container finished" podID="99335794-cabf-4d18-be56-302a98c40a5a" containerID="fb73e3bdb5a956b279c100e6ba90c771d55392c61026c14d06cf5014942cbd61" exitCode=0 Mar 13 15:48:03 crc kubenswrapper[4786]: I0313 15:48:03.188658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" event={"ID":"99335794-cabf-4d18-be56-302a98c40a5a","Type":"ContainerDied","Data":"fb73e3bdb5a956b279c100e6ba90c771d55392c61026c14d06cf5014942cbd61"} Mar 13 15:48:03 crc kubenswrapper[4786]: I0313 15:48:03.552087 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:48:03 crc kubenswrapper[4786]: E0313 15:48:03.552425 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:48:04 crc kubenswrapper[4786]: I0313 15:48:04.561123 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:04 crc kubenswrapper[4786]: I0313 15:48:04.733690 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slbm2\" (UniqueName: \"kubernetes.io/projected/99335794-cabf-4d18-be56-302a98c40a5a-kube-api-access-slbm2\") pod \"99335794-cabf-4d18-be56-302a98c40a5a\" (UID: \"99335794-cabf-4d18-be56-302a98c40a5a\") " Mar 13 15:48:04 crc kubenswrapper[4786]: I0313 15:48:04.739983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99335794-cabf-4d18-be56-302a98c40a5a-kube-api-access-slbm2" (OuterVolumeSpecName: "kube-api-access-slbm2") pod "99335794-cabf-4d18-be56-302a98c40a5a" (UID: "99335794-cabf-4d18-be56-302a98c40a5a"). InnerVolumeSpecName "kube-api-access-slbm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:48:04 crc kubenswrapper[4786]: I0313 15:48:04.835998 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slbm2\" (UniqueName: \"kubernetes.io/projected/99335794-cabf-4d18-be56-302a98c40a5a-kube-api-access-slbm2\") on node \"crc\" DevicePath \"\"" Mar 13 15:48:05 crc kubenswrapper[4786]: I0313 15:48:05.204067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" event={"ID":"99335794-cabf-4d18-be56-302a98c40a5a","Type":"ContainerDied","Data":"6203c76cb7199221dc0e59c81a43d8de8e72433f387b9e720a3d1f40f9708cc6"} Mar 13 15:48:05 crc kubenswrapper[4786]: I0313 15:48:05.204339 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6203c76cb7199221dc0e59c81a43d8de8e72433f387b9e720a3d1f40f9708cc6" Mar 13 15:48:05 crc kubenswrapper[4786]: I0313 15:48:05.204104 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556948-9j5fz" Mar 13 15:48:05 crc kubenswrapper[4786]: I0313 15:48:05.266285 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556942-x7f64"] Mar 13 15:48:05 crc kubenswrapper[4786]: I0313 15:48:05.272847 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556942-x7f64"] Mar 13 15:48:06 crc kubenswrapper[4786]: I0313 15:48:06.567747 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7740665f-53bb-42ce-8553-3bf63bbefb9f" path="/var/lib/kubelet/pods/7740665f-53bb-42ce-8553-3bf63bbefb9f/volumes" Mar 13 15:48:11 crc kubenswrapper[4786]: I0313 15:48:11.214092 4786 scope.go:117] "RemoveContainer" containerID="a605770fb14bd7bb5acaa98c9163a7db5792166332b13aeb2f84291df6e4f746" Mar 13 15:48:14 crc kubenswrapper[4786]: I0313 15:48:14.552227 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:48:14 crc kubenswrapper[4786]: E0313 15:48:14.553132 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:48:28 crc kubenswrapper[4786]: I0313 15:48:28.553560 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:48:28 crc kubenswrapper[4786]: E0313 15:48:28.554918 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:48:42 crc kubenswrapper[4786]: I0313 15:48:42.552049 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:48:42 crc kubenswrapper[4786]: E0313 15:48:42.552587 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:48:54 crc kubenswrapper[4786]: I0313 15:48:54.552642 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:48:54 crc kubenswrapper[4786]: E0313 15:48:54.553537 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:49:05 crc kubenswrapper[4786]: I0313 15:49:05.551800 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:49:05 crc kubenswrapper[4786]: E0313 15:49:05.552910 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 15:49:18 crc kubenswrapper[4786]: I0313 15:49:18.552218 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:49:18 crc kubenswrapper[4786]: I0313 15:49:18.837650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"2434f68ff036d4c7e88066d4c9206f1ac88f25c420d13e8648ea6e86bcd2c9ac"} Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.140379 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556950-mzkqk"] Mar 13 15:50:00 crc kubenswrapper[4786]: E0313 15:50:00.141312 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99335794-cabf-4d18-be56-302a98c40a5a" containerName="oc" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.141327 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="99335794-cabf-4d18-be56-302a98c40a5a" containerName="oc" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.141487 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="99335794-cabf-4d18-be56-302a98c40a5a" containerName="oc" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.142028 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.145359 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.145831 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.145898 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.158921 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556950-mzkqk"] Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.184707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwptz\" (UniqueName: \"kubernetes.io/projected/38fdd75d-0d84-46b3-b590-259ed2254d5e-kube-api-access-mwptz\") pod \"auto-csr-approver-29556950-mzkqk\" (UID: \"38fdd75d-0d84-46b3-b590-259ed2254d5e\") " pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.286721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwptz\" (UniqueName: \"kubernetes.io/projected/38fdd75d-0d84-46b3-b590-259ed2254d5e-kube-api-access-mwptz\") pod \"auto-csr-approver-29556950-mzkqk\" (UID: \"38fdd75d-0d84-46b3-b590-259ed2254d5e\") " pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.305029 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwptz\" (UniqueName: \"kubernetes.io/projected/38fdd75d-0d84-46b3-b590-259ed2254d5e-kube-api-access-mwptz\") pod \"auto-csr-approver-29556950-mzkqk\" (UID: \"38fdd75d-0d84-46b3-b590-259ed2254d5e\") " 
pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.461131 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:00 crc kubenswrapper[4786]: I0313 15:50:00.879691 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556950-mzkqk"] Mar 13 15:50:01 crc kubenswrapper[4786]: I0313 15:50:01.160097 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" event={"ID":"38fdd75d-0d84-46b3-b590-259ed2254d5e","Type":"ContainerStarted","Data":"1f2a75aeb93c34972abf2c0aa07b8fca021b6e3ddb39f5d83d1d4959adb8b3e2"} Mar 13 15:50:03 crc kubenswrapper[4786]: I0313 15:50:03.176024 4786 generic.go:334] "Generic (PLEG): container finished" podID="38fdd75d-0d84-46b3-b590-259ed2254d5e" containerID="c3497af5622a651e0755047c3ef44f492969cf2b074e92b4979e18f9ff0b898e" exitCode=0 Mar 13 15:50:03 crc kubenswrapper[4786]: I0313 15:50:03.176301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" event={"ID":"38fdd75d-0d84-46b3-b590-259ed2254d5e","Type":"ContainerDied","Data":"c3497af5622a651e0755047c3ef44f492969cf2b074e92b4979e18f9ff0b898e"} Mar 13 15:50:04 crc kubenswrapper[4786]: I0313 15:50:04.533322 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:04 crc kubenswrapper[4786]: I0313 15:50:04.553436 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwptz\" (UniqueName: \"kubernetes.io/projected/38fdd75d-0d84-46b3-b590-259ed2254d5e-kube-api-access-mwptz\") pod \"38fdd75d-0d84-46b3-b590-259ed2254d5e\" (UID: \"38fdd75d-0d84-46b3-b590-259ed2254d5e\") " Mar 13 15:50:04 crc kubenswrapper[4786]: I0313 15:50:04.560652 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fdd75d-0d84-46b3-b590-259ed2254d5e-kube-api-access-mwptz" (OuterVolumeSpecName: "kube-api-access-mwptz") pod "38fdd75d-0d84-46b3-b590-259ed2254d5e" (UID: "38fdd75d-0d84-46b3-b590-259ed2254d5e"). InnerVolumeSpecName "kube-api-access-mwptz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:50:04 crc kubenswrapper[4786]: I0313 15:50:04.655363 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwptz\" (UniqueName: \"kubernetes.io/projected/38fdd75d-0d84-46b3-b590-259ed2254d5e-kube-api-access-mwptz\") on node \"crc\" DevicePath \"\"" Mar 13 15:50:05 crc kubenswrapper[4786]: I0313 15:50:05.190734 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" event={"ID":"38fdd75d-0d84-46b3-b590-259ed2254d5e","Type":"ContainerDied","Data":"1f2a75aeb93c34972abf2c0aa07b8fca021b6e3ddb39f5d83d1d4959adb8b3e2"} Mar 13 15:50:05 crc kubenswrapper[4786]: I0313 15:50:05.190773 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f2a75aeb93c34972abf2c0aa07b8fca021b6e3ddb39f5d83d1d4959adb8b3e2" Mar 13 15:50:05 crc kubenswrapper[4786]: I0313 15:50:05.190789 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556950-mzkqk" Mar 13 15:50:05 crc kubenswrapper[4786]: I0313 15:50:05.602664 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556944-lwfjv"] Mar 13 15:50:05 crc kubenswrapper[4786]: I0313 15:50:05.607641 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556944-lwfjv"] Mar 13 15:50:06 crc kubenswrapper[4786]: I0313 15:50:06.564099 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44ccf4f-736c-49a3-9ff3-df85362a7d96" path="/var/lib/kubelet/pods/f44ccf4f-736c-49a3-9ff3-df85362a7d96/volumes" Mar 13 15:50:11 crc kubenswrapper[4786]: I0313 15:50:11.310273 4786 scope.go:117] "RemoveContainer" containerID="fbf9aa423b694ab6f20096f0ed2d0da3da1df9f18b447a544ce19ae9d09e7ba5" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.233599 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcp88"] Mar 13 15:51:02 crc kubenswrapper[4786]: E0313 15:51:02.234472 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fdd75d-0d84-46b3-b590-259ed2254d5e" containerName="oc" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.234488 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fdd75d-0d84-46b3-b590-259ed2254d5e" containerName="oc" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.234645 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fdd75d-0d84-46b3-b590-259ed2254d5e" containerName="oc" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.235889 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.248955 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcp88"] Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.338148 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-utilities\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.338438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-catalog-content\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.338551 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfkk8\" (UniqueName: \"kubernetes.io/projected/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-kube-api-access-lfkk8\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.439381 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-catalog-content\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.439432 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lfkk8\" (UniqueName: \"kubernetes.io/projected/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-kube-api-access-lfkk8\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.439502 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-utilities\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.439948 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-catalog-content\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.439968 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-utilities\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.462002 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfkk8\" (UniqueName: \"kubernetes.io/projected/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-kube-api-access-lfkk8\") pod \"community-operators-bcp88\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:02 crc kubenswrapper[4786]: I0313 15:51:02.591676 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:03 crc kubenswrapper[4786]: I0313 15:51:03.064052 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcp88"] Mar 13 15:51:03 crc kubenswrapper[4786]: I0313 15:51:03.705768 4786 generic.go:334] "Generic (PLEG): container finished" podID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerID="1866dda4ee73cda9f03a2c6475001b1ad51d0cc701c123bf9e1f11cf258613eb" exitCode=0 Mar 13 15:51:03 crc kubenswrapper[4786]: I0313 15:51:03.705853 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerDied","Data":"1866dda4ee73cda9f03a2c6475001b1ad51d0cc701c123bf9e1f11cf258613eb"} Mar 13 15:51:03 crc kubenswrapper[4786]: I0313 15:51:03.708187 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerStarted","Data":"93661a8398a9435e120b757ddf05bd5973b85b4db011effb16c858bf532699e7"} Mar 13 15:51:03 crc kubenswrapper[4786]: I0313 15:51:03.710034 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:51:04 crc kubenswrapper[4786]: I0313 15:51:04.718752 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerStarted","Data":"fd256b714e3469b588ef11a6f3e90c1fcc6b3e6c3b62b086524b22eb59375e9d"} Mar 13 15:51:05 crc kubenswrapper[4786]: I0313 15:51:05.893761 4786 generic.go:334] "Generic (PLEG): container finished" podID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerID="fd256b714e3469b588ef11a6f3e90c1fcc6b3e6c3b62b086524b22eb59375e9d" exitCode=0 Mar 13 15:51:05 crc kubenswrapper[4786]: I0313 15:51:05.893821 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerDied","Data":"fd256b714e3469b588ef11a6f3e90c1fcc6b3e6c3b62b086524b22eb59375e9d"} Mar 13 15:51:06 crc kubenswrapper[4786]: I0313 15:51:06.904938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerStarted","Data":"9a64ace85d333bcf2aec533d1a68fc4710246356932492535114fc3e5c0eef7f"} Mar 13 15:51:06 crc kubenswrapper[4786]: I0313 15:51:06.927324 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bcp88" podStartSLOduration=2.236767919 podStartE2EDuration="4.927302469s" podCreationTimestamp="2026-03-13 15:51:02 +0000 UTC" firstStartedPulling="2026-03-13 15:51:03.708221613 +0000 UTC m=+2893.871433454" lastFinishedPulling="2026-03-13 15:51:06.398756163 +0000 UTC m=+2896.561968004" observedRunningTime="2026-03-13 15:51:06.926362085 +0000 UTC m=+2897.089573896" watchObservedRunningTime="2026-03-13 15:51:06.927302469 +0000 UTC m=+2897.090514280" Mar 13 15:51:12 crc kubenswrapper[4786]: I0313 15:51:12.591908 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:12 crc kubenswrapper[4786]: I0313 15:51:12.592524 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:12 crc kubenswrapper[4786]: I0313 15:51:12.652768 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:13 crc kubenswrapper[4786]: I0313 15:51:13.012663 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:13 crc kubenswrapper[4786]: I0313 
15:51:13.070380 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcp88"] Mar 13 15:51:14 crc kubenswrapper[4786]: I0313 15:51:14.976241 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcp88" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="registry-server" containerID="cri-o://9a64ace85d333bcf2aec533d1a68fc4710246356932492535114fc3e5c0eef7f" gracePeriod=2 Mar 13 15:51:15 crc kubenswrapper[4786]: I0313 15:51:15.985653 4786 generic.go:334] "Generic (PLEG): container finished" podID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerID="9a64ace85d333bcf2aec533d1a68fc4710246356932492535114fc3e5c0eef7f" exitCode=0 Mar 13 15:51:15 crc kubenswrapper[4786]: I0313 15:51:15.985702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerDied","Data":"9a64ace85d333bcf2aec533d1a68fc4710246356932492535114fc3e5c0eef7f"} Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.120733 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.249668 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-utilities\") pod \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.249831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-catalog-content\") pod \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.249893 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfkk8\" (UniqueName: \"kubernetes.io/projected/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-kube-api-access-lfkk8\") pod \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\" (UID: \"157bf7ee-14a7-4a34-af0a-368bbe8aeba2\") " Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.250331 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-utilities" (OuterVolumeSpecName: "utilities") pod "157bf7ee-14a7-4a34-af0a-368bbe8aeba2" (UID: "157bf7ee-14a7-4a34-af0a-368bbe8aeba2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.256366 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-kube-api-access-lfkk8" (OuterVolumeSpecName: "kube-api-access-lfkk8") pod "157bf7ee-14a7-4a34-af0a-368bbe8aeba2" (UID: "157bf7ee-14a7-4a34-af0a-368bbe8aeba2"). InnerVolumeSpecName "kube-api-access-lfkk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.305054 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "157bf7ee-14a7-4a34-af0a-368bbe8aeba2" (UID: "157bf7ee-14a7-4a34-af0a-368bbe8aeba2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.350681 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.351036 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.351054 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfkk8\" (UniqueName: \"kubernetes.io/projected/157bf7ee-14a7-4a34-af0a-368bbe8aeba2-kube-api-access-lfkk8\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.998080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcp88" event={"ID":"157bf7ee-14a7-4a34-af0a-368bbe8aeba2","Type":"ContainerDied","Data":"93661a8398a9435e120b757ddf05bd5973b85b4db011effb16c858bf532699e7"} Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.998179 4786 scope.go:117] "RemoveContainer" containerID="9a64ace85d333bcf2aec533d1a68fc4710246356932492535114fc3e5c0eef7f" Mar 13 15:51:16 crc kubenswrapper[4786]: I0313 15:51:16.998214 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcp88" Mar 13 15:51:17 crc kubenswrapper[4786]: I0313 15:51:17.034295 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcp88"] Mar 13 15:51:17 crc kubenswrapper[4786]: I0313 15:51:17.042341 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcp88"] Mar 13 15:51:17 crc kubenswrapper[4786]: I0313 15:51:17.047939 4786 scope.go:117] "RemoveContainer" containerID="fd256b714e3469b588ef11a6f3e90c1fcc6b3e6c3b62b086524b22eb59375e9d" Mar 13 15:51:17 crc kubenswrapper[4786]: I0313 15:51:17.073505 4786 scope.go:117] "RemoveContainer" containerID="1866dda4ee73cda9f03a2c6475001b1ad51d0cc701c123bf9e1f11cf258613eb" Mar 13 15:51:18 crc kubenswrapper[4786]: I0313 15:51:18.567551 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" path="/var/lib/kubelet/pods/157bf7ee-14a7-4a34-af0a-368bbe8aeba2/volumes" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.826959 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dsxtq"] Mar 13 15:51:31 crc kubenswrapper[4786]: E0313 15:51:31.827756 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="extract-utilities" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.827771 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="extract-utilities" Mar 13 15:51:31 crc kubenswrapper[4786]: E0313 15:51:31.827780 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="extract-content" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.827785 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" 
containerName="extract-content" Mar 13 15:51:31 crc kubenswrapper[4786]: E0313 15:51:31.827820 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="registry-server" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.827827 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="registry-server" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.827965 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="157bf7ee-14a7-4a34-af0a-368bbe8aeba2" containerName="registry-server" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.828956 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.841069 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dsxtq"] Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.981198 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-catalog-content\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.981252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-utilities\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:31 crc kubenswrapper[4786]: I0313 15:51:31.981289 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tz4rd\" (UniqueName: \"kubernetes.io/projected/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-kube-api-access-tz4rd\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.083755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-catalog-content\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.083830 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-utilities\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.083876 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz4rd\" (UniqueName: \"kubernetes.io/projected/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-kube-api-access-tz4rd\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.084328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-catalog-content\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.084327 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-utilities\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.102341 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4rd\" (UniqueName: \"kubernetes.io/projected/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-kube-api-access-tz4rd\") pod \"certified-operators-dsxtq\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.152172 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:32 crc kubenswrapper[4786]: I0313 15:51:32.610483 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dsxtq"] Mar 13 15:51:33 crc kubenswrapper[4786]: I0313 15:51:33.158537 4786 generic.go:334] "Generic (PLEG): container finished" podID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerID="f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7" exitCode=0 Mar 13 15:51:33 crc kubenswrapper[4786]: I0313 15:51:33.158624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerDied","Data":"f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7"} Mar 13 15:51:33 crc kubenswrapper[4786]: I0313 15:51:33.158931 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerStarted","Data":"c85a0e9d1ce2b08b017aca539a7acd1d6990b0879206f551e05bdaa27f9c44d9"} Mar 13 15:51:34 crc kubenswrapper[4786]: I0313 15:51:34.171574 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerStarted","Data":"a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6"} Mar 13 15:51:35 crc kubenswrapper[4786]: I0313 15:51:35.182325 4786 generic.go:334] "Generic (PLEG): container finished" podID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerID="a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6" exitCode=0 Mar 13 15:51:35 crc kubenswrapper[4786]: I0313 15:51:35.182375 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerDied","Data":"a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6"} Mar 13 15:51:36 crc kubenswrapper[4786]: I0313 15:51:36.195029 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerStarted","Data":"73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613"} Mar 13 15:51:36 crc kubenswrapper[4786]: I0313 15:51:36.223486 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dsxtq" podStartSLOduration=2.688744512 podStartE2EDuration="5.223458704s" podCreationTimestamp="2026-03-13 15:51:31 +0000 UTC" firstStartedPulling="2026-03-13 15:51:33.161169572 +0000 UTC m=+2923.324381383" lastFinishedPulling="2026-03-13 15:51:35.695883764 +0000 UTC m=+2925.859095575" observedRunningTime="2026-03-13 15:51:36.217941854 +0000 UTC m=+2926.381153695" watchObservedRunningTime="2026-03-13 15:51:36.223458704 +0000 UTC m=+2926.386670555" Mar 13 15:51:37 crc kubenswrapper[4786]: I0313 15:51:37.868839 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:51:37 crc kubenswrapper[4786]: I0313 15:51:37.868924 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:51:42 crc kubenswrapper[4786]: I0313 15:51:42.153096 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:42 crc kubenswrapper[4786]: I0313 15:51:42.153499 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:42 crc kubenswrapper[4786]: I0313 15:51:42.202695 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:42 crc kubenswrapper[4786]: I0313 15:51:42.284824 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:42 crc kubenswrapper[4786]: I0313 15:51:42.441240 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dsxtq"] Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.265811 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dsxtq" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="registry-server" containerID="cri-o://73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613" gracePeriod=2 Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.647963 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.784365 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz4rd\" (UniqueName: \"kubernetes.io/projected/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-kube-api-access-tz4rd\") pod \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.784549 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-catalog-content\") pod \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.784661 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-utilities\") pod \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\" (UID: \"2d55f2e4-1fa3-49ca-b979-2c93797f81b5\") " Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.785719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-utilities" (OuterVolumeSpecName: "utilities") pod "2d55f2e4-1fa3-49ca-b979-2c93797f81b5" (UID: "2d55f2e4-1fa3-49ca-b979-2c93797f81b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.792585 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-kube-api-access-tz4rd" (OuterVolumeSpecName: "kube-api-access-tz4rd") pod "2d55f2e4-1fa3-49ca-b979-2c93797f81b5" (UID: "2d55f2e4-1fa3-49ca-b979-2c93797f81b5"). InnerVolumeSpecName "kube-api-access-tz4rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.863266 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d55f2e4-1fa3-49ca-b979-2c93797f81b5" (UID: "2d55f2e4-1fa3-49ca-b979-2c93797f81b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.864002 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rkxf"] Mar 13 15:51:44 crc kubenswrapper[4786]: E0313 15:51:44.864335 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="registry-server" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.864360 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="registry-server" Mar 13 15:51:44 crc kubenswrapper[4786]: E0313 15:51:44.864377 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="extract-content" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.864387 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="extract-content" Mar 13 15:51:44 crc kubenswrapper[4786]: E0313 15:51:44.864429 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="extract-utilities" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.864438 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="extract-utilities" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.864606 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerName="registry-server" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.866662 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.877773 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rkxf"] Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.887257 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz4rd\" (UniqueName: \"kubernetes.io/projected/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-kube-api-access-tz4rd\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.887305 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.887325 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d55f2e4-1fa3-49ca-b979-2c93797f81b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.988382 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swcp4\" (UniqueName: \"kubernetes.io/projected/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-kube-api-access-swcp4\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.988433 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-catalog-content\") pod \"redhat-marketplace-4rkxf\" (UID: 
\"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:44 crc kubenswrapper[4786]: I0313 15:51:44.988459 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-utilities\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.090052 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swcp4\" (UniqueName: \"kubernetes.io/projected/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-kube-api-access-swcp4\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.090118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-catalog-content\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.090145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-utilities\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.090634 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-utilities\") pod \"redhat-marketplace-4rkxf\" (UID: 
\"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.090761 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-catalog-content\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.121119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swcp4\" (UniqueName: \"kubernetes.io/projected/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-kube-api-access-swcp4\") pod \"redhat-marketplace-4rkxf\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.236908 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.275097 4786 generic.go:334] "Generic (PLEG): container finished" podID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" containerID="73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613" exitCode=0 Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.275144 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dsxtq" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.275148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerDied","Data":"73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613"} Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.275183 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsxtq" event={"ID":"2d55f2e4-1fa3-49ca-b979-2c93797f81b5","Type":"ContainerDied","Data":"c85a0e9d1ce2b08b017aca539a7acd1d6990b0879206f551e05bdaa27f9c44d9"} Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.275204 4786 scope.go:117] "RemoveContainer" containerID="73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.294340 4786 scope.go:117] "RemoveContainer" containerID="a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.308869 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dsxtq"] Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.315244 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dsxtq"] Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.354201 4786 scope.go:117] "RemoveContainer" containerID="f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.379263 4786 scope.go:117] "RemoveContainer" containerID="73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613" Mar 13 15:51:45 crc kubenswrapper[4786]: E0313 15:51:45.379829 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613\": container with ID starting with 73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613 not found: ID does not exist" containerID="73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.379909 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613"} err="failed to get container status \"73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613\": rpc error: code = NotFound desc = could not find container \"73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613\": container with ID starting with 73577edeeed2bf9249034c77896e8254484f91492a661b44a4c4ed9eabe40613 not found: ID does not exist" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.379937 4786 scope.go:117] "RemoveContainer" containerID="a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6" Mar 13 15:51:45 crc kubenswrapper[4786]: E0313 15:51:45.380260 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6\": container with ID starting with a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6 not found: ID does not exist" containerID="a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.380283 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6"} err="failed to get container status \"a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6\": rpc error: code = NotFound desc = could not find container \"a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6\": container with ID 
starting with a9894f0f27f7263c43b3509f4056af39a3fc4600a00ed57589a5e7efeb59eaf6 not found: ID does not exist" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.380304 4786 scope.go:117] "RemoveContainer" containerID="f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7" Mar 13 15:51:45 crc kubenswrapper[4786]: E0313 15:51:45.380682 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7\": container with ID starting with f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7 not found: ID does not exist" containerID="f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.380705 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7"} err="failed to get container status \"f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7\": rpc error: code = NotFound desc = could not find container \"f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7\": container with ID starting with f05f001cdfd92d9ed472bb4216e38db33764a44249d42fc8aec7edcc1b9ba4c7 not found: ID does not exist" Mar 13 15:51:45 crc kubenswrapper[4786]: I0313 15:51:45.516561 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rkxf"] Mar 13 15:51:46 crc kubenswrapper[4786]: I0313 15:51:46.292374 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerID="f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c" exitCode=0 Mar 13 15:51:46 crc kubenswrapper[4786]: I0313 15:51:46.292491 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rkxf" 
event={"ID":"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2","Type":"ContainerDied","Data":"f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c"} Mar 13 15:51:46 crc kubenswrapper[4786]: I0313 15:51:46.292530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rkxf" event={"ID":"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2","Type":"ContainerStarted","Data":"627f7a13d0bdc8705d1f2167c0647f366a440d5de21aaa9906cbe002079ea5be"} Mar 13 15:51:46 crc kubenswrapper[4786]: I0313 15:51:46.571502 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d55f2e4-1fa3-49ca-b979-2c93797f81b5" path="/var/lib/kubelet/pods/2d55f2e4-1fa3-49ca-b979-2c93797f81b5/volumes" Mar 13 15:51:48 crc kubenswrapper[4786]: I0313 15:51:48.313415 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerID="dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8" exitCode=0 Mar 13 15:51:48 crc kubenswrapper[4786]: I0313 15:51:48.313528 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rkxf" event={"ID":"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2","Type":"ContainerDied","Data":"dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8"} Mar 13 15:51:49 crc kubenswrapper[4786]: I0313 15:51:49.322361 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rkxf" event={"ID":"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2","Type":"ContainerStarted","Data":"2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f"} Mar 13 15:51:49 crc kubenswrapper[4786]: I0313 15:51:49.343781 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rkxf" podStartSLOduration=2.925713535 podStartE2EDuration="5.34376282s" podCreationTimestamp="2026-03-13 15:51:44 +0000 UTC" firstStartedPulling="2026-03-13 15:51:46.296281862 +0000 UTC 
m=+2936.459493673" lastFinishedPulling="2026-03-13 15:51:48.714331107 +0000 UTC m=+2938.877542958" observedRunningTime="2026-03-13 15:51:49.339265537 +0000 UTC m=+2939.502477358" watchObservedRunningTime="2026-03-13 15:51:49.34376282 +0000 UTC m=+2939.506974631" Mar 13 15:51:55 crc kubenswrapper[4786]: I0313 15:51:55.237455 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:55 crc kubenswrapper[4786]: I0313 15:51:55.237802 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:55 crc kubenswrapper[4786]: I0313 15:51:55.294059 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:55 crc kubenswrapper[4786]: I0313 15:51:55.412732 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:55 crc kubenswrapper[4786]: I0313 15:51:55.524247 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rkxf"] Mar 13 15:51:57 crc kubenswrapper[4786]: I0313 15:51:57.391697 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rkxf" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="registry-server" containerID="cri-o://2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f" gracePeriod=2 Mar 13 15:51:57 crc kubenswrapper[4786]: I0313 15:51:57.896225 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.079722 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swcp4\" (UniqueName: \"kubernetes.io/projected/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-kube-api-access-swcp4\") pod \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.079788 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-catalog-content\") pod \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.079909 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-utilities\") pod \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\" (UID: \"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2\") " Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.081331 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-utilities" (OuterVolumeSpecName: "utilities") pod "0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" (UID: "0c2b1e8f-3a07-4425-9db7-a87f1685e4a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.086391 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-kube-api-access-swcp4" (OuterVolumeSpecName: "kube-api-access-swcp4") pod "0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" (UID: "0c2b1e8f-3a07-4425-9db7-a87f1685e4a2"). InnerVolumeSpecName "kube-api-access-swcp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.133952 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" (UID: "0c2b1e8f-3a07-4425-9db7-a87f1685e4a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.182045 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swcp4\" (UniqueName: \"kubernetes.io/projected/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-kube-api-access-swcp4\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.182445 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.182604 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.401460 4786 generic.go:334] "Generic (PLEG): container finished" podID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerID="2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f" exitCode=0 Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.401524 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rkxf" event={"ID":"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2","Type":"ContainerDied","Data":"2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f"} Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.401541 4786 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rkxf" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.401579 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rkxf" event={"ID":"0c2b1e8f-3a07-4425-9db7-a87f1685e4a2","Type":"ContainerDied","Data":"627f7a13d0bdc8705d1f2167c0647f366a440d5de21aaa9906cbe002079ea5be"} Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.401614 4786 scope.go:117] "RemoveContainer" containerID="2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.430238 4786 scope.go:117] "RemoveContainer" containerID="dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.469009 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rkxf"] Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.470946 4786 scope.go:117] "RemoveContainer" containerID="f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.479696 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rkxf"] Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.488823 4786 scope.go:117] "RemoveContainer" containerID="2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f" Mar 13 15:51:58 crc kubenswrapper[4786]: E0313 15:51:58.489271 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f\": container with ID starting with 2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f not found: ID does not exist" containerID="2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.489314 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f"} err="failed to get container status \"2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f\": rpc error: code = NotFound desc = could not find container \"2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f\": container with ID starting with 2e4dd2a46548e797db33319c89745356ab3f2605f364c61ca82135e91993728f not found: ID does not exist" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.489339 4786 scope.go:117] "RemoveContainer" containerID="dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8" Mar 13 15:51:58 crc kubenswrapper[4786]: E0313 15:51:58.489728 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8\": container with ID starting with dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8 not found: ID does not exist" containerID="dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.489768 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8"} err="failed to get container status \"dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8\": rpc error: code = NotFound desc = could not find container \"dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8\": container with ID starting with dca702602a5d139b260a1700809894419aa4291fefa07b1deb79168983a1c1e8 not found: ID does not exist" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.489796 4786 scope.go:117] "RemoveContainer" containerID="f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c" Mar 13 15:51:58 crc kubenswrapper[4786]: E0313 
15:51:58.490130 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c\": container with ID starting with f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c not found: ID does not exist" containerID="f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.490161 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c"} err="failed to get container status \"f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c\": rpc error: code = NotFound desc = could not find container \"f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c\": container with ID starting with f9c00974d0d756cda5ac2a23f28b6b0245e4838a9be3947a88cd78d55919944c not found: ID does not exist" Mar 13 15:51:58 crc kubenswrapper[4786]: I0313 15:51:58.567059 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" path="/var/lib/kubelet/pods/0c2b1e8f-3a07-4425-9db7-a87f1685e4a2/volumes" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.142309 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556952-58dn5"] Mar 13 15:52:00 crc kubenswrapper[4786]: E0313 15:52:00.142628 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="registry-server" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.142642 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="registry-server" Mar 13 15:52:00 crc kubenswrapper[4786]: E0313 15:52:00.142661 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="extract-utilities" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.142667 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="extract-utilities" Mar 13 15:52:00 crc kubenswrapper[4786]: E0313 15:52:00.142676 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="extract-content" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.142683 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="extract-content" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.142828 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2b1e8f-3a07-4425-9db7-a87f1685e4a2" containerName="registry-server" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.143279 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.148978 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.149155 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.151936 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.154096 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556952-58dn5"] Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.212371 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v95zm\" (UniqueName: 
\"kubernetes.io/projected/758a50ff-5c59-4468-bb7f-ebaf5e25d80c-kube-api-access-v95zm\") pod \"auto-csr-approver-29556952-58dn5\" (UID: \"758a50ff-5c59-4468-bb7f-ebaf5e25d80c\") " pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.313458 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v95zm\" (UniqueName: \"kubernetes.io/projected/758a50ff-5c59-4468-bb7f-ebaf5e25d80c-kube-api-access-v95zm\") pod \"auto-csr-approver-29556952-58dn5\" (UID: \"758a50ff-5c59-4468-bb7f-ebaf5e25d80c\") " pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.336265 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v95zm\" (UniqueName: \"kubernetes.io/projected/758a50ff-5c59-4468-bb7f-ebaf5e25d80c-kube-api-access-v95zm\") pod \"auto-csr-approver-29556952-58dn5\" (UID: \"758a50ff-5c59-4468-bb7f-ebaf5e25d80c\") " pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.469991 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:00 crc kubenswrapper[4786]: I0313 15:52:00.944332 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556952-58dn5"] Mar 13 15:52:01 crc kubenswrapper[4786]: I0313 15:52:01.424310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556952-58dn5" event={"ID":"758a50ff-5c59-4468-bb7f-ebaf5e25d80c","Type":"ContainerStarted","Data":"19f07ee4c226f17c324a204defaac016530d7891f69d8278c75b457d922b68d7"} Mar 13 15:52:03 crc kubenswrapper[4786]: I0313 15:52:03.440952 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556952-58dn5" event={"ID":"758a50ff-5c59-4468-bb7f-ebaf5e25d80c","Type":"ContainerStarted","Data":"fae552754228dd64cd44e162424fb81298fcea53bf6516f4f77d9d33d5d88436"} Mar 13 15:52:03 crc kubenswrapper[4786]: I0313 15:52:03.460955 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556952-58dn5" podStartSLOduration=1.309143526 podStartE2EDuration="3.460930344s" podCreationTimestamp="2026-03-13 15:52:00 +0000 UTC" firstStartedPulling="2026-03-13 15:52:00.952113166 +0000 UTC m=+2951.115324987" lastFinishedPulling="2026-03-13 15:52:03.103899994 +0000 UTC m=+2953.267111805" observedRunningTime="2026-03-13 15:52:03.454540403 +0000 UTC m=+2953.617752234" watchObservedRunningTime="2026-03-13 15:52:03.460930344 +0000 UTC m=+2953.624142165" Mar 13 15:52:04 crc kubenswrapper[4786]: I0313 15:52:04.451959 4786 generic.go:334] "Generic (PLEG): container finished" podID="758a50ff-5c59-4468-bb7f-ebaf5e25d80c" containerID="fae552754228dd64cd44e162424fb81298fcea53bf6516f4f77d9d33d5d88436" exitCode=0 Mar 13 15:52:04 crc kubenswrapper[4786]: I0313 15:52:04.452029 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556952-58dn5" 
event={"ID":"758a50ff-5c59-4468-bb7f-ebaf5e25d80c","Type":"ContainerDied","Data":"fae552754228dd64cd44e162424fb81298fcea53bf6516f4f77d9d33d5d88436"} Mar 13 15:52:05 crc kubenswrapper[4786]: I0313 15:52:05.745365 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:05 crc kubenswrapper[4786]: I0313 15:52:05.896353 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v95zm\" (UniqueName: \"kubernetes.io/projected/758a50ff-5c59-4468-bb7f-ebaf5e25d80c-kube-api-access-v95zm\") pod \"758a50ff-5c59-4468-bb7f-ebaf5e25d80c\" (UID: \"758a50ff-5c59-4468-bb7f-ebaf5e25d80c\") " Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.151015 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758a50ff-5c59-4468-bb7f-ebaf5e25d80c-kube-api-access-v95zm" (OuterVolumeSpecName: "kube-api-access-v95zm") pod "758a50ff-5c59-4468-bb7f-ebaf5e25d80c" (UID: "758a50ff-5c59-4468-bb7f-ebaf5e25d80c"). InnerVolumeSpecName "kube-api-access-v95zm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.200995 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v95zm\" (UniqueName: \"kubernetes.io/projected/758a50ff-5c59-4468-bb7f-ebaf5e25d80c-kube-api-access-v95zm\") on node \"crc\" DevicePath \"\"" Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.468779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556952-58dn5" event={"ID":"758a50ff-5c59-4468-bb7f-ebaf5e25d80c","Type":"ContainerDied","Data":"19f07ee4c226f17c324a204defaac016530d7891f69d8278c75b457d922b68d7"} Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.469267 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f07ee4c226f17c324a204defaac016530d7891f69d8278c75b457d922b68d7" Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.468894 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556952-58dn5" Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.538951 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556946-8cmp5"] Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.543581 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556946-8cmp5"] Mar 13 15:52:06 crc kubenswrapper[4786]: I0313 15:52:06.562429 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357f52d1-dea9-4ac5-92fc-d0b472c67a2e" path="/var/lib/kubelet/pods/357f52d1-dea9-4ac5-92fc-d0b472c67a2e/volumes" Mar 13 15:52:07 crc kubenswrapper[4786]: I0313 15:52:07.869159 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:52:07 crc kubenswrapper[4786]: I0313 15:52:07.869233 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:52:11 crc kubenswrapper[4786]: I0313 15:52:11.411308 4786 scope.go:117] "RemoveContainer" containerID="165cc1d4d93cd5548350ef4dae0dfb68ec53e818b5fddac73e506cf50a66cef3" Mar 13 15:52:37 crc kubenswrapper[4786]: I0313 15:52:37.868204 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:52:37 crc kubenswrapper[4786]: I0313 15:52:37.868773 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:52:37 crc kubenswrapper[4786]: I0313 15:52:37.868825 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 15:52:37 crc kubenswrapper[4786]: I0313 15:52:37.869495 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2434f68ff036d4c7e88066d4c9206f1ac88f25c420d13e8648ea6e86bcd2c9ac"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
13 15:52:37 crc kubenswrapper[4786]: I0313 15:52:37.869557 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://2434f68ff036d4c7e88066d4c9206f1ac88f25c420d13e8648ea6e86bcd2c9ac" gracePeriod=600 Mar 13 15:52:38 crc kubenswrapper[4786]: I0313 15:52:38.742846 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="2434f68ff036d4c7e88066d4c9206f1ac88f25c420d13e8648ea6e86bcd2c9ac" exitCode=0 Mar 13 15:52:38 crc kubenswrapper[4786]: I0313 15:52:38.742917 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"2434f68ff036d4c7e88066d4c9206f1ac88f25c420d13e8648ea6e86bcd2c9ac"} Mar 13 15:52:38 crc kubenswrapper[4786]: I0313 15:52:38.743284 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"} Mar 13 15:52:38 crc kubenswrapper[4786]: I0313 15:52:38.743316 4786 scope.go:117] "RemoveContainer" containerID="8200b2c87f0457579718a1dabb596bc49fd71735a13b8e71af44f6cdcf4937ec" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.148407 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556954-r4cpp"] Mar 13 15:54:00 crc kubenswrapper[4786]: E0313 15:54:00.150999 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758a50ff-5c59-4468-bb7f-ebaf5e25d80c" containerName="oc" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.151813 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="758a50ff-5c59-4468-bb7f-ebaf5e25d80c" containerName="oc" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.152117 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="758a50ff-5c59-4468-bb7f-ebaf5e25d80c" containerName="oc" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.155511 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.157397 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.157995 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.158056 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.159768 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556954-r4cpp"] Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.257839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nktw\" (UniqueName: \"kubernetes.io/projected/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9-kube-api-access-2nktw\") pod \"auto-csr-approver-29556954-r4cpp\" (UID: \"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9\") " pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.359378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nktw\" (UniqueName: \"kubernetes.io/projected/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9-kube-api-access-2nktw\") pod \"auto-csr-approver-29556954-r4cpp\" (UID: \"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9\") " 
pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.383524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nktw\" (UniqueName: \"kubernetes.io/projected/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9-kube-api-access-2nktw\") pod \"auto-csr-approver-29556954-r4cpp\" (UID: \"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9\") " pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.471003 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:00 crc kubenswrapper[4786]: I0313 15:54:00.925412 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556954-r4cpp"] Mar 13 15:54:01 crc kubenswrapper[4786]: I0313 15:54:01.426188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" event={"ID":"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9","Type":"ContainerStarted","Data":"2638a2287e75cba2c38c3be30bba026b0df85f511ca8e0c39fe1ea483dc425ad"} Mar 13 15:54:02 crc kubenswrapper[4786]: I0313 15:54:02.434307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" event={"ID":"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9","Type":"ContainerStarted","Data":"5afaf18485b511642d3176a7447c825eee0ab4e07b97ba4bbb2f90e5a21eb1e5"} Mar 13 15:54:03 crc kubenswrapper[4786]: I0313 15:54:03.449878 4786 generic.go:334] "Generic (PLEG): container finished" podID="ed033ef8-d2cc-453f-bf4f-e2f36c6caac9" containerID="5afaf18485b511642d3176a7447c825eee0ab4e07b97ba4bbb2f90e5a21eb1e5" exitCode=0 Mar 13 15:54:03 crc kubenswrapper[4786]: I0313 15:54:03.449913 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" 
event={"ID":"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9","Type":"ContainerDied","Data":"5afaf18485b511642d3176a7447c825eee0ab4e07b97ba4bbb2f90e5a21eb1e5"} Mar 13 15:54:04 crc kubenswrapper[4786]: I0313 15:54:04.865616 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:04 crc kubenswrapper[4786]: I0313 15:54:04.951553 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nktw\" (UniqueName: \"kubernetes.io/projected/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9-kube-api-access-2nktw\") pod \"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9\" (UID: \"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9\") " Mar 13 15:54:04 crc kubenswrapper[4786]: I0313 15:54:04.962561 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9-kube-api-access-2nktw" (OuterVolumeSpecName: "kube-api-access-2nktw") pod "ed033ef8-d2cc-453f-bf4f-e2f36c6caac9" (UID: "ed033ef8-d2cc-453f-bf4f-e2f36c6caac9"). InnerVolumeSpecName "kube-api-access-2nktw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:54:05 crc kubenswrapper[4786]: I0313 15:54:05.053330 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nktw\" (UniqueName: \"kubernetes.io/projected/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9-kube-api-access-2nktw\") on node \"crc\" DevicePath \"\"" Mar 13 15:54:05 crc kubenswrapper[4786]: I0313 15:54:05.470670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" event={"ID":"ed033ef8-d2cc-453f-bf4f-e2f36c6caac9","Type":"ContainerDied","Data":"2638a2287e75cba2c38c3be30bba026b0df85f511ca8e0c39fe1ea483dc425ad"} Mar 13 15:54:05 crc kubenswrapper[4786]: I0313 15:54:05.470729 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2638a2287e75cba2c38c3be30bba026b0df85f511ca8e0c39fe1ea483dc425ad" Mar 13 15:54:05 crc kubenswrapper[4786]: I0313 15:54:05.470731 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556954-r4cpp" Mar 13 15:54:05 crc kubenswrapper[4786]: I0313 15:54:05.942436 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556948-9j5fz"] Mar 13 15:54:05 crc kubenswrapper[4786]: I0313 15:54:05.948906 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556948-9j5fz"] Mar 13 15:54:06 crc kubenswrapper[4786]: I0313 15:54:06.559616 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99335794-cabf-4d18-be56-302a98c40a5a" path="/var/lib/kubelet/pods/99335794-cabf-4d18-be56-302a98c40a5a/volumes" Mar 13 15:54:11 crc kubenswrapper[4786]: I0313 15:54:11.579769 4786 scope.go:117] "RemoveContainer" containerID="fb73e3bdb5a956b279c100e6ba90c771d55392c61026c14d06cf5014942cbd61" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.675601 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-lpfkr"] Mar 13 15:55:01 crc kubenswrapper[4786]: E0313 15:55:01.676815 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed033ef8-d2cc-453f-bf4f-e2f36c6caac9" containerName="oc" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.676853 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed033ef8-d2cc-453f-bf4f-e2f36c6caac9" containerName="oc" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.677240 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed033ef8-d2cc-453f-bf4f-e2f36c6caac9" containerName="oc" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.679497 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.695635 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpfkr"] Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.875661 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8xs\" (UniqueName: \"kubernetes.io/projected/bc431c71-d6df-4dbd-99a8-3c783c085180-kube-api-access-bf8xs\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.875740 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-catalog-content\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.876036 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-utilities\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.977482 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8xs\" (UniqueName: \"kubernetes.io/projected/bc431c71-d6df-4dbd-99a8-3c783c085180-kube-api-access-bf8xs\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.977552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-catalog-content\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.977619 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-utilities\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.978212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-catalog-content\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:01 crc kubenswrapper[4786]: I0313 15:55:01.978265 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-utilities\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:02 crc kubenswrapper[4786]: I0313 15:55:02.007730 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8xs\" (UniqueName: \"kubernetes.io/projected/bc431c71-d6df-4dbd-99a8-3c783c085180-kube-api-access-bf8xs\") pod \"redhat-operators-lpfkr\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:02 crc kubenswrapper[4786]: I0313 15:55:02.016040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:02 crc kubenswrapper[4786]: I0313 15:55:02.283211 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpfkr"] Mar 13 15:55:02 crc kubenswrapper[4786]: I0313 15:55:02.951981 4786 generic.go:334] "Generic (PLEG): container finished" podID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerID="17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea" exitCode=0 Mar 13 15:55:02 crc kubenswrapper[4786]: I0313 15:55:02.952049 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpfkr" event={"ID":"bc431c71-d6df-4dbd-99a8-3c783c085180","Type":"ContainerDied","Data":"17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea"} Mar 13 15:55:02 crc kubenswrapper[4786]: I0313 15:55:02.952298 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpfkr" event={"ID":"bc431c71-d6df-4dbd-99a8-3c783c085180","Type":"ContainerStarted","Data":"0db231acb4ebde3a0c09ce3da73911fb90ca67e241c87db3fc835e4a774198f4"} Mar 13 15:55:05 crc kubenswrapper[4786]: I0313 15:55:05.975144 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerID="a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0" exitCode=0 Mar 13 15:55:05 crc kubenswrapper[4786]: I0313 15:55:05.976319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpfkr" event={"ID":"bc431c71-d6df-4dbd-99a8-3c783c085180","Type":"ContainerDied","Data":"a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0"} Mar 13 15:55:06 crc kubenswrapper[4786]: I0313 15:55:06.988051 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpfkr" event={"ID":"bc431c71-d6df-4dbd-99a8-3c783c085180","Type":"ContainerStarted","Data":"1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c"} Mar 13 15:55:07 crc kubenswrapper[4786]: I0313 15:55:07.009897 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lpfkr" podStartSLOduration=2.583202787 podStartE2EDuration="6.009831905s" podCreationTimestamp="2026-03-13 15:55:01 +0000 UTC" firstStartedPulling="2026-03-13 15:55:02.95357811 +0000 UTC m=+3133.116789921" lastFinishedPulling="2026-03-13 15:55:06.380207228 +0000 UTC m=+3136.543419039" observedRunningTime="2026-03-13 15:55:07.005507916 +0000 UTC m=+3137.168719737" watchObservedRunningTime="2026-03-13 15:55:07.009831905 +0000 UTC m=+3137.173043756" Mar 13 15:55:07 crc kubenswrapper[4786]: I0313 15:55:07.868197 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:55:07 crc kubenswrapper[4786]: I0313 15:55:07.868513 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:55:12 crc kubenswrapper[4786]: I0313 15:55:12.016350 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:12 crc kubenswrapper[4786]: I0313 15:55:12.017741 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:13 crc kubenswrapper[4786]: I0313 15:55:13.087162 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lpfkr" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="registry-server" probeResult="failure" output=< Mar 13 15:55:13 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 15:55:13 crc kubenswrapper[4786]: > Mar 13 15:55:22 crc kubenswrapper[4786]: I0313 15:55:22.086250 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:22 crc kubenswrapper[4786]: I0313 15:55:22.149766 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:22 crc kubenswrapper[4786]: I0313 15:55:22.331938 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpfkr"] Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.127784 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lpfkr" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="registry-server" containerID="cri-o://1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c" gracePeriod=2 Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.569162 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lpfkr" Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.711730 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-catalog-content\") pod \"bc431c71-d6df-4dbd-99a8-3c783c085180\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.711795 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-utilities\") pod \"bc431c71-d6df-4dbd-99a8-3c783c085180\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.711936 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf8xs\" (UniqueName: \"kubernetes.io/projected/bc431c71-d6df-4dbd-99a8-3c783c085180-kube-api-access-bf8xs\") pod \"bc431c71-d6df-4dbd-99a8-3c783c085180\" (UID: \"bc431c71-d6df-4dbd-99a8-3c783c085180\") " Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.712897 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-utilities" (OuterVolumeSpecName: "utilities") pod "bc431c71-d6df-4dbd-99a8-3c783c085180" (UID: "bc431c71-d6df-4dbd-99a8-3c783c085180"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.716564 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.730022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc431c71-d6df-4dbd-99a8-3c783c085180-kube-api-access-bf8xs" (OuterVolumeSpecName: "kube-api-access-bf8xs") pod "bc431c71-d6df-4dbd-99a8-3c783c085180" (UID: "bc431c71-d6df-4dbd-99a8-3c783c085180"). InnerVolumeSpecName "kube-api-access-bf8xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.817783 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf8xs\" (UniqueName: \"kubernetes.io/projected/bc431c71-d6df-4dbd-99a8-3c783c085180-kube-api-access-bf8xs\") on node \"crc\" DevicePath \"\"" Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.867750 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc431c71-d6df-4dbd-99a8-3c783c085180" (UID: "bc431c71-d6df-4dbd-99a8-3c783c085180"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:55:23 crc kubenswrapper[4786]: I0313 15:55:23.919565 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc431c71-d6df-4dbd-99a8-3c783c085180-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.137820 4786 generic.go:334] "Generic (PLEG): container finished" podID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerID="1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c" exitCode=0 Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.137877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpfkr" event={"ID":"bc431c71-d6df-4dbd-99a8-3c783c085180","Type":"ContainerDied","Data":"1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c"} Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.137904 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpfkr" event={"ID":"bc431c71-d6df-4dbd-99a8-3c783c085180","Type":"ContainerDied","Data":"0db231acb4ebde3a0c09ce3da73911fb90ca67e241c87db3fc835e4a774198f4"} Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.137920 4786 scope.go:117] "RemoveContainer" containerID="1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c" Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.137981 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lpfkr"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.156414 4786 scope.go:117] "RemoveContainer" containerID="a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.174625 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpfkr"]
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.179682 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lpfkr"]
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.199657 4786 scope.go:117] "RemoveContainer" containerID="17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.218309 4786 scope.go:117] "RemoveContainer" containerID="1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c"
Mar 13 15:55:24 crc kubenswrapper[4786]: E0313 15:55:24.218726 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c\": container with ID starting with 1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c not found: ID does not exist" containerID="1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.218773 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c"} err="failed to get container status \"1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c\": rpc error: code = NotFound desc = could not find container \"1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c\": container with ID starting with 1fe8115e34e41ce52904ca03369e408a13cf4668ab304488213f2e77036f8f1c not found: ID does not exist"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.218800 4786 scope.go:117] "RemoveContainer" containerID="a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0"
Mar 13 15:55:24 crc kubenswrapper[4786]: E0313 15:55:24.219278 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0\": container with ID starting with a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0 not found: ID does not exist" containerID="a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.219310 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0"} err="failed to get container status \"a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0\": rpc error: code = NotFound desc = could not find container \"a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0\": container with ID starting with a8c2da53b9f429759ad650ecd197d7d4a5f1e6a9d56ba50b930e97c2267e1bc0 not found: ID does not exist"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.219334 4786 scope.go:117] "RemoveContainer" containerID="17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea"
Mar 13 15:55:24 crc kubenswrapper[4786]: E0313 15:55:24.219639 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea\": container with ID starting with 17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea not found: ID does not exist" containerID="17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.219666 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea"} err="failed to get container status \"17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea\": rpc error: code = NotFound desc = could not find container \"17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea\": container with ID starting with 17bf6a05e11fe840dadd7db6dcc7c7685f545d7a4e50e3f049b18d877565bfea not found: ID does not exist"
Mar 13 15:55:24 crc kubenswrapper[4786]: I0313 15:55:24.567553 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" path="/var/lib/kubelet/pods/bc431c71-d6df-4dbd-99a8-3c783c085180/volumes"
Mar 13 15:55:37 crc kubenswrapper[4786]: I0313 15:55:37.868547 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:55:37 crc kubenswrapper[4786]: I0313 15:55:37.869163 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.159740 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556956-dbfgh"]
Mar 13 15:56:00 crc kubenswrapper[4786]: E0313 15:56:00.160657 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="extract-content"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.160682 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="extract-content"
Mar 13 15:56:00 crc kubenswrapper[4786]: E0313 15:56:00.160702 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="extract-utilities"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.160710 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="extract-utilities"
Mar 13 15:56:00 crc kubenswrapper[4786]: E0313 15:56:00.160731 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="registry-server"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.160739 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="registry-server"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.160937 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc431c71-d6df-4dbd-99a8-3c783c085180" containerName="registry-server"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.161557 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.163714 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.163778 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.164489 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.172419 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556956-dbfgh"]
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.182668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5bj\" (UniqueName: \"kubernetes.io/projected/45d2b22d-3a12-4982-9393-af2a24093bcb-kube-api-access-5j5bj\") pod \"auto-csr-approver-29556956-dbfgh\" (UID: \"45d2b22d-3a12-4982-9393-af2a24093bcb\") " pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.284152 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j5bj\" (UniqueName: \"kubernetes.io/projected/45d2b22d-3a12-4982-9393-af2a24093bcb-kube-api-access-5j5bj\") pod \"auto-csr-approver-29556956-dbfgh\" (UID: \"45d2b22d-3a12-4982-9393-af2a24093bcb\") " pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.303188 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5bj\" (UniqueName: \"kubernetes.io/projected/45d2b22d-3a12-4982-9393-af2a24093bcb-kube-api-access-5j5bj\") pod \"auto-csr-approver-29556956-dbfgh\" (UID: \"45d2b22d-3a12-4982-9393-af2a24093bcb\") " pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.486692 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:00 crc kubenswrapper[4786]: I0313 15:56:00.912125 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556956-dbfgh"]
Mar 13 15:56:01 crc kubenswrapper[4786]: I0313 15:56:01.440427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556956-dbfgh" event={"ID":"45d2b22d-3a12-4982-9393-af2a24093bcb","Type":"ContainerStarted","Data":"35223aabe9c338a4ccde95dc3d6ec4fa1914c57537d91b7b6f6175d4d6e32e0a"}
Mar 13 15:56:02 crc kubenswrapper[4786]: I0313 15:56:02.449829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556956-dbfgh" event={"ID":"45d2b22d-3a12-4982-9393-af2a24093bcb","Type":"ContainerStarted","Data":"01cdae5dc0f3126b1a1f0c5e9d2771fb0b7eb3a1339193bd9e4e20f02a31f848"}
Mar 13 15:56:02 crc kubenswrapper[4786]: I0313 15:56:02.469141 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556956-dbfgh" podStartSLOduration=1.453697407 podStartE2EDuration="2.469120483s" podCreationTimestamp="2026-03-13 15:56:00 +0000 UTC" firstStartedPulling="2026-03-13 15:56:00.92027587 +0000 UTC m=+3191.083487681" lastFinishedPulling="2026-03-13 15:56:01.935698936 +0000 UTC m=+3192.098910757" observedRunningTime="2026-03-13 15:56:02.466816925 +0000 UTC m=+3192.630028736" watchObservedRunningTime="2026-03-13 15:56:02.469120483 +0000 UTC m=+3192.632332304"
Mar 13 15:56:03 crc kubenswrapper[4786]: I0313 15:56:03.458065 4786 generic.go:334] "Generic (PLEG): container finished" podID="45d2b22d-3a12-4982-9393-af2a24093bcb" containerID="01cdae5dc0f3126b1a1f0c5e9d2771fb0b7eb3a1339193bd9e4e20f02a31f848" exitCode=0
Mar 13 15:56:03 crc kubenswrapper[4786]: I0313 15:56:03.458114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556956-dbfgh" event={"ID":"45d2b22d-3a12-4982-9393-af2a24093bcb","Type":"ContainerDied","Data":"01cdae5dc0f3126b1a1f0c5e9d2771fb0b7eb3a1339193bd9e4e20f02a31f848"}
Mar 13 15:56:04 crc kubenswrapper[4786]: I0313 15:56:04.774807 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:04 crc kubenswrapper[4786]: I0313 15:56:04.947529 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j5bj\" (UniqueName: \"kubernetes.io/projected/45d2b22d-3a12-4982-9393-af2a24093bcb-kube-api-access-5j5bj\") pod \"45d2b22d-3a12-4982-9393-af2a24093bcb\" (UID: \"45d2b22d-3a12-4982-9393-af2a24093bcb\") "
Mar 13 15:56:04 crc kubenswrapper[4786]: I0313 15:56:04.952829 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d2b22d-3a12-4982-9393-af2a24093bcb-kube-api-access-5j5bj" (OuterVolumeSpecName: "kube-api-access-5j5bj") pod "45d2b22d-3a12-4982-9393-af2a24093bcb" (UID: "45d2b22d-3a12-4982-9393-af2a24093bcb"). InnerVolumeSpecName "kube-api-access-5j5bj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:56:05 crc kubenswrapper[4786]: I0313 15:56:05.051259 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j5bj\" (UniqueName: \"kubernetes.io/projected/45d2b22d-3a12-4982-9393-af2a24093bcb-kube-api-access-5j5bj\") on node \"crc\" DevicePath \"\""
Mar 13 15:56:05 crc kubenswrapper[4786]: I0313 15:56:05.474378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556956-dbfgh" event={"ID":"45d2b22d-3a12-4982-9393-af2a24093bcb","Type":"ContainerDied","Data":"35223aabe9c338a4ccde95dc3d6ec4fa1914c57537d91b7b6f6175d4d6e32e0a"}
Mar 13 15:56:05 crc kubenswrapper[4786]: I0313 15:56:05.474438 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35223aabe9c338a4ccde95dc3d6ec4fa1914c57537d91b7b6f6175d4d6e32e0a"
Mar 13 15:56:05 crc kubenswrapper[4786]: I0313 15:56:05.474465 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556956-dbfgh"
Mar 13 15:56:05 crc kubenswrapper[4786]: I0313 15:56:05.539891 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556950-mzkqk"]
Mar 13 15:56:05 crc kubenswrapper[4786]: I0313 15:56:05.544570 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556950-mzkqk"]
Mar 13 15:56:06 crc kubenswrapper[4786]: I0313 15:56:06.560160 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fdd75d-0d84-46b3-b590-259ed2254d5e" path="/var/lib/kubelet/pods/38fdd75d-0d84-46b3-b590-259ed2254d5e/volumes"
Mar 13 15:56:07 crc kubenswrapper[4786]: I0313 15:56:07.869115 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:56:07 crc kubenswrapper[4786]: I0313 15:56:07.869176 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:56:07 crc kubenswrapper[4786]: I0313 15:56:07.869224 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 15:56:07 crc kubenswrapper[4786]: I0313 15:56:07.870559 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:56:07 crc kubenswrapper[4786]: I0313 15:56:07.870624 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" gracePeriod=600
Mar 13 15:56:08 crc kubenswrapper[4786]: E0313 15:56:08.010393 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:56:08 crc kubenswrapper[4786]: I0313 15:56:08.501699 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" exitCode=0
Mar 13 15:56:08 crc kubenswrapper[4786]: I0313 15:56:08.501761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"}
Mar 13 15:56:08 crc kubenswrapper[4786]: I0313 15:56:08.501811 4786 scope.go:117] "RemoveContainer" containerID="2434f68ff036d4c7e88066d4c9206f1ac88f25c420d13e8648ea6e86bcd2c9ac"
Mar 13 15:56:08 crc kubenswrapper[4786]: I0313 15:56:08.502423 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:56:08 crc kubenswrapper[4786]: E0313 15:56:08.502742 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:56:11 crc kubenswrapper[4786]: I0313 15:56:11.676351 4786 scope.go:117] "RemoveContainer" containerID="c3497af5622a651e0755047c3ef44f492969cf2b074e92b4979e18f9ff0b898e"
Mar 13 15:56:20 crc kubenswrapper[4786]: I0313 15:56:20.557559 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:56:20 crc kubenswrapper[4786]: E0313 15:56:20.559722 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:56:32 crc kubenswrapper[4786]: I0313 15:56:32.551713 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:56:32 crc kubenswrapper[4786]: E0313 15:56:32.552319 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:56:46 crc kubenswrapper[4786]: I0313 15:56:46.552984 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:56:46 crc kubenswrapper[4786]: E0313 15:56:46.556180 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:57:00 crc kubenswrapper[4786]: I0313 15:57:00.555971 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:57:00 crc kubenswrapper[4786]: E0313 15:57:00.556731 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:57:11 crc kubenswrapper[4786]: I0313 15:57:11.552478 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:57:11 crc kubenswrapper[4786]: E0313 15:57:11.553227 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:57:24 crc kubenswrapper[4786]: I0313 15:57:24.552679 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:57:24 crc kubenswrapper[4786]: E0313 15:57:24.553587 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:57:38 crc kubenswrapper[4786]: I0313 15:57:38.552651 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:57:38 crc kubenswrapper[4786]: E0313 15:57:38.553574 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:57:50 crc kubenswrapper[4786]: I0313 15:57:50.558091 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:57:50 crc kubenswrapper[4786]: E0313 15:57:50.559542 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.165320 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556958-fg8l9"]
Mar 13 15:58:00 crc kubenswrapper[4786]: E0313 15:58:00.166232 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d2b22d-3a12-4982-9393-af2a24093bcb" containerName="oc"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.166245 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d2b22d-3a12-4982-9393-af2a24093bcb" containerName="oc"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.166428 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d2b22d-3a12-4982-9393-af2a24093bcb" containerName="oc"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.166928 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.169230 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.169306 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.171016 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.184126 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556958-fg8l9"]
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.227432 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9btw\" (UniqueName: \"kubernetes.io/projected/f0d4f449-32f3-4d33-ac5f-d32fba63c622-kube-api-access-g9btw\") pod \"auto-csr-approver-29556958-fg8l9\" (UID: \"f0d4f449-32f3-4d33-ac5f-d32fba63c622\") " pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.328927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9btw\" (UniqueName: \"kubernetes.io/projected/f0d4f449-32f3-4d33-ac5f-d32fba63c622-kube-api-access-g9btw\") pod \"auto-csr-approver-29556958-fg8l9\" (UID: \"f0d4f449-32f3-4d33-ac5f-d32fba63c622\") " pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.352620 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9btw\" (UniqueName: \"kubernetes.io/projected/f0d4f449-32f3-4d33-ac5f-d32fba63c622-kube-api-access-g9btw\") pod \"auto-csr-approver-29556958-fg8l9\" (UID: \"f0d4f449-32f3-4d33-ac5f-d32fba63c622\") " pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.486367 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.922388 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556958-fg8l9"]
Mar 13 15:58:00 crc kubenswrapper[4786]: I0313 15:58:00.924343 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 15:58:01 crc kubenswrapper[4786]: I0313 15:58:01.436356 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556958-fg8l9" event={"ID":"f0d4f449-32f3-4d33-ac5f-d32fba63c622","Type":"ContainerStarted","Data":"c217b93aa6eb34e989c42f0226d39a689674882ee0dbb5aba74611ac575f2131"}
Mar 13 15:58:03 crc kubenswrapper[4786]: I0313 15:58:03.459380 4786 generic.go:334] "Generic (PLEG): container finished" podID="f0d4f449-32f3-4d33-ac5f-d32fba63c622" containerID="2b0b366dcc57062c136dc6c196cc3ab4ea9eb59bb75dc6e39711ae663d1c713d" exitCode=0
Mar 13 15:58:03 crc kubenswrapper[4786]: I0313 15:58:03.459505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556958-fg8l9" event={"ID":"f0d4f449-32f3-4d33-ac5f-d32fba63c622","Type":"ContainerDied","Data":"2b0b366dcc57062c136dc6c196cc3ab4ea9eb59bb75dc6e39711ae663d1c713d"}
Mar 13 15:58:03 crc kubenswrapper[4786]: I0313 15:58:03.551984 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:58:03 crc kubenswrapper[4786]: E0313 15:58:03.552392 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:58:04 crc kubenswrapper[4786]: I0313 15:58:04.806148 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:04 crc kubenswrapper[4786]: I0313 15:58:04.909417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9btw\" (UniqueName: \"kubernetes.io/projected/f0d4f449-32f3-4d33-ac5f-d32fba63c622-kube-api-access-g9btw\") pod \"f0d4f449-32f3-4d33-ac5f-d32fba63c622\" (UID: \"f0d4f449-32f3-4d33-ac5f-d32fba63c622\") "
Mar 13 15:58:04 crc kubenswrapper[4786]: I0313 15:58:04.914901 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d4f449-32f3-4d33-ac5f-d32fba63c622-kube-api-access-g9btw" (OuterVolumeSpecName: "kube-api-access-g9btw") pod "f0d4f449-32f3-4d33-ac5f-d32fba63c622" (UID: "f0d4f449-32f3-4d33-ac5f-d32fba63c622"). InnerVolumeSpecName "kube-api-access-g9btw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:58:05 crc kubenswrapper[4786]: I0313 15:58:05.011262 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9btw\" (UniqueName: \"kubernetes.io/projected/f0d4f449-32f3-4d33-ac5f-d32fba63c622-kube-api-access-g9btw\") on node \"crc\" DevicePath \"\""
Mar 13 15:58:05 crc kubenswrapper[4786]: I0313 15:58:05.473493 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556958-fg8l9" event={"ID":"f0d4f449-32f3-4d33-ac5f-d32fba63c622","Type":"ContainerDied","Data":"c217b93aa6eb34e989c42f0226d39a689674882ee0dbb5aba74611ac575f2131"}
Mar 13 15:58:05 crc kubenswrapper[4786]: I0313 15:58:05.473527 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556958-fg8l9"
Mar 13 15:58:05 crc kubenswrapper[4786]: I0313 15:58:05.473532 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c217b93aa6eb34e989c42f0226d39a689674882ee0dbb5aba74611ac575f2131"
Mar 13 15:58:05 crc kubenswrapper[4786]: I0313 15:58:05.885222 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556952-58dn5"]
Mar 13 15:58:05 crc kubenswrapper[4786]: I0313 15:58:05.892376 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556952-58dn5"]
Mar 13 15:58:06 crc kubenswrapper[4786]: I0313 15:58:06.561248 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758a50ff-5c59-4468-bb7f-ebaf5e25d80c" path="/var/lib/kubelet/pods/758a50ff-5c59-4468-bb7f-ebaf5e25d80c/volumes"
Mar 13 15:58:11 crc kubenswrapper[4786]: I0313 15:58:11.779534 4786 scope.go:117] "RemoveContainer" containerID="fae552754228dd64cd44e162424fb81298fcea53bf6516f4f77d9d33d5d88436"
Mar 13 15:58:14 crc kubenswrapper[4786]: I0313 15:58:14.552204 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:58:14 crc kubenswrapper[4786]: E0313 15:58:14.553292 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:58:25 crc kubenswrapper[4786]: I0313 15:58:25.552705 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:58:25 crc kubenswrapper[4786]: E0313 15:58:25.553724 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:58:37 crc kubenswrapper[4786]: I0313 15:58:37.552515 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:58:37 crc kubenswrapper[4786]: E0313 15:58:37.553486 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:58:51 crc kubenswrapper[4786]: I0313 15:58:51.551847 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:58:51 crc kubenswrapper[4786]: E0313 15:58:51.554177 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:59:05 crc kubenswrapper[4786]: I0313 15:59:05.551858 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:59:05 crc kubenswrapper[4786]: E0313 15:59:05.552635 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:59:19 crc kubenswrapper[4786]: I0313 15:59:19.552583 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:59:19 crc kubenswrapper[4786]: E0313 15:59:19.553648 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:59:30 crc kubenswrapper[4786]: I0313 15:59:30.555814 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:59:30 crc kubenswrapper[4786]: E0313 15:59:30.556580 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:59:45 crc kubenswrapper[4786]: I0313 15:59:45.554503 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:59:45 crc kubenswrapper[4786]: E0313 15:59:45.556522 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 15:59:59 crc kubenswrapper[4786]: I0313 15:59:59.552459 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 15:59:59 crc kubenswrapper[4786]: E0313 15:59:59.553584 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.161080 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr"]
Mar 13 16:00:00 crc kubenswrapper[4786]: E0313 16:00:00.161552 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d4f449-32f3-4d33-ac5f-d32fba63c622" containerName="oc"
Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.161581 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d4f449-32f3-4d33-ac5f-d32fba63c622" containerName="oc"
Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.161835 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d4f449-32f3-4d33-ac5f-d32fba63c622" containerName="oc"
Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.162641 4786 util.go:30] "No sandbox for pod
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.164789 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.165085 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.172964 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556960-thhvl"] Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.174094 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.175754 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.176100 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.177130 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.183044 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556960-thhvl"] Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.189604 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr"] Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.222071 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjvjf\" (UniqueName: 
\"kubernetes.io/projected/142a1d24-73ca-476f-92c9-0983ac69687b-kube-api-access-gjvjf\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.222142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvj5\" (UniqueName: \"kubernetes.io/projected/4da27dfa-2fc7-4e23-9115-d73b09eebb8e-kube-api-access-hmvj5\") pod \"auto-csr-approver-29556960-thhvl\" (UID: \"4da27dfa-2fc7-4e23-9115-d73b09eebb8e\") " pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.222182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/142a1d24-73ca-476f-92c9-0983ac69687b-secret-volume\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.222304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142a1d24-73ca-476f-92c9-0983ac69687b-config-volume\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.323914 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjvjf\" (UniqueName: \"kubernetes.io/projected/142a1d24-73ca-476f-92c9-0983ac69687b-kube-api-access-gjvjf\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" 
Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.324051 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvj5\" (UniqueName: \"kubernetes.io/projected/4da27dfa-2fc7-4e23-9115-d73b09eebb8e-kube-api-access-hmvj5\") pod \"auto-csr-approver-29556960-thhvl\" (UID: \"4da27dfa-2fc7-4e23-9115-d73b09eebb8e\") " pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.324106 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/142a1d24-73ca-476f-92c9-0983ac69687b-secret-volume\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.324145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142a1d24-73ca-476f-92c9-0983ac69687b-config-volume\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.325803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142a1d24-73ca-476f-92c9-0983ac69687b-config-volume\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.341830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/142a1d24-73ca-476f-92c9-0983ac69687b-secret-volume\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.345694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvj5\" (UniqueName: \"kubernetes.io/projected/4da27dfa-2fc7-4e23-9115-d73b09eebb8e-kube-api-access-hmvj5\") pod \"auto-csr-approver-29556960-thhvl\" (UID: \"4da27dfa-2fc7-4e23-9115-d73b09eebb8e\") " pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.351666 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjvjf\" (UniqueName: \"kubernetes.io/projected/142a1d24-73ca-476f-92c9-0983ac69687b-kube-api-access-gjvjf\") pod \"collect-profiles-29556960-zsvtr\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.493138 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.504817 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:00 crc kubenswrapper[4786]: W0313 16:00:00.932018 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4da27dfa_2fc7_4e23_9115_d73b09eebb8e.slice/crio-44444f25053415ff6651186ebbe31ba3f970127e31b4e11e138ed9917c28d70b WatchSource:0}: Error finding container 44444f25053415ff6651186ebbe31ba3f970127e31b4e11e138ed9917c28d70b: Status 404 returned error can't find the container with id 44444f25053415ff6651186ebbe31ba3f970127e31b4e11e138ed9917c28d70b Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.932119 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556960-thhvl"] Mar 13 16:00:00 crc kubenswrapper[4786]: I0313 16:00:00.996818 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr"] Mar 13 16:00:00 crc kubenswrapper[4786]: W0313 16:00:00.999465 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod142a1d24_73ca_476f_92c9_0983ac69687b.slice/crio-841abfb517e1d37cd4f285eaae3bafeb55ebf9c79f20b6ac3d9b3a8b268f4d61 WatchSource:0}: Error finding container 841abfb517e1d37cd4f285eaae3bafeb55ebf9c79f20b6ac3d9b3a8b268f4d61: Status 404 returned error can't find the container with id 841abfb517e1d37cd4f285eaae3bafeb55ebf9c79f20b6ac3d9b3a8b268f4d61 Mar 13 16:00:01 crc kubenswrapper[4786]: I0313 16:00:01.385275 4786 generic.go:334] "Generic (PLEG): container finished" podID="142a1d24-73ca-476f-92c9-0983ac69687b" containerID="53d2dc6a268adc8be9eaabbb78446fc5080e6fdd81e1b81f979048c7557b2b51" exitCode=0 Mar 13 16:00:01 crc kubenswrapper[4786]: I0313 16:00:01.385352 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" 
event={"ID":"142a1d24-73ca-476f-92c9-0983ac69687b","Type":"ContainerDied","Data":"53d2dc6a268adc8be9eaabbb78446fc5080e6fdd81e1b81f979048c7557b2b51"} Mar 13 16:00:01 crc kubenswrapper[4786]: I0313 16:00:01.385402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" event={"ID":"142a1d24-73ca-476f-92c9-0983ac69687b","Type":"ContainerStarted","Data":"841abfb517e1d37cd4f285eaae3bafeb55ebf9c79f20b6ac3d9b3a8b268f4d61"} Mar 13 16:00:01 crc kubenswrapper[4786]: I0313 16:00:01.386833 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556960-thhvl" event={"ID":"4da27dfa-2fc7-4e23-9115-d73b09eebb8e","Type":"ContainerStarted","Data":"44444f25053415ff6651186ebbe31ba3f970127e31b4e11e138ed9917c28d70b"} Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.719804 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.863457 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjvjf\" (UniqueName: \"kubernetes.io/projected/142a1d24-73ca-476f-92c9-0983ac69687b-kube-api-access-gjvjf\") pod \"142a1d24-73ca-476f-92c9-0983ac69687b\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.863722 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/142a1d24-73ca-476f-92c9-0983ac69687b-secret-volume\") pod \"142a1d24-73ca-476f-92c9-0983ac69687b\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.864410 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/142a1d24-73ca-476f-92c9-0983ac69687b-config-volume\") pod \"142a1d24-73ca-476f-92c9-0983ac69687b\" (UID: \"142a1d24-73ca-476f-92c9-0983ac69687b\") " Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.865006 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/142a1d24-73ca-476f-92c9-0983ac69687b-config-volume" (OuterVolumeSpecName: "config-volume") pod "142a1d24-73ca-476f-92c9-0983ac69687b" (UID: "142a1d24-73ca-476f-92c9-0983ac69687b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.865392 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/142a1d24-73ca-476f-92c9-0983ac69687b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.869264 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142a1d24-73ca-476f-92c9-0983ac69687b-kube-api-access-gjvjf" (OuterVolumeSpecName: "kube-api-access-gjvjf") pod "142a1d24-73ca-476f-92c9-0983ac69687b" (UID: "142a1d24-73ca-476f-92c9-0983ac69687b"). InnerVolumeSpecName "kube-api-access-gjvjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.870612 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142a1d24-73ca-476f-92c9-0983ac69687b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "142a1d24-73ca-476f-92c9-0983ac69687b" (UID: "142a1d24-73ca-476f-92c9-0983ac69687b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.967078 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/142a1d24-73ca-476f-92c9-0983ac69687b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 16:00:02 crc kubenswrapper[4786]: I0313 16:00:02.967110 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjvjf\" (UniqueName: \"kubernetes.io/projected/142a1d24-73ca-476f-92c9-0983ac69687b-kube-api-access-gjvjf\") on node \"crc\" DevicePath \"\"" Mar 13 16:00:03 crc kubenswrapper[4786]: I0313 16:00:03.403498 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" event={"ID":"142a1d24-73ca-476f-92c9-0983ac69687b","Type":"ContainerDied","Data":"841abfb517e1d37cd4f285eaae3bafeb55ebf9c79f20b6ac3d9b3a8b268f4d61"} Mar 13 16:00:03 crc kubenswrapper[4786]: I0313 16:00:03.403534 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841abfb517e1d37cd4f285eaae3bafeb55ebf9c79f20b6ac3d9b3a8b268f4d61" Mar 13 16:00:03 crc kubenswrapper[4786]: I0313 16:00:03.403555 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr" Mar 13 16:00:03 crc kubenswrapper[4786]: I0313 16:00:03.799645 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"] Mar 13 16:00:03 crc kubenswrapper[4786]: I0313 16:00:03.805155 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-b5f5d"] Mar 13 16:00:04 crc kubenswrapper[4786]: I0313 16:00:04.413478 4786 generic.go:334] "Generic (PLEG): container finished" podID="4da27dfa-2fc7-4e23-9115-d73b09eebb8e" containerID="222130f9874f6de9e7fc5034655c8b44b4d379cde0e18430658fc2e09ea98577" exitCode=0 Mar 13 16:00:04 crc kubenswrapper[4786]: I0313 16:00:04.413552 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556960-thhvl" event={"ID":"4da27dfa-2fc7-4e23-9115-d73b09eebb8e","Type":"ContainerDied","Data":"222130f9874f6de9e7fc5034655c8b44b4d379cde0e18430658fc2e09ea98577"} Mar 13 16:00:04 crc kubenswrapper[4786]: I0313 16:00:04.571076 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c965d5e-882a-4c8d-8290-ecddba18b208" path="/var/lib/kubelet/pods/2c965d5e-882a-4c8d-8290-ecddba18b208/volumes" Mar 13 16:00:05 crc kubenswrapper[4786]: I0313 16:00:05.762322 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:05 crc kubenswrapper[4786]: I0313 16:00:05.910677 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmvj5\" (UniqueName: \"kubernetes.io/projected/4da27dfa-2fc7-4e23-9115-d73b09eebb8e-kube-api-access-hmvj5\") pod \"4da27dfa-2fc7-4e23-9115-d73b09eebb8e\" (UID: \"4da27dfa-2fc7-4e23-9115-d73b09eebb8e\") " Mar 13 16:00:05 crc kubenswrapper[4786]: I0313 16:00:05.917298 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da27dfa-2fc7-4e23-9115-d73b09eebb8e-kube-api-access-hmvj5" (OuterVolumeSpecName: "kube-api-access-hmvj5") pod "4da27dfa-2fc7-4e23-9115-d73b09eebb8e" (UID: "4da27dfa-2fc7-4e23-9115-d73b09eebb8e"). InnerVolumeSpecName "kube-api-access-hmvj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:00:06 crc kubenswrapper[4786]: I0313 16:00:06.012828 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmvj5\" (UniqueName: \"kubernetes.io/projected/4da27dfa-2fc7-4e23-9115-d73b09eebb8e-kube-api-access-hmvj5\") on node \"crc\" DevicePath \"\"" Mar 13 16:00:06 crc kubenswrapper[4786]: I0313 16:00:06.432001 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556960-thhvl" event={"ID":"4da27dfa-2fc7-4e23-9115-d73b09eebb8e","Type":"ContainerDied","Data":"44444f25053415ff6651186ebbe31ba3f970127e31b4e11e138ed9917c28d70b"} Mar 13 16:00:06 crc kubenswrapper[4786]: I0313 16:00:06.432044 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44444f25053415ff6651186ebbe31ba3f970127e31b4e11e138ed9917c28d70b" Mar 13 16:00:06 crc kubenswrapper[4786]: I0313 16:00:06.432089 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556960-thhvl" Mar 13 16:00:06 crc kubenswrapper[4786]: I0313 16:00:06.816148 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556954-r4cpp"] Mar 13 16:00:06 crc kubenswrapper[4786]: I0313 16:00:06.823201 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556954-r4cpp"] Mar 13 16:00:08 crc kubenswrapper[4786]: I0313 16:00:08.569175 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed033ef8-d2cc-453f-bf4f-e2f36c6caac9" path="/var/lib/kubelet/pods/ed033ef8-d2cc-453f-bf4f-e2f36c6caac9/volumes" Mar 13 16:00:11 crc kubenswrapper[4786]: I0313 16:00:11.857214 4786 scope.go:117] "RemoveContainer" containerID="5afaf18485b511642d3176a7447c825eee0ab4e07b97ba4bbb2f90e5a21eb1e5" Mar 13 16:00:11 crc kubenswrapper[4786]: I0313 16:00:11.894174 4786 scope.go:117] "RemoveContainer" containerID="d753256315f1310bb28f0691628ce24ffb13984ac11fd1d3c1cff5812092c02e" Mar 13 16:00:13 crc kubenswrapper[4786]: I0313 16:00:13.552361 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" Mar 13 16:00:13 crc kubenswrapper[4786]: E0313 16:00:13.553053 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:00:24 crc kubenswrapper[4786]: I0313 16:00:24.552670 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" Mar 13 16:00:24 crc kubenswrapper[4786]: E0313 16:00:24.553440 4786 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:00:35 crc kubenswrapper[4786]: I0313 16:00:35.552001 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" Mar 13 16:00:35 crc kubenswrapper[4786]: E0313 16:00:35.552728 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:00:49 crc kubenswrapper[4786]: I0313 16:00:49.552561 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" Mar 13 16:00:49 crc kubenswrapper[4786]: E0313 16:00:49.553725 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:01:02 crc kubenswrapper[4786]: I0313 16:01:02.552903 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" Mar 13 16:01:02 crc kubenswrapper[4786]: E0313 16:01:02.554342 4786 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:01:14 crc kubenswrapper[4786]: I0313 16:01:14.552633 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5" Mar 13 16:01:15 crc kubenswrapper[4786]: I0313 16:01:15.020692 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"62c6a93c6e3cfc1cebe1b9f985eaa03062c6ab901d2197129218113ec960cea2"} Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.182686 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556962-7ldvq"] Mar 13 16:02:00 crc kubenswrapper[4786]: E0313 16:02:00.183516 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142a1d24-73ca-476f-92c9-0983ac69687b" containerName="collect-profiles" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.183532 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="142a1d24-73ca-476f-92c9-0983ac69687b" containerName="collect-profiles" Mar 13 16:02:00 crc kubenswrapper[4786]: E0313 16:02:00.183550 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da27dfa-2fc7-4e23-9115-d73b09eebb8e" containerName="oc" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.183557 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da27dfa-2fc7-4e23-9115-d73b09eebb8e" containerName="oc" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.183714 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="142a1d24-73ca-476f-92c9-0983ac69687b" containerName="collect-profiles" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.183738 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da27dfa-2fc7-4e23-9115-d73b09eebb8e" containerName="oc" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.184340 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556962-7ldvq" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.189605 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.191519 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gnlh\" (UniqueName: \"kubernetes.io/projected/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82-kube-api-access-9gnlh\") pod \"auto-csr-approver-29556962-7ldvq\" (UID: \"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82\") " pod="openshift-infra/auto-csr-approver-29556962-7ldvq" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.191731 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.192385 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.192979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556962-7ldvq"] Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.292701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gnlh\" (UniqueName: \"kubernetes.io/projected/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82-kube-api-access-9gnlh\") pod \"auto-csr-approver-29556962-7ldvq\" (UID: \"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82\") " 
pod="openshift-infra/auto-csr-approver-29556962-7ldvq"
Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.323546 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gnlh\" (UniqueName: \"kubernetes.io/projected/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82-kube-api-access-9gnlh\") pod \"auto-csr-approver-29556962-7ldvq\" (UID: \"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82\") " pod="openshift-infra/auto-csr-approver-29556962-7ldvq"
Mar 13 16:02:00 crc kubenswrapper[4786]: I0313 16:02:00.510659 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556962-7ldvq"
Mar 13 16:02:01 crc kubenswrapper[4786]: I0313 16:02:01.024160 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556962-7ldvq"]
Mar 13 16:02:01 crc kubenswrapper[4786]: W0313 16:02:01.032889 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7e7eb93_3dc4_4ff7_a132_0edb1d0d9e82.slice/crio-1197f58834b64aa511fd2e51793c21533c13e461c9c296986ec45cf6e66b48b3 WatchSource:0}: Error finding container 1197f58834b64aa511fd2e51793c21533c13e461c9c296986ec45cf6e66b48b3: Status 404 returned error can't find the container with id 1197f58834b64aa511fd2e51793c21533c13e461c9c296986ec45cf6e66b48b3
Mar 13 16:02:01 crc kubenswrapper[4786]: I0313 16:02:01.401918 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556962-7ldvq" event={"ID":"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82","Type":"ContainerStarted","Data":"1197f58834b64aa511fd2e51793c21533c13e461c9c296986ec45cf6e66b48b3"}
Mar 13 16:02:03 crc kubenswrapper[4786]: I0313 16:02:03.428005 4786 generic.go:334] "Generic (PLEG): container finished" podID="a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82" containerID="8ef94e51229eb78bce94a265f0bc19f7abb8817b7085c9525aa84c742fd4578c" exitCode=0
Mar 13 16:02:03 crc kubenswrapper[4786]: I0313 16:02:03.428121 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556962-7ldvq" event={"ID":"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82","Type":"ContainerDied","Data":"8ef94e51229eb78bce94a265f0bc19f7abb8817b7085c9525aa84c742fd4578c"}
Mar 13 16:02:03 crc kubenswrapper[4786]: I0313 16:02:03.946245 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j7djh"]
Mar 13 16:02:03 crc kubenswrapper[4786]: I0313 16:02:03.950925 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:03 crc kubenswrapper[4786]: I0313 16:02:03.961072 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7djh"]
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.054346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknxk\" (UniqueName: \"kubernetes.io/projected/7987dd1e-63f6-406c-b392-1ee31584a3bd-kube-api-access-kknxk\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.054416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-catalog-content\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.054728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-utilities\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.156190 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-utilities\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.156294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kknxk\" (UniqueName: \"kubernetes.io/projected/7987dd1e-63f6-406c-b392-1ee31584a3bd-kube-api-access-kknxk\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.156336 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-catalog-content\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.156744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-utilities\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.156873 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-catalog-content\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.174936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknxk\" (UniqueName: \"kubernetes.io/projected/7987dd1e-63f6-406c-b392-1ee31584a3bd-kube-api-access-kknxk\") pod \"community-operators-j7djh\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") " pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:04 crc kubenswrapper[4786]: I0313 16:02:04.285364 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:04.769646 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7djh"]
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:04.773594 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556962-7ldvq"
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:04.973346 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gnlh\" (UniqueName: \"kubernetes.io/projected/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82-kube-api-access-9gnlh\") pod \"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82\" (UID: \"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82\") "
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:04.978037 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82-kube-api-access-9gnlh" (OuterVolumeSpecName: "kube-api-access-9gnlh") pod "a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82" (UID: "a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82"). InnerVolumeSpecName "kube-api-access-9gnlh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.075007 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gnlh\" (UniqueName: \"kubernetes.io/projected/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82-kube-api-access-9gnlh\") on node \"crc\" DevicePath \"\""
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.477155 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556962-7ldvq" event={"ID":"a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82","Type":"ContainerDied","Data":"1197f58834b64aa511fd2e51793c21533c13e461c9c296986ec45cf6e66b48b3"}
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.477202 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1197f58834b64aa511fd2e51793c21533c13e461c9c296986ec45cf6e66b48b3"
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.477200 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556962-7ldvq"
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.481175 4786 generic.go:334] "Generic (PLEG): container finished" podID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerID="b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2" exitCode=0
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.481251 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerDied","Data":"b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2"}
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.481301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerStarted","Data":"a4bd8f59eb08e386ed6284e59b3382050465d5027072fd7c8bc6c4ecbdd45e54"}
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.843244 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556956-dbfgh"]
Mar 13 16:02:05 crc kubenswrapper[4786]: I0313 16:02:05.852920 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556956-dbfgh"]
Mar 13 16:02:06 crc kubenswrapper[4786]: I0313 16:02:06.495380 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerStarted","Data":"fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01"}
Mar 13 16:02:06 crc kubenswrapper[4786]: I0313 16:02:06.562261 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d2b22d-3a12-4982-9393-af2a24093bcb" path="/var/lib/kubelet/pods/45d2b22d-3a12-4982-9393-af2a24093bcb/volumes"
Mar 13 16:02:07 crc kubenswrapper[4786]: I0313 16:02:07.507074 4786 generic.go:334] "Generic (PLEG): container finished" podID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerID="fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01" exitCode=0
Mar 13 16:02:07 crc kubenswrapper[4786]: I0313 16:02:07.507123 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerDied","Data":"fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01"}
Mar 13 16:02:08 crc kubenswrapper[4786]: I0313 16:02:08.516576 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerStarted","Data":"d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94"}
Mar 13 16:02:08 crc kubenswrapper[4786]: I0313 16:02:08.546455 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j7djh" podStartSLOduration=2.992870568 podStartE2EDuration="5.546434548s" podCreationTimestamp="2026-03-13 16:02:03 +0000 UTC" firstStartedPulling="2026-03-13 16:02:05.484450518 +0000 UTC m=+3555.647662369" lastFinishedPulling="2026-03-13 16:02:08.038014538 +0000 UTC m=+3558.201226349" observedRunningTime="2026-03-13 16:02:08.539125642 +0000 UTC m=+3558.702337473" watchObservedRunningTime="2026-03-13 16:02:08.546434548 +0000 UTC m=+3558.709646369"
Mar 13 16:02:11 crc kubenswrapper[4786]: I0313 16:02:11.980077 4786 scope.go:117] "RemoveContainer" containerID="01cdae5dc0f3126b1a1f0c5e9d2771fb0b7eb3a1339193bd9e4e20f02a31f848"
Mar 13 16:02:14 crc kubenswrapper[4786]: I0313 16:02:14.285663 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:14 crc kubenswrapper[4786]: I0313 16:02:14.288287 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:14 crc kubenswrapper[4786]: I0313 16:02:14.357644 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:14 crc kubenswrapper[4786]: I0313 16:02:14.616351 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:14 crc kubenswrapper[4786]: I0313 16:02:14.662125 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7djh"]
Mar 13 16:02:16 crc kubenswrapper[4786]: I0313 16:02:16.590985 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j7djh" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="registry-server" containerID="cri-o://d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94" gracePeriod=2
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.033047 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.171463 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kknxk\" (UniqueName: \"kubernetes.io/projected/7987dd1e-63f6-406c-b392-1ee31584a3bd-kube-api-access-kknxk\") pod \"7987dd1e-63f6-406c-b392-1ee31584a3bd\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") "
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.171513 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-utilities\") pod \"7987dd1e-63f6-406c-b392-1ee31584a3bd\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") "
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.171602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-catalog-content\") pod \"7987dd1e-63f6-406c-b392-1ee31584a3bd\" (UID: \"7987dd1e-63f6-406c-b392-1ee31584a3bd\") "
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.173050 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-utilities" (OuterVolumeSpecName: "utilities") pod "7987dd1e-63f6-406c-b392-1ee31584a3bd" (UID: "7987dd1e-63f6-406c-b392-1ee31584a3bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.180348 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7987dd1e-63f6-406c-b392-1ee31584a3bd-kube-api-access-kknxk" (OuterVolumeSpecName: "kube-api-access-kknxk") pod "7987dd1e-63f6-406c-b392-1ee31584a3bd" (UID: "7987dd1e-63f6-406c-b392-1ee31584a3bd"). InnerVolumeSpecName "kube-api-access-kknxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.273271 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kknxk\" (UniqueName: \"kubernetes.io/projected/7987dd1e-63f6-406c-b392-1ee31584a3bd-kube-api-access-kknxk\") on node \"crc\" DevicePath \"\""
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.273309 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.600818 4786 generic.go:334] "Generic (PLEG): container finished" podID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerID="d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94" exitCode=0
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.600871 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerDied","Data":"d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94"}
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.600896 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7djh" event={"ID":"7987dd1e-63f6-406c-b392-1ee31584a3bd","Type":"ContainerDied","Data":"a4bd8f59eb08e386ed6284e59b3382050465d5027072fd7c8bc6c4ecbdd45e54"}
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.600911 4786 scope.go:117] "RemoveContainer" containerID="d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.601011 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7djh"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.631141 4786 scope.go:117] "RemoveContainer" containerID="fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.654522 4786 scope.go:117] "RemoveContainer" containerID="b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.687102 4786 scope.go:117] "RemoveContainer" containerID="d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94"
Mar 13 16:02:17 crc kubenswrapper[4786]: E0313 16:02:17.687594 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94\": container with ID starting with d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94 not found: ID does not exist" containerID="d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.687635 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94"} err="failed to get container status \"d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94\": rpc error: code = NotFound desc = could not find container \"d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94\": container with ID starting with d8e0c5c63e4aa85d168ddd278e5200214b6f4b79bf8d02328f0a4be17a8e6b94 not found: ID does not exist"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.687660 4786 scope.go:117] "RemoveContainer" containerID="fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01"
Mar 13 16:02:17 crc kubenswrapper[4786]: E0313 16:02:17.688363 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01\": container with ID starting with fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01 not found: ID does not exist" containerID="fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.688459 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01"} err="failed to get container status \"fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01\": rpc error: code = NotFound desc = could not find container \"fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01\": container with ID starting with fcb18f1cf020b8a0ac4f7057bf268d65016248d444a7f5e4b42f87ef6d543b01 not found: ID does not exist"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.688514 4786 scope.go:117] "RemoveContainer" containerID="b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2"
Mar 13 16:02:17 crc kubenswrapper[4786]: E0313 16:02:17.689711 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2\": container with ID starting with b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2 not found: ID does not exist" containerID="b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.689752 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2"} err="failed to get container status \"b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2\": rpc error: code = NotFound desc = could not find container \"b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2\": container with ID starting with b1802f864884378d8466d5f40f844c4d783a224f0fe4832ea4559f0a03cb1ba2 not found: ID does not exist"
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.698331 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7987dd1e-63f6-406c-b392-1ee31584a3bd" (UID: "7987dd1e-63f6-406c-b392-1ee31584a3bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.781207 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7987dd1e-63f6-406c-b392-1ee31584a3bd-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.946071 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7djh"]
Mar 13 16:02:17 crc kubenswrapper[4786]: I0313 16:02:17.954429 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j7djh"]
Mar 13 16:02:18 crc kubenswrapper[4786]: I0313 16:02:18.570093 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" path="/var/lib/kubelet/pods/7987dd1e-63f6-406c-b392-1ee31584a3bd/volumes"
Mar 13 16:03:37 crc kubenswrapper[4786]: I0313 16:03:37.868269 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:03:37 crc kubenswrapper[4786]: I0313 16:03:37.868880 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.145134 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556964-577tf"]
Mar 13 16:04:00 crc kubenswrapper[4786]: E0313 16:04:00.146692 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82" containerName="oc"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.146827 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82" containerName="oc"
Mar 13 16:04:00 crc kubenswrapper[4786]: E0313 16:04:00.146903 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="registry-server"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.146974 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="registry-server"
Mar 13 16:04:00 crc kubenswrapper[4786]: E0313 16:04:00.147045 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="extract-utilities"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.147152 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="extract-utilities"
Mar 13 16:04:00 crc kubenswrapper[4786]: E0313 16:04:00.147220 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="extract-content"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.147273 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="extract-content"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.147502 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82" containerName="oc"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.147568 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7987dd1e-63f6-406c-b392-1ee31584a3bd" containerName="registry-server"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.148042 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.151695 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.151707 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.163064 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.165177 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556964-577tf"]
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.262243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pnx\" (UniqueName: \"kubernetes.io/projected/324f6394-4376-4e73-b468-3a00002479bf-kube-api-access-f7pnx\") pod \"auto-csr-approver-29556964-577tf\" (UID: \"324f6394-4376-4e73-b468-3a00002479bf\") " pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.363610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pnx\" (UniqueName: \"kubernetes.io/projected/324f6394-4376-4e73-b468-3a00002479bf-kube-api-access-f7pnx\") pod \"auto-csr-approver-29556964-577tf\" (UID: \"324f6394-4376-4e73-b468-3a00002479bf\") " pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.393747 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pnx\" (UniqueName: \"kubernetes.io/projected/324f6394-4376-4e73-b468-3a00002479bf-kube-api-access-f7pnx\") pod \"auto-csr-approver-29556964-577tf\" (UID: \"324f6394-4376-4e73-b468-3a00002479bf\") " pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.465451 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.897441 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556964-577tf"]
Mar 13 16:04:00 crc kubenswrapper[4786]: I0313 16:04:00.909145 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 16:04:01 crc kubenswrapper[4786]: I0313 16:04:01.483032 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556964-577tf" event={"ID":"324f6394-4376-4e73-b468-3a00002479bf","Type":"ContainerStarted","Data":"0be460971da38816a15c8212636de875a9697516d31a2ed903353a48780f8350"}
Mar 13 16:04:03 crc kubenswrapper[4786]: I0313 16:04:03.500828 4786 generic.go:334] "Generic (PLEG): container finished" podID="324f6394-4376-4e73-b468-3a00002479bf" containerID="07c61dd11ed294f77e29e412b8cc509fb5e2261d8a81017ab24116f22a007b8b" exitCode=0
Mar 13 16:04:03 crc kubenswrapper[4786]: I0313 16:04:03.501619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556964-577tf" event={"ID":"324f6394-4376-4e73-b468-3a00002479bf","Type":"ContainerDied","Data":"07c61dd11ed294f77e29e412b8cc509fb5e2261d8a81017ab24116f22a007b8b"}
Mar 13 16:04:04 crc kubenswrapper[4786]: I0313 16:04:04.766712 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:04 crc kubenswrapper[4786]: I0313 16:04:04.833099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7pnx\" (UniqueName: \"kubernetes.io/projected/324f6394-4376-4e73-b468-3a00002479bf-kube-api-access-f7pnx\") pod \"324f6394-4376-4e73-b468-3a00002479bf\" (UID: \"324f6394-4376-4e73-b468-3a00002479bf\") "
Mar 13 16:04:04 crc kubenswrapper[4786]: I0313 16:04:04.841245 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324f6394-4376-4e73-b468-3a00002479bf-kube-api-access-f7pnx" (OuterVolumeSpecName: "kube-api-access-f7pnx") pod "324f6394-4376-4e73-b468-3a00002479bf" (UID: "324f6394-4376-4e73-b468-3a00002479bf"). InnerVolumeSpecName "kube-api-access-f7pnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:04:04 crc kubenswrapper[4786]: I0313 16:04:04.935432 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7pnx\" (UniqueName: \"kubernetes.io/projected/324f6394-4376-4e73-b468-3a00002479bf-kube-api-access-f7pnx\") on node \"crc\" DevicePath \"\""
Mar 13 16:04:05 crc kubenswrapper[4786]: I0313 16:04:05.524284 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556964-577tf" event={"ID":"324f6394-4376-4e73-b468-3a00002479bf","Type":"ContainerDied","Data":"0be460971da38816a15c8212636de875a9697516d31a2ed903353a48780f8350"}
Mar 13 16:04:05 crc kubenswrapper[4786]: I0313 16:04:05.524616 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0be460971da38816a15c8212636de875a9697516d31a2ed903353a48780f8350"
Mar 13 16:04:05 crc kubenswrapper[4786]: I0313 16:04:05.524385 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556964-577tf"
Mar 13 16:04:05 crc kubenswrapper[4786]: I0313 16:04:05.840197 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556958-fg8l9"]
Mar 13 16:04:05 crc kubenswrapper[4786]: I0313 16:04:05.846371 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556958-fg8l9"]
Mar 13 16:04:06 crc kubenswrapper[4786]: I0313 16:04:06.562878 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d4f449-32f3-4d33-ac5f-d32fba63c622" path="/var/lib/kubelet/pods/f0d4f449-32f3-4d33-ac5f-d32fba63c622/volumes"
Mar 13 16:04:07 crc kubenswrapper[4786]: I0313 16:04:07.869051 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:04:07 crc kubenswrapper[4786]: I0313 16:04:07.869442 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:04:12 crc kubenswrapper[4786]: I0313 16:04:12.081188 4786 scope.go:117] "RemoveContainer" containerID="2b0b366dcc57062c136dc6c196cc3ab4ea9eb59bb75dc6e39711ae663d1c713d"
Mar 13 16:04:37 crc kubenswrapper[4786]: I0313 16:04:37.868304 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:04:37 crc kubenswrapper[4786]: I0313 16:04:37.869077 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:04:37 crc kubenswrapper[4786]: I0313 16:04:37.869161 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 16:04:37 crc kubenswrapper[4786]: I0313 16:04:37.870156 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62c6a93c6e3cfc1cebe1b9f985eaa03062c6ab901d2197129218113ec960cea2"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 16:04:37 crc kubenswrapper[4786]: I0313 16:04:37.870272 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://62c6a93c6e3cfc1cebe1b9f985eaa03062c6ab901d2197129218113ec960cea2" gracePeriod=600
Mar 13 16:04:38 crc kubenswrapper[4786]: I0313 16:04:38.808560 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="62c6a93c6e3cfc1cebe1b9f985eaa03062c6ab901d2197129218113ec960cea2" exitCode=0
Mar 13 16:04:38 crc kubenswrapper[4786]: I0313 16:04:38.808665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"62c6a93c6e3cfc1cebe1b9f985eaa03062c6ab901d2197129218113ec960cea2"}
Mar 13 16:04:38 crc kubenswrapper[4786]: I0313 16:04:38.808901 4786 scope.go:117] "RemoveContainer" containerID="5cc85dc6974d78469e9707c8aeae9283f64fd62acc50d3109849370f7fe459c5"
Mar 13 16:04:39 crc kubenswrapper[4786]: I0313 16:04:39.820544 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69"}
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.159776 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556966-gw7lr"]
Mar 13 16:06:00 crc kubenswrapper[4786]: E0313 16:06:00.160647 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324f6394-4376-4e73-b468-3a00002479bf" containerName="oc"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.160661 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="324f6394-4376-4e73-b468-3a00002479bf" containerName="oc"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.160872 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="324f6394-4376-4e73-b468-3a00002479bf" containerName="oc"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.161383 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556966-gw7lr"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.162912 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.164626 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.164727 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.185691 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556966-gw7lr"]
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.228122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrdw\" (UniqueName: \"kubernetes.io/projected/0fec79d5-fb54-45d8-9dfa-61dd1a202814-kube-api-access-lwrdw\") pod \"auto-csr-approver-29556966-gw7lr\" (UID: \"0fec79d5-fb54-45d8-9dfa-61dd1a202814\") " pod="openshift-infra/auto-csr-approver-29556966-gw7lr"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.329090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrdw\" (UniqueName: \"kubernetes.io/projected/0fec79d5-fb54-45d8-9dfa-61dd1a202814-kube-api-access-lwrdw\") pod \"auto-csr-approver-29556966-gw7lr\" (UID: \"0fec79d5-fb54-45d8-9dfa-61dd1a202814\") " pod="openshift-infra/auto-csr-approver-29556966-gw7lr"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.356751 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrdw\" (UniqueName: \"kubernetes.io/projected/0fec79d5-fb54-45d8-9dfa-61dd1a202814-kube-api-access-lwrdw\") pod \"auto-csr-approver-29556966-gw7lr\" (UID: \"0fec79d5-fb54-45d8-9dfa-61dd1a202814\") " pod="openshift-infra/auto-csr-approver-29556966-gw7lr"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.497975 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556966-gw7lr"
Mar 13 16:06:00 crc kubenswrapper[4786]: I0313 16:06:00.958839 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556966-gw7lr"]
Mar 13 16:06:01 crc kubenswrapper[4786]: I0313 16:06:01.510463 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" event={"ID":"0fec79d5-fb54-45d8-9dfa-61dd1a202814","Type":"ContainerStarted","Data":"f54ea1430edfc4596f631887c400318ef90f3138a056e114c9104abc1d56a369"}
Mar 13 16:06:02 crc kubenswrapper[4786]: I0313 16:06:02.530258 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" event={"ID":"0fec79d5-fb54-45d8-9dfa-61dd1a202814","Type":"ContainerStarted","Data":"97796641b3993e808da63732bea2406630273f6d13d3d101e4e4c73523276d02"}
Mar 13 16:06:02 crc kubenswrapper[4786]: I0313 16:06:02.551998 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" podStartSLOduration=1.43578312 podStartE2EDuration="2.551965701s" podCreationTimestamp="2026-03-13 16:06:00 +0000 UTC" firstStartedPulling="2026-03-13 16:06:00.968662159 +0000 UTC m=+3791.131873970" lastFinishedPulling="2026-03-13 16:06:02.0848447 +0000 UTC m=+3792.248056551" observedRunningTime="2026-03-13
16:06:02.544236505 +0000 UTC m=+3792.707448316" watchObservedRunningTime="2026-03-13 16:06:02.551965701 +0000 UTC m=+3792.715177522" Mar 13 16:06:03 crc kubenswrapper[4786]: I0313 16:06:03.541310 4786 generic.go:334] "Generic (PLEG): container finished" podID="0fec79d5-fb54-45d8-9dfa-61dd1a202814" containerID="97796641b3993e808da63732bea2406630273f6d13d3d101e4e4c73523276d02" exitCode=0 Mar 13 16:06:03 crc kubenswrapper[4786]: I0313 16:06:03.541358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" event={"ID":"0fec79d5-fb54-45d8-9dfa-61dd1a202814","Type":"ContainerDied","Data":"97796641b3993e808da63732bea2406630273f6d13d3d101e4e4c73523276d02"} Mar 13 16:06:04 crc kubenswrapper[4786]: I0313 16:06:04.870759 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" Mar 13 16:06:04 crc kubenswrapper[4786]: I0313 16:06:04.893252 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwrdw\" (UniqueName: \"kubernetes.io/projected/0fec79d5-fb54-45d8-9dfa-61dd1a202814-kube-api-access-lwrdw\") pod \"0fec79d5-fb54-45d8-9dfa-61dd1a202814\" (UID: \"0fec79d5-fb54-45d8-9dfa-61dd1a202814\") " Mar 13 16:06:04 crc kubenswrapper[4786]: I0313 16:06:04.902423 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fec79d5-fb54-45d8-9dfa-61dd1a202814-kube-api-access-lwrdw" (OuterVolumeSpecName: "kube-api-access-lwrdw") pod "0fec79d5-fb54-45d8-9dfa-61dd1a202814" (UID: "0fec79d5-fb54-45d8-9dfa-61dd1a202814"). InnerVolumeSpecName "kube-api-access-lwrdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:06:04 crc kubenswrapper[4786]: I0313 16:06:04.995038 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwrdw\" (UniqueName: \"kubernetes.io/projected/0fec79d5-fb54-45d8-9dfa-61dd1a202814-kube-api-access-lwrdw\") on node \"crc\" DevicePath \"\"" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.468113 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkp8x"] Mar 13 16:06:05 crc kubenswrapper[4786]: E0313 16:06:05.468511 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fec79d5-fb54-45d8-9dfa-61dd1a202814" containerName="oc" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.468540 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fec79d5-fb54-45d8-9dfa-61dd1a202814" containerName="oc" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.468738 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fec79d5-fb54-45d8-9dfa-61dd1a202814" containerName="oc" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.469927 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.478360 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkp8x"] Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.560658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" event={"ID":"0fec79d5-fb54-45d8-9dfa-61dd1a202814","Type":"ContainerDied","Data":"f54ea1430edfc4596f631887c400318ef90f3138a056e114c9104abc1d56a369"} Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.560728 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f54ea1430edfc4596f631887c400318ef90f3138a056e114c9104abc1d56a369" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.560810 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556966-gw7lr" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.616946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl7qc\" (UniqueName: \"kubernetes.io/projected/4bd5d400-14fc-441b-81db-02c15ae12744-kube-api-access-bl7qc\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.617089 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-catalog-content\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.617142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-utilities\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.657211 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556960-thhvl"] Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.670445 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556960-thhvl"] Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.719117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl7qc\" (UniqueName: \"kubernetes.io/projected/4bd5d400-14fc-441b-81db-02c15ae12744-kube-api-access-bl7qc\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.719206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-catalog-content\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.719255 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-utilities\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.721708 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-catalog-content\") pod 
\"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.721727 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-utilities\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.740070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl7qc\" (UniqueName: \"kubernetes.io/projected/4bd5d400-14fc-441b-81db-02c15ae12744-kube-api-access-bl7qc\") pod \"redhat-operators-xkp8x\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:05 crc kubenswrapper[4786]: I0313 16:06:05.789737 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:06 crc kubenswrapper[4786]: I0313 16:06:06.219335 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkp8x"] Mar 13 16:06:06 crc kubenswrapper[4786]: I0313 16:06:06.562877 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da27dfa-2fc7-4e23-9115-d73b09eebb8e" path="/var/lib/kubelet/pods/4da27dfa-2fc7-4e23-9115-d73b09eebb8e/volumes" Mar 13 16:06:06 crc kubenswrapper[4786]: I0313 16:06:06.568450 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bd5d400-14fc-441b-81db-02c15ae12744" containerID="836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049" exitCode=0 Mar 13 16:06:06 crc kubenswrapper[4786]: I0313 16:06:06.568503 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkp8x" event={"ID":"4bd5d400-14fc-441b-81db-02c15ae12744","Type":"ContainerDied","Data":"836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049"} Mar 13 16:06:06 crc kubenswrapper[4786]: I0313 16:06:06.568534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkp8x" event={"ID":"4bd5d400-14fc-441b-81db-02c15ae12744","Type":"ContainerStarted","Data":"a11ff338fdb3cde8e5f3f00eb29b11c69aaaa5d5e92c0c3387742190ea60d282"} Mar 13 16:06:09 crc kubenswrapper[4786]: I0313 16:06:09.240546 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bd5d400-14fc-441b-81db-02c15ae12744" containerID="926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8" exitCode=0 Mar 13 16:06:09 crc kubenswrapper[4786]: I0313 16:06:09.240658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkp8x" event={"ID":"4bd5d400-14fc-441b-81db-02c15ae12744","Type":"ContainerDied","Data":"926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8"} Mar 13 16:06:10 crc 
kubenswrapper[4786]: I0313 16:06:10.249585 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkp8x" event={"ID":"4bd5d400-14fc-441b-81db-02c15ae12744","Type":"ContainerStarted","Data":"bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971"} Mar 13 16:06:10 crc kubenswrapper[4786]: I0313 16:06:10.270341 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkp8x" podStartSLOduration=2.125165523 podStartE2EDuration="5.270315833s" podCreationTimestamp="2026-03-13 16:06:05 +0000 UTC" firstStartedPulling="2026-03-13 16:06:06.570067548 +0000 UTC m=+3796.733279359" lastFinishedPulling="2026-03-13 16:06:09.715217858 +0000 UTC m=+3799.878429669" observedRunningTime="2026-03-13 16:06:10.265909811 +0000 UTC m=+3800.429121622" watchObservedRunningTime="2026-03-13 16:06:10.270315833 +0000 UTC m=+3800.433527674" Mar 13 16:06:12 crc kubenswrapper[4786]: I0313 16:06:12.197656 4786 scope.go:117] "RemoveContainer" containerID="222130f9874f6de9e7fc5034655c8b44b4d379cde0e18430658fc2e09ea98577" Mar 13 16:06:15 crc kubenswrapper[4786]: I0313 16:06:15.790460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:15 crc kubenswrapper[4786]: I0313 16:06:15.791234 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:16 crc kubenswrapper[4786]: I0313 16:06:16.833506 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xkp8x" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="registry-server" probeResult="failure" output=< Mar 13 16:06:16 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:06:16 crc kubenswrapper[4786]: > Mar 13 16:06:25 crc kubenswrapper[4786]: I0313 16:06:25.838321 4786 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:25 crc kubenswrapper[4786]: I0313 16:06:25.890770 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:26 crc kubenswrapper[4786]: I0313 16:06:26.079613 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkp8x"] Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.454277 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkp8x" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="registry-server" containerID="cri-o://bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971" gracePeriod=2 Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.840884 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.955239 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-catalog-content\") pod \"4bd5d400-14fc-441b-81db-02c15ae12744\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.955348 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-utilities\") pod \"4bd5d400-14fc-441b-81db-02c15ae12744\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.955615 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl7qc\" (UniqueName: \"kubernetes.io/projected/4bd5d400-14fc-441b-81db-02c15ae12744-kube-api-access-bl7qc\") pod 
\"4bd5d400-14fc-441b-81db-02c15ae12744\" (UID: \"4bd5d400-14fc-441b-81db-02c15ae12744\") " Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.956205 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-utilities" (OuterVolumeSpecName: "utilities") pod "4bd5d400-14fc-441b-81db-02c15ae12744" (UID: "4bd5d400-14fc-441b-81db-02c15ae12744"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:06:27 crc kubenswrapper[4786]: I0313 16:06:27.960930 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd5d400-14fc-441b-81db-02c15ae12744-kube-api-access-bl7qc" (OuterVolumeSpecName: "kube-api-access-bl7qc") pod "4bd5d400-14fc-441b-81db-02c15ae12744" (UID: "4bd5d400-14fc-441b-81db-02c15ae12744"). InnerVolumeSpecName "kube-api-access-bl7qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.057441 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.057497 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl7qc\" (UniqueName: \"kubernetes.io/projected/4bd5d400-14fc-441b-81db-02c15ae12744-kube-api-access-bl7qc\") on node \"crc\" DevicePath \"\"" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.108958 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bd5d400-14fc-441b-81db-02c15ae12744" (UID: "4bd5d400-14fc-441b-81db-02c15ae12744"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.159389 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd5d400-14fc-441b-81db-02c15ae12744-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.467466 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bd5d400-14fc-441b-81db-02c15ae12744" containerID="bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971" exitCode=0 Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.467532 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkp8x" event={"ID":"4bd5d400-14fc-441b-81db-02c15ae12744","Type":"ContainerDied","Data":"bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971"} Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.467572 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkp8x" event={"ID":"4bd5d400-14fc-441b-81db-02c15ae12744","Type":"ContainerDied","Data":"a11ff338fdb3cde8e5f3f00eb29b11c69aaaa5d5e92c0c3387742190ea60d282"} Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.467600 4786 scope.go:117] "RemoveContainer" containerID="bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.467765 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkp8x" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.504413 4786 scope.go:117] "RemoveContainer" containerID="926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.521640 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkp8x"] Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.529087 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkp8x"] Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.545717 4786 scope.go:117] "RemoveContainer" containerID="836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.566526 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" path="/var/lib/kubelet/pods/4bd5d400-14fc-441b-81db-02c15ae12744/volumes" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.577982 4786 scope.go:117] "RemoveContainer" containerID="bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971" Mar 13 16:06:28 crc kubenswrapper[4786]: E0313 16:06:28.578524 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971\": container with ID starting with bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971 not found: ID does not exist" containerID="bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.578552 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971"} err="failed to get container status 
\"bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971\": rpc error: code = NotFound desc = could not find container \"bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971\": container with ID starting with bb11b9a0bdedd298f4beb2342e2b323e6e91112d0d9fea2ab2c1df9cde9c4971 not found: ID does not exist" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.578571 4786 scope.go:117] "RemoveContainer" containerID="926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8" Mar 13 16:06:28 crc kubenswrapper[4786]: E0313 16:06:28.578892 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8\": container with ID starting with 926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8 not found: ID does not exist" containerID="926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.578959 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8"} err="failed to get container status \"926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8\": rpc error: code = NotFound desc = could not find container \"926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8\": container with ID starting with 926d6aa1d846e806ba65a4fece3b055087696f10b9464ae5ed944dfdbe12d1b8 not found: ID does not exist" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.578997 4786 scope.go:117] "RemoveContainer" containerID="836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049" Mar 13 16:06:28 crc kubenswrapper[4786]: E0313 16:06:28.579298 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049\": container with ID starting with 836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049 not found: ID does not exist" containerID="836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049" Mar 13 16:06:28 crc kubenswrapper[4786]: I0313 16:06:28.579325 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049"} err="failed to get container status \"836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049\": rpc error: code = NotFound desc = could not find container \"836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049\": container with ID starting with 836eaaa9ca854eb3ed12ef99871a537fa89685090694ede15ac666bc3075b049 not found: ID does not exist" Mar 13 16:07:07 crc kubenswrapper[4786]: I0313 16:07:07.868552 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:07:07 crc kubenswrapper[4786]: I0313 16:07:07.869241 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:07:37 crc kubenswrapper[4786]: I0313 16:07:37.868825 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:07:37 crc 
kubenswrapper[4786]: I0313 16:07:37.869546 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.160707 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556968-996h4"] Mar 13 16:08:00 crc kubenswrapper[4786]: E0313 16:08:00.161966 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="extract-utilities" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.161994 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="extract-utilities" Mar 13 16:08:00 crc kubenswrapper[4786]: E0313 16:08:00.162025 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="extract-content" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.162043 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="extract-content" Mar 13 16:08:00 crc kubenswrapper[4786]: E0313 16:08:00.162109 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="registry-server" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.162126 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="registry-server" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.162428 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd5d400-14fc-441b-81db-02c15ae12744" containerName="registry-server" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 
16:08:00.163577 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.173132 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.174267 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.174746 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.182708 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556968-996h4"] Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.336103 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkjj\" (UniqueName: \"kubernetes.io/projected/3aaf938a-e3c6-49de-80b3-80c48d1d6a71-kube-api-access-qxkjj\") pod \"auto-csr-approver-29556968-996h4\" (UID: \"3aaf938a-e3c6-49de-80b3-80c48d1d6a71\") " pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.438115 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkjj\" (UniqueName: \"kubernetes.io/projected/3aaf938a-e3c6-49de-80b3-80c48d1d6a71-kube-api-access-qxkjj\") pod \"auto-csr-approver-29556968-996h4\" (UID: \"3aaf938a-e3c6-49de-80b3-80c48d1d6a71\") " pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.479505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkjj\" (UniqueName: \"kubernetes.io/projected/3aaf938a-e3c6-49de-80b3-80c48d1d6a71-kube-api-access-qxkjj\") pod 
\"auto-csr-approver-29556968-996h4\" (UID: \"3aaf938a-e3c6-49de-80b3-80c48d1d6a71\") " pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.492910 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:00 crc kubenswrapper[4786]: I0313 16:08:00.919389 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556968-996h4"] Mar 13 16:08:01 crc kubenswrapper[4786]: I0313 16:08:01.273450 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556968-996h4" event={"ID":"3aaf938a-e3c6-49de-80b3-80c48d1d6a71","Type":"ContainerStarted","Data":"3e4e3335218ebe98a2e35c4c447d868513edafba68104a9d077f9bab57a35c4f"} Mar 13 16:08:02 crc kubenswrapper[4786]: I0313 16:08:02.281580 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556968-996h4" event={"ID":"3aaf938a-e3c6-49de-80b3-80c48d1d6a71","Type":"ContainerStarted","Data":"09107f4a85999e2a54d60f29744a425150bad4fb0c04001d79894b322e0df740"} Mar 13 16:08:02 crc kubenswrapper[4786]: I0313 16:08:02.304158 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556968-996h4" podStartSLOduration=1.276320017 podStartE2EDuration="2.304129535s" podCreationTimestamp="2026-03-13 16:08:00 +0000 UTC" firstStartedPulling="2026-03-13 16:08:00.928830794 +0000 UTC m=+3911.092042625" lastFinishedPulling="2026-03-13 16:08:01.956640312 +0000 UTC m=+3912.119852143" observedRunningTime="2026-03-13 16:08:02.297381644 +0000 UTC m=+3912.460593525" watchObservedRunningTime="2026-03-13 16:08:02.304129535 +0000 UTC m=+3912.467341386" Mar 13 16:08:03 crc kubenswrapper[4786]: I0313 16:08:03.293636 4786 generic.go:334] "Generic (PLEG): container finished" podID="3aaf938a-e3c6-49de-80b3-80c48d1d6a71" 
containerID="09107f4a85999e2a54d60f29744a425150bad4fb0c04001d79894b322e0df740" exitCode=0 Mar 13 16:08:03 crc kubenswrapper[4786]: I0313 16:08:03.293709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556968-996h4" event={"ID":"3aaf938a-e3c6-49de-80b3-80c48d1d6a71","Type":"ContainerDied","Data":"09107f4a85999e2a54d60f29744a425150bad4fb0c04001d79894b322e0df740"} Mar 13 16:08:05 crc kubenswrapper[4786]: I0313 16:08:05.314310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556968-996h4" event={"ID":"3aaf938a-e3c6-49de-80b3-80c48d1d6a71","Type":"ContainerDied","Data":"3e4e3335218ebe98a2e35c4c447d868513edafba68104a9d077f9bab57a35c4f"} Mar 13 16:08:05 crc kubenswrapper[4786]: I0313 16:08:05.314719 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4e3335218ebe98a2e35c4c447d868513edafba68104a9d077f9bab57a35c4f" Mar 13 16:08:05 crc kubenswrapper[4786]: I0313 16:08:05.375236 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:05 crc kubenswrapper[4786]: I0313 16:08:05.521009 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxkjj\" (UniqueName: \"kubernetes.io/projected/3aaf938a-e3c6-49de-80b3-80c48d1d6a71-kube-api-access-qxkjj\") pod \"3aaf938a-e3c6-49de-80b3-80c48d1d6a71\" (UID: \"3aaf938a-e3c6-49de-80b3-80c48d1d6a71\") " Mar 13 16:08:05 crc kubenswrapper[4786]: I0313 16:08:05.529226 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aaf938a-e3c6-49de-80b3-80c48d1d6a71-kube-api-access-qxkjj" (OuterVolumeSpecName: "kube-api-access-qxkjj") pod "3aaf938a-e3c6-49de-80b3-80c48d1d6a71" (UID: "3aaf938a-e3c6-49de-80b3-80c48d1d6a71"). InnerVolumeSpecName "kube-api-access-qxkjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:08:05 crc kubenswrapper[4786]: I0313 16:08:05.623367 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxkjj\" (UniqueName: \"kubernetes.io/projected/3aaf938a-e3c6-49de-80b3-80c48d1d6a71-kube-api-access-qxkjj\") on node \"crc\" DevicePath \"\"" Mar 13 16:08:06 crc kubenswrapper[4786]: I0313 16:08:06.323174 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556968-996h4" Mar 13 16:08:06 crc kubenswrapper[4786]: I0313 16:08:06.466837 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556962-7ldvq"] Mar 13 16:08:06 crc kubenswrapper[4786]: I0313 16:08:06.473981 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556962-7ldvq"] Mar 13 16:08:06 crc kubenswrapper[4786]: I0313 16:08:06.566489 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82" path="/var/lib/kubelet/pods/a7e7eb93-3dc4-4ff7-a132-0edb1d0d9e82/volumes" Mar 13 16:08:07 crc kubenswrapper[4786]: I0313 16:08:07.868650 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:08:07 crc kubenswrapper[4786]: I0313 16:08:07.869306 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:08:07 crc kubenswrapper[4786]: I0313 16:08:07.869417 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:08:07 crc kubenswrapper[4786]: I0313 16:08:07.870065 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:08:07 crc kubenswrapper[4786]: I0313 16:08:07.870194 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" gracePeriod=600 Mar 13 16:08:07 crc kubenswrapper[4786]: E0313 16:08:07.996397 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:08:08 crc kubenswrapper[4786]: I0313 16:08:08.340515 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" exitCode=0 Mar 13 16:08:08 crc kubenswrapper[4786]: I0313 16:08:08.340932 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69"} Mar 13 16:08:08 crc kubenswrapper[4786]: I0313 16:08:08.340972 4786 scope.go:117] "RemoveContainer" containerID="62c6a93c6e3cfc1cebe1b9f985eaa03062c6ab901d2197129218113ec960cea2" Mar 13 16:08:08 crc kubenswrapper[4786]: I0313 16:08:08.341594 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:08:08 crc kubenswrapper[4786]: E0313 16:08:08.341849 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:08:12 crc kubenswrapper[4786]: I0313 16:08:12.350300 4786 scope.go:117] "RemoveContainer" containerID="8ef94e51229eb78bce94a265f0bc19f7abb8817b7085c9525aa84c742fd4578c" Mar 13 16:08:20 crc kubenswrapper[4786]: I0313 16:08:20.556321 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:08:20 crc kubenswrapper[4786]: E0313 16:08:20.557167 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:08:31 crc kubenswrapper[4786]: I0313 16:08:31.552218 4786 scope.go:117] "RemoveContainer" 
containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:08:31 crc kubenswrapper[4786]: E0313 16:08:31.553515 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:08:42 crc kubenswrapper[4786]: I0313 16:08:42.554592 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:08:42 crc kubenswrapper[4786]: E0313 16:08:42.556545 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:08:56 crc kubenswrapper[4786]: I0313 16:08:56.552733 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:08:56 crc kubenswrapper[4786]: E0313 16:08:56.553999 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:09:10 crc kubenswrapper[4786]: I0313 16:09:10.555439 4786 scope.go:117] 
"RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:09:10 crc kubenswrapper[4786]: E0313 16:09:10.556159 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:09:21 crc kubenswrapper[4786]: I0313 16:09:21.551841 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:09:21 crc kubenswrapper[4786]: E0313 16:09:21.552899 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:09:34 crc kubenswrapper[4786]: I0313 16:09:34.552135 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:09:34 crc kubenswrapper[4786]: E0313 16:09:34.552768 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:09:46 crc kubenswrapper[4786]: I0313 16:09:46.552200 
4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:09:46 crc kubenswrapper[4786]: E0313 16:09:46.553304 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.153336 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556970-r49rz"] Mar 13 16:10:00 crc kubenswrapper[4786]: E0313 16:10:00.154271 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aaf938a-e3c6-49de-80b3-80c48d1d6a71" containerName="oc" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.154288 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aaf938a-e3c6-49de-80b3-80c48d1d6a71" containerName="oc" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.154476 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aaf938a-e3c6-49de-80b3-80c48d1d6a71" containerName="oc" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.155055 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.161453 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.161556 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.162134 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.170915 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556970-r49rz"] Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.281999 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgp62\" (UniqueName: \"kubernetes.io/projected/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9-kube-api-access-bgp62\") pod \"auto-csr-approver-29556970-r49rz\" (UID: \"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9\") " pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.384029 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgp62\" (UniqueName: \"kubernetes.io/projected/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9-kube-api-access-bgp62\") pod \"auto-csr-approver-29556970-r49rz\" (UID: \"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9\") " pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.409606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgp62\" (UniqueName: \"kubernetes.io/projected/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9-kube-api-access-bgp62\") pod \"auto-csr-approver-29556970-r49rz\" (UID: \"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9\") " 
pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.477262 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.559270 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:10:00 crc kubenswrapper[4786]: E0313 16:10:00.560161 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:10:00 crc kubenswrapper[4786]: I0313 16:10:00.997111 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556970-r49rz"] Mar 13 16:10:01 crc kubenswrapper[4786]: W0313 16:10:01.009237 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cfec5e6_cd20_44ac_abd4_aa7f70352ba9.slice/crio-2f3140fdfcbc3e19134dd636894bce8562cf7944cca0475b23b18033f7124bde WatchSource:0}: Error finding container 2f3140fdfcbc3e19134dd636894bce8562cf7944cca0475b23b18033f7124bde: Status 404 returned error can't find the container with id 2f3140fdfcbc3e19134dd636894bce8562cf7944cca0475b23b18033f7124bde Mar 13 16:10:01 crc kubenswrapper[4786]: I0313 16:10:01.017928 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 16:10:01 crc kubenswrapper[4786]: I0313 16:10:01.301643 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556970-r49rz" 
event={"ID":"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9","Type":"ContainerStarted","Data":"2f3140fdfcbc3e19134dd636894bce8562cf7944cca0475b23b18033f7124bde"} Mar 13 16:10:03 crc kubenswrapper[4786]: I0313 16:10:03.320565 4786 generic.go:334] "Generic (PLEG): container finished" podID="8cfec5e6-cd20-44ac-abd4-aa7f70352ba9" containerID="e3ee3ff945ab513ffeeb0e13fb18af312e8019b96686936ccefebe9de49cbde7" exitCode=0 Mar 13 16:10:03 crc kubenswrapper[4786]: I0313 16:10:03.320659 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556970-r49rz" event={"ID":"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9","Type":"ContainerDied","Data":"e3ee3ff945ab513ffeeb0e13fb18af312e8019b96686936ccefebe9de49cbde7"} Mar 13 16:10:04 crc kubenswrapper[4786]: I0313 16:10:04.728670 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:04 crc kubenswrapper[4786]: I0313 16:10:04.855640 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgp62\" (UniqueName: \"kubernetes.io/projected/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9-kube-api-access-bgp62\") pod \"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9\" (UID: \"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9\") " Mar 13 16:10:04 crc kubenswrapper[4786]: I0313 16:10:04.864301 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9-kube-api-access-bgp62" (OuterVolumeSpecName: "kube-api-access-bgp62") pod "8cfec5e6-cd20-44ac-abd4-aa7f70352ba9" (UID: "8cfec5e6-cd20-44ac-abd4-aa7f70352ba9"). InnerVolumeSpecName "kube-api-access-bgp62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:10:04 crc kubenswrapper[4786]: I0313 16:10:04.957377 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgp62\" (UniqueName: \"kubernetes.io/projected/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9-kube-api-access-bgp62\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:05 crc kubenswrapper[4786]: I0313 16:10:05.339249 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556970-r49rz" event={"ID":"8cfec5e6-cd20-44ac-abd4-aa7f70352ba9","Type":"ContainerDied","Data":"2f3140fdfcbc3e19134dd636894bce8562cf7944cca0475b23b18033f7124bde"} Mar 13 16:10:05 crc kubenswrapper[4786]: I0313 16:10:05.339304 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f3140fdfcbc3e19134dd636894bce8562cf7944cca0475b23b18033f7124bde" Mar 13 16:10:05 crc kubenswrapper[4786]: I0313 16:10:05.339335 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556970-r49rz" Mar 13 16:10:05 crc kubenswrapper[4786]: I0313 16:10:05.819496 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556964-577tf"] Mar 13 16:10:05 crc kubenswrapper[4786]: I0313 16:10:05.828365 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556964-577tf"] Mar 13 16:10:06 crc kubenswrapper[4786]: I0313 16:10:06.569497 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="324f6394-4376-4e73-b468-3a00002479bf" path="/var/lib/kubelet/pods/324f6394-4376-4e73-b468-3a00002479bf/volumes" Mar 13 16:10:12 crc kubenswrapper[4786]: I0313 16:10:12.473079 4786 scope.go:117] "RemoveContainer" containerID="07c61dd11ed294f77e29e412b8cc509fb5e2261d8a81017ab24116f22a007b8b" Mar 13 16:10:12 crc kubenswrapper[4786]: I0313 16:10:12.554696 4786 scope.go:117] "RemoveContainer" 
containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:10:12 crc kubenswrapper[4786]: E0313 16:10:12.555438 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.594539 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6k8lh"] Mar 13 16:10:16 crc kubenswrapper[4786]: E0313 16:10:16.595234 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cfec5e6-cd20-44ac-abd4-aa7f70352ba9" containerName="oc" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.595249 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfec5e6-cd20-44ac-abd4-aa7f70352ba9" containerName="oc" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.595425 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cfec5e6-cd20-44ac-abd4-aa7f70352ba9" containerName="oc" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.596620 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.613419 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6k8lh"] Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.759138 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbnm2\" (UniqueName: \"kubernetes.io/projected/b32bc6c0-96a2-4003-a586-b6cb86233de2-kube-api-access-dbnm2\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.759182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-utilities\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.759199 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-catalog-content\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.860190 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbnm2\" (UniqueName: \"kubernetes.io/projected/b32bc6c0-96a2-4003-a586-b6cb86233de2-kube-api-access-dbnm2\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.860234 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-utilities\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.860258 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-catalog-content\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.860765 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-utilities\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.860798 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-catalog-content\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.877744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbnm2\" (UniqueName: \"kubernetes.io/projected/b32bc6c0-96a2-4003-a586-b6cb86233de2-kube-api-access-dbnm2\") pod \"certified-operators-6k8lh\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:16 crc kubenswrapper[4786]: I0313 16:10:16.913085 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:17 crc kubenswrapper[4786]: I0313 16:10:17.319034 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6k8lh"] Mar 13 16:10:17 crc kubenswrapper[4786]: I0313 16:10:17.451379 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerStarted","Data":"bf51ef42c17c8ae72289a1b65fd6b330fb2fb8b81057eba814d1d56af2302e67"} Mar 13 16:10:18 crc kubenswrapper[4786]: I0313 16:10:18.463179 4786 generic.go:334] "Generic (PLEG): container finished" podID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerID="28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5" exitCode=0 Mar 13 16:10:18 crc kubenswrapper[4786]: I0313 16:10:18.463300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerDied","Data":"28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5"} Mar 13 16:10:19 crc kubenswrapper[4786]: I0313 16:10:19.473596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerStarted","Data":"d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc"} Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.393926 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qcdwf"] Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.396462 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.415927 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcdwf"] Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.494711 4786 generic.go:334] "Generic (PLEG): container finished" podID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerID="d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc" exitCode=0 Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.494761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerDied","Data":"d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc"} Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.511402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-catalog-content\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.511581 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grfn7\" (UniqueName: \"kubernetes.io/projected/f5f28735-71ec-44d0-b16c-b2478500bd74-kube-api-access-grfn7\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.511705 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-utilities\") pod \"redhat-marketplace-qcdwf\" (UID: 
\"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.614529 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-catalog-content\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.614629 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grfn7\" (UniqueName: \"kubernetes.io/projected/f5f28735-71ec-44d0-b16c-b2478500bd74-kube-api-access-grfn7\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.614698 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-utilities\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.615660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-catalog-content\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.615963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-utilities\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " 
pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.636186 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grfn7\" (UniqueName: \"kubernetes.io/projected/f5f28735-71ec-44d0-b16c-b2478500bd74-kube-api-access-grfn7\") pod \"redhat-marketplace-qcdwf\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:20 crc kubenswrapper[4786]: I0313 16:10:20.782119 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:21 crc kubenswrapper[4786]: I0313 16:10:21.225523 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcdwf"] Mar 13 16:10:21 crc kubenswrapper[4786]: W0313 16:10:21.430359 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f28735_71ec_44d0_b16c_b2478500bd74.slice/crio-f9b48c00bc0df2bcf5f5d549af5fe8373ebd1ef93c44b2273466643412b646ac WatchSource:0}: Error finding container f9b48c00bc0df2bcf5f5d549af5fe8373ebd1ef93c44b2273466643412b646ac: Status 404 returned error can't find the container with id f9b48c00bc0df2bcf5f5d549af5fe8373ebd1ef93c44b2273466643412b646ac Mar 13 16:10:21 crc kubenswrapper[4786]: I0313 16:10:21.501768 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcdwf" event={"ID":"f5f28735-71ec-44d0-b16c-b2478500bd74","Type":"ContainerStarted","Data":"f9b48c00bc0df2bcf5f5d549af5fe8373ebd1ef93c44b2273466643412b646ac"} Mar 13 16:10:22 crc kubenswrapper[4786]: I0313 16:10:22.509540 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerID="6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215" exitCode=0 Mar 13 16:10:22 crc kubenswrapper[4786]: I0313 16:10:22.509832 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcdwf" event={"ID":"f5f28735-71ec-44d0-b16c-b2478500bd74","Type":"ContainerDied","Data":"6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215"} Mar 13 16:10:22 crc kubenswrapper[4786]: I0313 16:10:22.512642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerStarted","Data":"f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600"} Mar 13 16:10:22 crc kubenswrapper[4786]: I0313 16:10:22.554841 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6k8lh" podStartSLOduration=4.081080211 podStartE2EDuration="6.554824525s" podCreationTimestamp="2026-03-13 16:10:16 +0000 UTC" firstStartedPulling="2026-03-13 16:10:18.467202517 +0000 UTC m=+4048.630414348" lastFinishedPulling="2026-03-13 16:10:20.940946851 +0000 UTC m=+4051.104158662" observedRunningTime="2026-03-13 16:10:22.548736941 +0000 UTC m=+4052.711948742" watchObservedRunningTime="2026-03-13 16:10:22.554824525 +0000 UTC m=+4052.718036336" Mar 13 16:10:23 crc kubenswrapper[4786]: I0313 16:10:23.521413 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerID="4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54" exitCode=0 Mar 13 16:10:23 crc kubenswrapper[4786]: I0313 16:10:23.521498 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcdwf" event={"ID":"f5f28735-71ec-44d0-b16c-b2478500bd74","Type":"ContainerDied","Data":"4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54"} Mar 13 16:10:24 crc kubenswrapper[4786]: I0313 16:10:24.529499 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcdwf" 
event={"ID":"f5f28735-71ec-44d0-b16c-b2478500bd74","Type":"ContainerStarted","Data":"d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee"} Mar 13 16:10:24 crc kubenswrapper[4786]: I0313 16:10:24.552969 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qcdwf" podStartSLOduration=3.144648125 podStartE2EDuration="4.552947558s" podCreationTimestamp="2026-03-13 16:10:20 +0000 UTC" firstStartedPulling="2026-03-13 16:10:22.511518239 +0000 UTC m=+4052.674730050" lastFinishedPulling="2026-03-13 16:10:23.919817632 +0000 UTC m=+4054.083029483" observedRunningTime="2026-03-13 16:10:24.548024313 +0000 UTC m=+4054.711236124" watchObservedRunningTime="2026-03-13 16:10:24.552947558 +0000 UTC m=+4054.716159379" Mar 13 16:10:26 crc kubenswrapper[4786]: I0313 16:10:26.914016 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:26 crc kubenswrapper[4786]: I0313 16:10:26.914740 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:26 crc kubenswrapper[4786]: I0313 16:10:26.987526 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:27 crc kubenswrapper[4786]: I0313 16:10:27.552218 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:10:27 crc kubenswrapper[4786]: E0313 16:10:27.552895 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:10:27 crc kubenswrapper[4786]: I0313 16:10:27.605537 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:27 crc kubenswrapper[4786]: I0313 16:10:27.979239 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6k8lh"] Mar 13 16:10:29 crc kubenswrapper[4786]: I0313 16:10:29.565906 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6k8lh" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="registry-server" containerID="cri-o://f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600" gracePeriod=2 Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.113196 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.300682 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbnm2\" (UniqueName: \"kubernetes.io/projected/b32bc6c0-96a2-4003-a586-b6cb86233de2-kube-api-access-dbnm2\") pod \"b32bc6c0-96a2-4003-a586-b6cb86233de2\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.300786 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-catalog-content\") pod \"b32bc6c0-96a2-4003-a586-b6cb86233de2\" (UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.302256 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-utilities\") pod \"b32bc6c0-96a2-4003-a586-b6cb86233de2\" 
(UID: \"b32bc6c0-96a2-4003-a586-b6cb86233de2\") " Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.303659 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-utilities" (OuterVolumeSpecName: "utilities") pod "b32bc6c0-96a2-4003-a586-b6cb86233de2" (UID: "b32bc6c0-96a2-4003-a586-b6cb86233de2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.391909 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b32bc6c0-96a2-4003-a586-b6cb86233de2" (UID: "b32bc6c0-96a2-4003-a586-b6cb86233de2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.403946 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.404019 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32bc6c0-96a2-4003-a586-b6cb86233de2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.576787 4786 generic.go:334] "Generic (PLEG): container finished" podID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerID="f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600" exitCode=0 Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.576852 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" 
event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerDied","Data":"f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600"} Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.576910 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6k8lh" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.576936 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6k8lh" event={"ID":"b32bc6c0-96a2-4003-a586-b6cb86233de2","Type":"ContainerDied","Data":"bf51ef42c17c8ae72289a1b65fd6b330fb2fb8b81057eba814d1d56af2302e67"} Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.576966 4786 scope.go:117] "RemoveContainer" containerID="f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.615417 4786 scope.go:117] "RemoveContainer" containerID="d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.782335 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.782444 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.824022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32bc6c0-96a2-4003-a586-b6cb86233de2-kube-api-access-dbnm2" (OuterVolumeSpecName: "kube-api-access-dbnm2") pod "b32bc6c0-96a2-4003-a586-b6cb86233de2" (UID: "b32bc6c0-96a2-4003-a586-b6cb86233de2"). InnerVolumeSpecName "kube-api-access-dbnm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.846826 4786 scope.go:117] "RemoveContainer" containerID="28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.867685 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.887632 4786 scope.go:117] "RemoveContainer" containerID="f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600" Mar 13 16:10:30 crc kubenswrapper[4786]: E0313 16:10:30.888302 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600\": container with ID starting with f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600 not found: ID does not exist" containerID="f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.888336 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600"} err="failed to get container status \"f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600\": rpc error: code = NotFound desc = could not find container \"f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600\": container with ID starting with f5b820a9e0279bd8850d3fbb95dfe349a841d7029f21b644b06ca7b7ca074600 not found: ID does not exist" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.888355 4786 scope.go:117] "RemoveContainer" containerID="d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc" Mar 13 16:10:30 crc kubenswrapper[4786]: E0313 16:10:30.888563 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc\": container with ID starting with d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc not found: ID does not exist" containerID="d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.888587 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc"} err="failed to get container status \"d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc\": rpc error: code = NotFound desc = could not find container \"d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc\": container with ID starting with d7ce92b6eeae2cd03bcfb52981ba2faf0e752263906e53535689d581fb76f8cc not found: ID does not exist" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.888601 4786 scope.go:117] "RemoveContainer" containerID="28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5" Mar 13 16:10:30 crc kubenswrapper[4786]: E0313 16:10:30.889124 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5\": container with ID starting with 28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5 not found: ID does not exist" containerID="28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.889189 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5"} err="failed to get container status \"28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5\": rpc error: code = NotFound desc = could not find container 
\"28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5\": container with ID starting with 28cf4aa228926fa36baddbdde1eedff3058c06e8575f82c94ac45de70d1238a5 not found: ID does not exist" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.911703 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbnm2\" (UniqueName: \"kubernetes.io/projected/b32bc6c0-96a2-4003-a586-b6cb86233de2-kube-api-access-dbnm2\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.933084 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6k8lh"] Mar 13 16:10:30 crc kubenswrapper[4786]: I0313 16:10:30.939346 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6k8lh"] Mar 13 16:10:31 crc kubenswrapper[4786]: I0313 16:10:31.623951 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:32 crc kubenswrapper[4786]: I0313 16:10:32.566409 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" path="/var/lib/kubelet/pods/b32bc6c0-96a2-4003-a586-b6cb86233de2/volumes" Mar 13 16:10:33 crc kubenswrapper[4786]: I0313 16:10:33.173182 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcdwf"] Mar 13 16:10:34 crc kubenswrapper[4786]: I0313 16:10:34.610790 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qcdwf" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerName="registry-server" containerID="cri-o://d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee" gracePeriod=2 Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.096705 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.284565 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-catalog-content\") pod \"f5f28735-71ec-44d0-b16c-b2478500bd74\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.284724 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grfn7\" (UniqueName: \"kubernetes.io/projected/f5f28735-71ec-44d0-b16c-b2478500bd74-kube-api-access-grfn7\") pod \"f5f28735-71ec-44d0-b16c-b2478500bd74\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.284842 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-utilities\") pod \"f5f28735-71ec-44d0-b16c-b2478500bd74\" (UID: \"f5f28735-71ec-44d0-b16c-b2478500bd74\") " Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.286596 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-utilities" (OuterVolumeSpecName: "utilities") pod "f5f28735-71ec-44d0-b16c-b2478500bd74" (UID: "f5f28735-71ec-44d0-b16c-b2478500bd74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.293163 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f28735-71ec-44d0-b16c-b2478500bd74-kube-api-access-grfn7" (OuterVolumeSpecName: "kube-api-access-grfn7") pod "f5f28735-71ec-44d0-b16c-b2478500bd74" (UID: "f5f28735-71ec-44d0-b16c-b2478500bd74"). InnerVolumeSpecName "kube-api-access-grfn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.318960 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5f28735-71ec-44d0-b16c-b2478500bd74" (UID: "f5f28735-71ec-44d0-b16c-b2478500bd74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.385949 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.385980 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5f28735-71ec-44d0-b16c-b2478500bd74-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.385991 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grfn7\" (UniqueName: \"kubernetes.io/projected/f5f28735-71ec-44d0-b16c-b2478500bd74-kube-api-access-grfn7\") on node \"crc\" DevicePath \"\"" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.626643 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerID="d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee" exitCode=0 Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.626759 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcdwf" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.626811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcdwf" event={"ID":"f5f28735-71ec-44d0-b16c-b2478500bd74","Type":"ContainerDied","Data":"d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee"} Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.628295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcdwf" event={"ID":"f5f28735-71ec-44d0-b16c-b2478500bd74","Type":"ContainerDied","Data":"f9b48c00bc0df2bcf5f5d549af5fe8373ebd1ef93c44b2273466643412b646ac"} Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.628336 4786 scope.go:117] "RemoveContainer" containerID="d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.670590 4786 scope.go:117] "RemoveContainer" containerID="4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.681141 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcdwf"] Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.692139 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcdwf"] Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.700184 4786 scope.go:117] "RemoveContainer" containerID="6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.725358 4786 scope.go:117] "RemoveContainer" containerID="d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee" Mar 13 16:10:35 crc kubenswrapper[4786]: E0313 16:10:35.725930 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee\": container with ID starting with d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee not found: ID does not exist" containerID="d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.725974 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee"} err="failed to get container status \"d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee\": rpc error: code = NotFound desc = could not find container \"d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee\": container with ID starting with d1317863d9c451eb579dd4a5933980510095a345478ddc706ed8160fb95ccbee not found: ID does not exist" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.726008 4786 scope.go:117] "RemoveContainer" containerID="4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54" Mar 13 16:10:35 crc kubenswrapper[4786]: E0313 16:10:35.726424 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54\": container with ID starting with 4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54 not found: ID does not exist" containerID="4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.726461 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54"} err="failed to get container status \"4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54\": rpc error: code = NotFound desc = could not find container \"4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54\": container with ID 
starting with 4a5ec2398cc03ae7e98a4c87124b2287cdf97cd9c72b729a8aadd4cc9d74bc54 not found: ID does not exist" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.726483 4786 scope.go:117] "RemoveContainer" containerID="6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215" Mar 13 16:10:35 crc kubenswrapper[4786]: E0313 16:10:35.726929 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215\": container with ID starting with 6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215 not found: ID does not exist" containerID="6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215" Mar 13 16:10:35 crc kubenswrapper[4786]: I0313 16:10:35.726960 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215"} err="failed to get container status \"6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215\": rpc error: code = NotFound desc = could not find container \"6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215\": container with ID starting with 6bbce035888d7f172dd4b3a1eabf17a647474af673b0a4ac096ab1d9df0c8215 not found: ID does not exist" Mar 13 16:10:36 crc kubenswrapper[4786]: I0313 16:10:36.569373 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" path="/var/lib/kubelet/pods/f5f28735-71ec-44d0-b16c-b2478500bd74/volumes" Mar 13 16:10:38 crc kubenswrapper[4786]: I0313 16:10:38.552564 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:10:38 crc kubenswrapper[4786]: E0313 16:10:38.553138 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:10:52 crc kubenswrapper[4786]: I0313 16:10:52.552017 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:10:52 crc kubenswrapper[4786]: E0313 16:10:52.552660 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:11:07 crc kubenswrapper[4786]: I0313 16:11:07.552682 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:11:07 crc kubenswrapper[4786]: E0313 16:11:07.553952 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:11:21 crc kubenswrapper[4786]: I0313 16:11:21.552030 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:11:21 crc kubenswrapper[4786]: E0313 16:11:21.552787 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:11:35 crc kubenswrapper[4786]: I0313 16:11:35.552094 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:11:35 crc kubenswrapper[4786]: E0313 16:11:35.552900 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:11:47 crc kubenswrapper[4786]: I0313 16:11:47.552385 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:11:47 crc kubenswrapper[4786]: E0313 16:11:47.553232 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.150129 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556972-92g9c"] Mar 13 16:12:00 crc kubenswrapper[4786]: E0313 16:12:00.150983 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" 
containerName="extract-utilities" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151000 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerName="extract-utilities" Mar 13 16:12:00 crc kubenswrapper[4786]: E0313 16:12:00.151017 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="registry-server" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151024 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="registry-server" Mar 13 16:12:00 crc kubenswrapper[4786]: E0313 16:12:00.151043 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="extract-content" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151050 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="extract-content" Mar 13 16:12:00 crc kubenswrapper[4786]: E0313 16:12:00.151060 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="extract-utilities" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151066 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="extract-utilities" Mar 13 16:12:00 crc kubenswrapper[4786]: E0313 16:12:00.151078 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerName="extract-content" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151084 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerName="extract-content" Mar 13 16:12:00 crc kubenswrapper[4786]: E0313 16:12:00.151091 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" 
containerName="registry-server" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151096 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerName="registry-server" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151234 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f28735-71ec-44d0-b16c-b2478500bd74" containerName="registry-server" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151248 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32bc6c0-96a2-4003-a586-b6cb86233de2" containerName="registry-server" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.151653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.154482 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.154691 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.160565 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556972-92g9c"] Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.179739 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.185154 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg6f7\" (UniqueName: \"kubernetes.io/projected/5519dba9-403d-424b-b6c9-edcca47f98c4-kube-api-access-dg6f7\") pod \"auto-csr-approver-29556972-92g9c\" (UID: \"5519dba9-403d-424b-b6c9-edcca47f98c4\") " pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:00 crc 
kubenswrapper[4786]: I0313 16:12:00.286998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg6f7\" (UniqueName: \"kubernetes.io/projected/5519dba9-403d-424b-b6c9-edcca47f98c4-kube-api-access-dg6f7\") pod \"auto-csr-approver-29556972-92g9c\" (UID: \"5519dba9-403d-424b-b6c9-edcca47f98c4\") " pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.314529 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg6f7\" (UniqueName: \"kubernetes.io/projected/5519dba9-403d-424b-b6c9-edcca47f98c4-kube-api-access-dg6f7\") pod \"auto-csr-approver-29556972-92g9c\" (UID: \"5519dba9-403d-424b-b6c9-edcca47f98c4\") " pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.492247 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:00 crc kubenswrapper[4786]: I0313 16:12:00.760224 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556972-92g9c"] Mar 13 16:12:00 crc kubenswrapper[4786]: W0313 16:12:00.770218 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5519dba9_403d_424b_b6c9_edcca47f98c4.slice/crio-1f29f340408ce8e97f2e0082ca4ea840a9184f5c09ae9da13a0eaa468c477faf WatchSource:0}: Error finding container 1f29f340408ce8e97f2e0082ca4ea840a9184f5c09ae9da13a0eaa468c477faf: Status 404 returned error can't find the container with id 1f29f340408ce8e97f2e0082ca4ea840a9184f5c09ae9da13a0eaa468c477faf Mar 13 16:12:01 crc kubenswrapper[4786]: I0313 16:12:01.362568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556972-92g9c" 
event={"ID":"5519dba9-403d-424b-b6c9-edcca47f98c4","Type":"ContainerStarted","Data":"1f29f340408ce8e97f2e0082ca4ea840a9184f5c09ae9da13a0eaa468c477faf"} Mar 13 16:12:02 crc kubenswrapper[4786]: I0313 16:12:02.551743 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:12:02 crc kubenswrapper[4786]: E0313 16:12:02.552295 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:12:03 crc kubenswrapper[4786]: I0313 16:12:03.377630 4786 generic.go:334] "Generic (PLEG): container finished" podID="5519dba9-403d-424b-b6c9-edcca47f98c4" containerID="75ea79ffb36aee6d7704ce8f64168c018a388a256495d868c67a9d929f55f849" exitCode=0 Mar 13 16:12:03 crc kubenswrapper[4786]: I0313 16:12:03.377670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556972-92g9c" event={"ID":"5519dba9-403d-424b-b6c9-edcca47f98c4","Type":"ContainerDied","Data":"75ea79ffb36aee6d7704ce8f64168c018a388a256495d868c67a9d929f55f849"} Mar 13 16:12:04 crc kubenswrapper[4786]: I0313 16:12:04.690261 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:04 crc kubenswrapper[4786]: I0313 16:12:04.850583 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg6f7\" (UniqueName: \"kubernetes.io/projected/5519dba9-403d-424b-b6c9-edcca47f98c4-kube-api-access-dg6f7\") pod \"5519dba9-403d-424b-b6c9-edcca47f98c4\" (UID: \"5519dba9-403d-424b-b6c9-edcca47f98c4\") " Mar 13 16:12:04 crc kubenswrapper[4786]: I0313 16:12:04.857603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5519dba9-403d-424b-b6c9-edcca47f98c4-kube-api-access-dg6f7" (OuterVolumeSpecName: "kube-api-access-dg6f7") pod "5519dba9-403d-424b-b6c9-edcca47f98c4" (UID: "5519dba9-403d-424b-b6c9-edcca47f98c4"). InnerVolumeSpecName "kube-api-access-dg6f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:12:04 crc kubenswrapper[4786]: I0313 16:12:04.952666 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg6f7\" (UniqueName: \"kubernetes.io/projected/5519dba9-403d-424b-b6c9-edcca47f98c4-kube-api-access-dg6f7\") on node \"crc\" DevicePath \"\"" Mar 13 16:12:05 crc kubenswrapper[4786]: I0313 16:12:05.395924 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556972-92g9c" event={"ID":"5519dba9-403d-424b-b6c9-edcca47f98c4","Type":"ContainerDied","Data":"1f29f340408ce8e97f2e0082ca4ea840a9184f5c09ae9da13a0eaa468c477faf"} Mar 13 16:12:05 crc kubenswrapper[4786]: I0313 16:12:05.395960 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f29f340408ce8e97f2e0082ca4ea840a9184f5c09ae9da13a0eaa468c477faf" Mar 13 16:12:05 crc kubenswrapper[4786]: I0313 16:12:05.395979 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556972-92g9c" Mar 13 16:12:05 crc kubenswrapper[4786]: I0313 16:12:05.799998 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556966-gw7lr"] Mar 13 16:12:05 crc kubenswrapper[4786]: I0313 16:12:05.807337 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556966-gw7lr"] Mar 13 16:12:06 crc kubenswrapper[4786]: I0313 16:12:06.565340 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fec79d5-fb54-45d8-9dfa-61dd1a202814" path="/var/lib/kubelet/pods/0fec79d5-fb54-45d8-9dfa-61dd1a202814/volumes" Mar 13 16:12:12 crc kubenswrapper[4786]: I0313 16:12:12.624180 4786 scope.go:117] "RemoveContainer" containerID="97796641b3993e808da63732bea2406630273f6d13d3d101e4e4c73523276d02" Mar 13 16:12:14 crc kubenswrapper[4786]: I0313 16:12:14.552182 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:12:14 crc kubenswrapper[4786]: E0313 16:12:14.552605 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:12:27 crc kubenswrapper[4786]: I0313 16:12:27.551780 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:12:27 crc kubenswrapper[4786]: E0313 16:12:27.552974 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.104787 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p8cnt"] Mar 13 16:12:31 crc kubenswrapper[4786]: E0313 16:12:31.106407 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5519dba9-403d-424b-b6c9-edcca47f98c4" containerName="oc" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.106446 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5519dba9-403d-424b-b6c9-edcca47f98c4" containerName="oc" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.106831 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5519dba9-403d-424b-b6c9-edcca47f98c4" containerName="oc" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.109457 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.121384 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8cnt"] Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.289584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-utilities\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.289732 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-catalog-content\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.289952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfnkz\" (UniqueName: \"kubernetes.io/projected/e411c5b7-551f-4e3c-8cd8-d8539b744125-kube-api-access-pfnkz\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.391405 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-utilities\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.391461 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-catalog-content\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.391530 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfnkz\" (UniqueName: \"kubernetes.io/projected/e411c5b7-551f-4e3c-8cd8-d8539b744125-kube-api-access-pfnkz\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.391999 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-utilities\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.392141 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-catalog-content\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.420184 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfnkz\" (UniqueName: \"kubernetes.io/projected/e411c5b7-551f-4e3c-8cd8-d8539b744125-kube-api-access-pfnkz\") pod \"community-operators-p8cnt\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.442462 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:31 crc kubenswrapper[4786]: I0313 16:12:31.770276 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8cnt"] Mar 13 16:12:32 crc kubenswrapper[4786]: I0313 16:12:32.643398 4786 generic.go:334] "Generic (PLEG): container finished" podID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerID="780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1" exitCode=0 Mar 13 16:12:32 crc kubenswrapper[4786]: I0313 16:12:32.643586 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8cnt" event={"ID":"e411c5b7-551f-4e3c-8cd8-d8539b744125","Type":"ContainerDied","Data":"780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1"} Mar 13 16:12:32 crc kubenswrapper[4786]: I0313 16:12:32.645015 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8cnt" event={"ID":"e411c5b7-551f-4e3c-8cd8-d8539b744125","Type":"ContainerStarted","Data":"1c4b41c9eb818c6473de3645dc9b30438ad700d6f5c099c82e29732d64408046"} Mar 13 16:12:34 crc kubenswrapper[4786]: I0313 16:12:34.672104 4786 generic.go:334] "Generic (PLEG): container finished" podID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerID="6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef" exitCode=0 Mar 13 16:12:34 crc kubenswrapper[4786]: I0313 16:12:34.672251 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8cnt" event={"ID":"e411c5b7-551f-4e3c-8cd8-d8539b744125","Type":"ContainerDied","Data":"6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef"} Mar 13 16:12:35 crc kubenswrapper[4786]: I0313 16:12:35.682492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8cnt" 
event={"ID":"e411c5b7-551f-4e3c-8cd8-d8539b744125","Type":"ContainerStarted","Data":"2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376"} Mar 13 16:12:35 crc kubenswrapper[4786]: I0313 16:12:35.702893 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p8cnt" podStartSLOduration=2.165278522 podStartE2EDuration="4.702874441s" podCreationTimestamp="2026-03-13 16:12:31 +0000 UTC" firstStartedPulling="2026-03-13 16:12:32.646625743 +0000 UTC m=+4182.809837574" lastFinishedPulling="2026-03-13 16:12:35.184221652 +0000 UTC m=+4185.347433493" observedRunningTime="2026-03-13 16:12:35.700230164 +0000 UTC m=+4185.863442005" watchObservedRunningTime="2026-03-13 16:12:35.702874441 +0000 UTC m=+4185.866086262" Mar 13 16:12:39 crc kubenswrapper[4786]: I0313 16:12:39.552271 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:12:39 crc kubenswrapper[4786]: E0313 16:12:39.552745 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:12:41 crc kubenswrapper[4786]: I0313 16:12:41.442970 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:41 crc kubenswrapper[4786]: I0313 16:12:41.443334 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:41 crc kubenswrapper[4786]: I0313 16:12:41.532824 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:41 crc kubenswrapper[4786]: I0313 16:12:41.788846 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:41 crc kubenswrapper[4786]: I0313 16:12:41.855400 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8cnt"] Mar 13 16:12:43 crc kubenswrapper[4786]: I0313 16:12:43.753795 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p8cnt" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="registry-server" containerID="cri-o://2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376" gracePeriod=2 Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.717920 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.777459 4786 generic.go:334] "Generic (PLEG): container finished" podID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerID="2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376" exitCode=0 Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.777507 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8cnt" event={"ID":"e411c5b7-551f-4e3c-8cd8-d8539b744125","Type":"ContainerDied","Data":"2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376"} Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.777531 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8cnt" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.777538 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8cnt" event={"ID":"e411c5b7-551f-4e3c-8cd8-d8539b744125","Type":"ContainerDied","Data":"1c4b41c9eb818c6473de3645dc9b30438ad700d6f5c099c82e29732d64408046"} Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.777558 4786 scope.go:117] "RemoveContainer" containerID="2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.797704 4786 scope.go:117] "RemoveContainer" containerID="6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.821334 4786 scope.go:117] "RemoveContainer" containerID="780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.846246 4786 scope.go:117] "RemoveContainer" containerID="2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376" Mar 13 16:12:44 crc kubenswrapper[4786]: E0313 16:12:44.846899 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376\": container with ID starting with 2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376 not found: ID does not exist" containerID="2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.846955 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376"} err="failed to get container status \"2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376\": rpc error: code = NotFound desc = could not find container 
\"2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376\": container with ID starting with 2750081537c65af2c9f4792774b2c563455d89e0198c99f077c911eda80ea376 not found: ID does not exist" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.846985 4786 scope.go:117] "RemoveContainer" containerID="6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef" Mar 13 16:12:44 crc kubenswrapper[4786]: E0313 16:12:44.847450 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef\": container with ID starting with 6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef not found: ID does not exist" containerID="6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.847503 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef"} err="failed to get container status \"6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef\": rpc error: code = NotFound desc = could not find container \"6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef\": container with ID starting with 6c692401eda67e5bf94dea9bda7e16b7a27b61f5ae564ad2610d82fedd0ae4ef not found: ID does not exist" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.847536 4786 scope.go:117] "RemoveContainer" containerID="780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1" Mar 13 16:12:44 crc kubenswrapper[4786]: E0313 16:12:44.848043 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1\": container with ID starting with 780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1 not found: ID does not exist" 
containerID="780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.848074 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1"} err="failed to get container status \"780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1\": rpc error: code = NotFound desc = could not find container \"780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1\": container with ID starting with 780b6efcb4b17731b439f1ab7b1ff2f6262106457dacc0d2491b9a32cd55f5e1 not found: ID does not exist" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.919061 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-utilities\") pod \"e411c5b7-551f-4e3c-8cd8-d8539b744125\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.919152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-catalog-content\") pod \"e411c5b7-551f-4e3c-8cd8-d8539b744125\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.919189 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfnkz\" (UniqueName: \"kubernetes.io/projected/e411c5b7-551f-4e3c-8cd8-d8539b744125-kube-api-access-pfnkz\") pod \"e411c5b7-551f-4e3c-8cd8-d8539b744125\" (UID: \"e411c5b7-551f-4e3c-8cd8-d8539b744125\") " Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.920316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-utilities" (OuterVolumeSpecName: "utilities") pod 
"e411c5b7-551f-4e3c-8cd8-d8539b744125" (UID: "e411c5b7-551f-4e3c-8cd8-d8539b744125"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.925189 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e411c5b7-551f-4e3c-8cd8-d8539b744125-kube-api-access-pfnkz" (OuterVolumeSpecName: "kube-api-access-pfnkz") pod "e411c5b7-551f-4e3c-8cd8-d8539b744125" (UID: "e411c5b7-551f-4e3c-8cd8-d8539b744125"). InnerVolumeSpecName "kube-api-access-pfnkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:12:44 crc kubenswrapper[4786]: I0313 16:12:44.974023 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e411c5b7-551f-4e3c-8cd8-d8539b744125" (UID: "e411c5b7-551f-4e3c-8cd8-d8539b744125"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:12:45 crc kubenswrapper[4786]: I0313 16:12:45.021198 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:12:45 crc kubenswrapper[4786]: I0313 16:12:45.021241 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e411c5b7-551f-4e3c-8cd8-d8539b744125-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:12:45 crc kubenswrapper[4786]: I0313 16:12:45.021259 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfnkz\" (UniqueName: \"kubernetes.io/projected/e411c5b7-551f-4e3c-8cd8-d8539b744125-kube-api-access-pfnkz\") on node \"crc\" DevicePath \"\"" Mar 13 16:12:45 crc kubenswrapper[4786]: I0313 16:12:45.130628 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8cnt"] Mar 13 16:12:45 crc kubenswrapper[4786]: I0313 16:12:45.142406 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p8cnt"] Mar 13 16:12:46 crc kubenswrapper[4786]: I0313 16:12:46.569299 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" path="/var/lib/kubelet/pods/e411c5b7-551f-4e3c-8cd8-d8539b744125/volumes" Mar 13 16:12:52 crc kubenswrapper[4786]: I0313 16:12:52.552672 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:12:52 crc kubenswrapper[4786]: E0313 16:12:52.553711 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:13:03 crc kubenswrapper[4786]: I0313 16:13:03.552105 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:13:03 crc kubenswrapper[4786]: E0313 16:13:03.553062 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:13:17 crc kubenswrapper[4786]: I0313 16:13:17.552088 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69" Mar 13 16:13:18 crc kubenswrapper[4786]: I0313 16:13:18.147806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"1c584aa7af64cdbfd616633a49b3c66f7a8d9bf9e37408ccff7e664544ab8091"} Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.156391 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556974-mzrb9"] Mar 13 16:14:00 crc kubenswrapper[4786]: E0313 16:14:00.157399 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="extract-content" Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.157416 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="extract-content" Mar 13 16:14:00 crc 
kubenswrapper[4786]: E0313 16:14:00.157444 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="registry-server" Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.157451 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="registry-server" Mar 13 16:14:00 crc kubenswrapper[4786]: E0313 16:14:00.157475 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="extract-utilities" Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.157483 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="extract-utilities" Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.157637 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e411c5b7-551f-4e3c-8cd8-d8539b744125" containerName="registry-server" Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.158267 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.159953 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.160956 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.162599 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.172136 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556974-mzrb9"]
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.270220 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcwt9\" (UniqueName: \"kubernetes.io/projected/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422-kube-api-access-zcwt9\") pod \"auto-csr-approver-29556974-mzrb9\" (UID: \"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422\") " pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.371565 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcwt9\" (UniqueName: \"kubernetes.io/projected/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422-kube-api-access-zcwt9\") pod \"auto-csr-approver-29556974-mzrb9\" (UID: \"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422\") " pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.395181 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcwt9\" (UniqueName: \"kubernetes.io/projected/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422-kube-api-access-zcwt9\") pod \"auto-csr-approver-29556974-mzrb9\" (UID: \"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422\") " pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.476034 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:00 crc kubenswrapper[4786]: I0313 16:14:00.905257 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556974-mzrb9"]
Mar 13 16:14:01 crc kubenswrapper[4786]: I0313 16:14:01.519524 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556974-mzrb9" event={"ID":"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422","Type":"ContainerStarted","Data":"256525c5d064acd48763622d0c9e0506a1fb1f4526354b3d4f5e295637bd8661"}
Mar 13 16:14:03 crc kubenswrapper[4786]: I0313 16:14:03.542657 4786 generic.go:334] "Generic (PLEG): container finished" podID="3a6a60d6-24bc-4c1e-bfb4-efb2d983b422" containerID="208b81c33dd69dc0ae0f70ee403742bc844eaaa413b08a20ce34e9f9e3631c66" exitCode=0
Mar 13 16:14:03 crc kubenswrapper[4786]: I0313 16:14:03.542893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556974-mzrb9" event={"ID":"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422","Type":"ContainerDied","Data":"208b81c33dd69dc0ae0f70ee403742bc844eaaa413b08a20ce34e9f9e3631c66"}
Mar 13 16:14:04 crc kubenswrapper[4786]: I0313 16:14:04.831264 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:04 crc kubenswrapper[4786]: I0313 16:14:04.931665 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcwt9\" (UniqueName: \"kubernetes.io/projected/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422-kube-api-access-zcwt9\") pod \"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422\" (UID: \"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422\") "
Mar 13 16:14:04 crc kubenswrapper[4786]: I0313 16:14:04.942439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422-kube-api-access-zcwt9" (OuterVolumeSpecName: "kube-api-access-zcwt9") pod "3a6a60d6-24bc-4c1e-bfb4-efb2d983b422" (UID: "3a6a60d6-24bc-4c1e-bfb4-efb2d983b422"). InnerVolumeSpecName "kube-api-access-zcwt9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:14:05 crc kubenswrapper[4786]: I0313 16:14:05.033630 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcwt9\" (UniqueName: \"kubernetes.io/projected/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422-kube-api-access-zcwt9\") on node \"crc\" DevicePath \"\""
Mar 13 16:14:05 crc kubenswrapper[4786]: I0313 16:14:05.557235 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556974-mzrb9" event={"ID":"3a6a60d6-24bc-4c1e-bfb4-efb2d983b422","Type":"ContainerDied","Data":"256525c5d064acd48763622d0c9e0506a1fb1f4526354b3d4f5e295637bd8661"}
Mar 13 16:14:05 crc kubenswrapper[4786]: I0313 16:14:05.557282 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="256525c5d064acd48763622d0c9e0506a1fb1f4526354b3d4f5e295637bd8661"
Mar 13 16:14:05 crc kubenswrapper[4786]: I0313 16:14:05.557293 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556974-mzrb9"
Mar 13 16:14:05 crc kubenswrapper[4786]: I0313 16:14:05.908123 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556968-996h4"]
Mar 13 16:14:05 crc kubenswrapper[4786]: I0313 16:14:05.920058 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556968-996h4"]
Mar 13 16:14:06 crc kubenswrapper[4786]: I0313 16:14:06.562821 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aaf938a-e3c6-49de-80b3-80c48d1d6a71" path="/var/lib/kubelet/pods/3aaf938a-e3c6-49de-80b3-80c48d1d6a71/volumes"
Mar 13 16:14:12 crc kubenswrapper[4786]: I0313 16:14:12.711628 4786 scope.go:117] "RemoveContainer" containerID="09107f4a85999e2a54d60f29744a425150bad4fb0c04001d79894b322e0df740"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.156536 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"]
Mar 13 16:15:00 crc kubenswrapper[4786]: E0313 16:15:00.157324 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a6a60d6-24bc-4c1e-bfb4-efb2d983b422" containerName="oc"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.157336 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a6a60d6-24bc-4c1e-bfb4-efb2d983b422" containerName="oc"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.157458 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a6a60d6-24bc-4c1e-bfb4-efb2d983b422" containerName="oc"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.157896 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.160598 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.161440 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.174513 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"]
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.317236 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ac97daa-9ad6-411b-9e46-a0e08cb55866-config-volume\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.317327 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ac97daa-9ad6-411b-9e46-a0e08cb55866-secret-volume\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.317589 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czfgp\" (UniqueName: \"kubernetes.io/projected/6ac97daa-9ad6-411b-9e46-a0e08cb55866-kube-api-access-czfgp\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.419335 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czfgp\" (UniqueName: \"kubernetes.io/projected/6ac97daa-9ad6-411b-9e46-a0e08cb55866-kube-api-access-czfgp\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.419399 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ac97daa-9ad6-411b-9e46-a0e08cb55866-config-volume\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.419432 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ac97daa-9ad6-411b-9e46-a0e08cb55866-secret-volume\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.420569 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ac97daa-9ad6-411b-9e46-a0e08cb55866-config-volume\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.627218 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ac97daa-9ad6-411b-9e46-a0e08cb55866-secret-volume\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.628359 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czfgp\" (UniqueName: \"kubernetes.io/projected/6ac97daa-9ad6-411b-9e46-a0e08cb55866-kube-api-access-czfgp\") pod \"collect-profiles-29556975-kp2mf\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:00 crc kubenswrapper[4786]: I0313 16:15:00.779150 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:01 crc kubenswrapper[4786]: I0313 16:15:01.233561 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"]
Mar 13 16:15:02 crc kubenswrapper[4786]: I0313 16:15:02.024617 4786 generic.go:334] "Generic (PLEG): container finished" podID="6ac97daa-9ad6-411b-9e46-a0e08cb55866" containerID="88b349537522e7778a2c85e8a0c5a34a5408e9933a9d4b7aa0b9375cb2f1c15b" exitCode=0
Mar 13 16:15:02 crc kubenswrapper[4786]: I0313 16:15:02.024671 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf" event={"ID":"6ac97daa-9ad6-411b-9e46-a0e08cb55866","Type":"ContainerDied","Data":"88b349537522e7778a2c85e8a0c5a34a5408e9933a9d4b7aa0b9375cb2f1c15b"}
Mar 13 16:15:02 crc kubenswrapper[4786]: I0313 16:15:02.025042 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf" event={"ID":"6ac97daa-9ad6-411b-9e46-a0e08cb55866","Type":"ContainerStarted","Data":"29124abbf97ac42393f0c9d7f959cac24d40117e9c7646fab297d67a6904ba66"}
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.278157 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.367875 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ac97daa-9ad6-411b-9e46-a0e08cb55866-secret-volume\") pod \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") "
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.367983 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czfgp\" (UniqueName: \"kubernetes.io/projected/6ac97daa-9ad6-411b-9e46-a0e08cb55866-kube-api-access-czfgp\") pod \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") "
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.368017 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ac97daa-9ad6-411b-9e46-a0e08cb55866-config-volume\") pod \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\" (UID: \"6ac97daa-9ad6-411b-9e46-a0e08cb55866\") "
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.368936 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac97daa-9ad6-411b-9e46-a0e08cb55866-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ac97daa-9ad6-411b-9e46-a0e08cb55866" (UID: "6ac97daa-9ad6-411b-9e46-a0e08cb55866"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.373362 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac97daa-9ad6-411b-9e46-a0e08cb55866-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ac97daa-9ad6-411b-9e46-a0e08cb55866" (UID: "6ac97daa-9ad6-411b-9e46-a0e08cb55866"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.382119 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac97daa-9ad6-411b-9e46-a0e08cb55866-kube-api-access-czfgp" (OuterVolumeSpecName: "kube-api-access-czfgp") pod "6ac97daa-9ad6-411b-9e46-a0e08cb55866" (UID: "6ac97daa-9ad6-411b-9e46-a0e08cb55866"). InnerVolumeSpecName "kube-api-access-czfgp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.470180 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czfgp\" (UniqueName: \"kubernetes.io/projected/6ac97daa-9ad6-411b-9e46-a0e08cb55866-kube-api-access-czfgp\") on node \"crc\" DevicePath \"\""
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.470241 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ac97daa-9ad6-411b-9e46-a0e08cb55866-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 16:15:03 crc kubenswrapper[4786]: I0313 16:15:03.470268 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ac97daa-9ad6-411b-9e46-a0e08cb55866-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 16:15:04 crc kubenswrapper[4786]: I0313 16:15:04.043779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf" event={"ID":"6ac97daa-9ad6-411b-9e46-a0e08cb55866","Type":"ContainerDied","Data":"29124abbf97ac42393f0c9d7f959cac24d40117e9c7646fab297d67a6904ba66"}
Mar 13 16:15:04 crc kubenswrapper[4786]: I0313 16:15:04.043838 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29124abbf97ac42393f0c9d7f959cac24d40117e9c7646fab297d67a6904ba66"
Mar 13 16:15:04 crc kubenswrapper[4786]: I0313 16:15:04.043845 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"
Mar 13 16:15:04 crc kubenswrapper[4786]: I0313 16:15:04.378188 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b"]
Mar 13 16:15:04 crc kubenswrapper[4786]: I0313 16:15:04.386135 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-kl66b"]
Mar 13 16:15:04 crc kubenswrapper[4786]: I0313 16:15:04.563607 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ccc7b5-7b80-4239-ae66-964942300583" path="/var/lib/kubelet/pods/25ccc7b5-7b80-4239-ae66-964942300583/volumes"
Mar 13 16:15:12 crc kubenswrapper[4786]: I0313 16:15:12.796800 4786 scope.go:117] "RemoveContainer" containerID="a75c043559eb5f33d9b5551bd16525ea870f9c328e8a6e4b94f066427762e8ab"
Mar 13 16:15:37 crc kubenswrapper[4786]: I0313 16:15:37.869069 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:15:37 crc kubenswrapper[4786]: I0313 16:15:37.869576 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.155319 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556976-fvp9f"]
Mar 13 16:16:00 crc kubenswrapper[4786]: E0313 16:16:00.156799 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac97daa-9ad6-411b-9e46-a0e08cb55866" containerName="collect-profiles"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.156821 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac97daa-9ad6-411b-9e46-a0e08cb55866" containerName="collect-profiles"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.157215 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac97daa-9ad6-411b-9e46-a0e08cb55866" containerName="collect-profiles"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.157784 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.159815 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.160000 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.160144 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.163440 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556976-fvp9f"]
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.256504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v25d9\" (UniqueName: \"kubernetes.io/projected/d8a1a461-9ca2-4056-8458-47a00554a130-kube-api-access-v25d9\") pod \"auto-csr-approver-29556976-fvp9f\" (UID: \"d8a1a461-9ca2-4056-8458-47a00554a130\") " pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.358058 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v25d9\" (UniqueName: \"kubernetes.io/projected/d8a1a461-9ca2-4056-8458-47a00554a130-kube-api-access-v25d9\") pod \"auto-csr-approver-29556976-fvp9f\" (UID: \"d8a1a461-9ca2-4056-8458-47a00554a130\") " pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.381233 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v25d9\" (UniqueName: \"kubernetes.io/projected/d8a1a461-9ca2-4056-8458-47a00554a130-kube-api-access-v25d9\") pod \"auto-csr-approver-29556976-fvp9f\" (UID: \"d8a1a461-9ca2-4056-8458-47a00554a130\") " pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.485531 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.722977 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556976-fvp9f"]
Mar 13 16:16:00 crc kubenswrapper[4786]: I0313 16:16:00.735741 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 16:16:01 crc kubenswrapper[4786]: I0313 16:16:01.528494 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556976-fvp9f" event={"ID":"d8a1a461-9ca2-4056-8458-47a00554a130","Type":"ContainerStarted","Data":"712be36901cbecb5c85125c02ca3bcf8d618e41ee0abd67eaf237d0f1fc28106"}
Mar 13 16:16:02 crc kubenswrapper[4786]: I0313 16:16:02.538473 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556976-fvp9f" event={"ID":"d8a1a461-9ca2-4056-8458-47a00554a130","Type":"ContainerStarted","Data":"7c281f93ce959efc5c6809b73c7c9e3a6bde7b53e6a222112d2aa508428ebec3"}
Mar 13 16:16:02 crc kubenswrapper[4786]: I0313 16:16:02.558232 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556976-fvp9f" podStartSLOduration=1.258296101 podStartE2EDuration="2.558208773s" podCreationTimestamp="2026-03-13 16:16:00 +0000 UTC" firstStartedPulling="2026-03-13 16:16:00.735488536 +0000 UTC m=+4390.898700357" lastFinishedPulling="2026-03-13 16:16:02.035401218 +0000 UTC m=+4392.198613029" observedRunningTime="2026-03-13 16:16:02.55177522 +0000 UTC m=+4392.714987031" watchObservedRunningTime="2026-03-13 16:16:02.558208773 +0000 UTC m=+4392.721420594"
Mar 13 16:16:03 crc kubenswrapper[4786]: I0313 16:16:03.548586 4786 generic.go:334] "Generic (PLEG): container finished" podID="d8a1a461-9ca2-4056-8458-47a00554a130" containerID="7c281f93ce959efc5c6809b73c7c9e3a6bde7b53e6a222112d2aa508428ebec3" exitCode=0
Mar 13 16:16:03 crc kubenswrapper[4786]: I0313 16:16:03.548661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556976-fvp9f" event={"ID":"d8a1a461-9ca2-4056-8458-47a00554a130","Type":"ContainerDied","Data":"7c281f93ce959efc5c6809b73c7c9e3a6bde7b53e6a222112d2aa508428ebec3"}
Mar 13 16:16:04 crc kubenswrapper[4786]: I0313 16:16:04.823686 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:04 crc kubenswrapper[4786]: I0313 16:16:04.939970 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v25d9\" (UniqueName: \"kubernetes.io/projected/d8a1a461-9ca2-4056-8458-47a00554a130-kube-api-access-v25d9\") pod \"d8a1a461-9ca2-4056-8458-47a00554a130\" (UID: \"d8a1a461-9ca2-4056-8458-47a00554a130\") "
Mar 13 16:16:04 crc kubenswrapper[4786]: I0313 16:16:04.945110 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a1a461-9ca2-4056-8458-47a00554a130-kube-api-access-v25d9" (OuterVolumeSpecName: "kube-api-access-v25d9") pod "d8a1a461-9ca2-4056-8458-47a00554a130" (UID: "d8a1a461-9ca2-4056-8458-47a00554a130"). InnerVolumeSpecName "kube-api-access-v25d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:16:05 crc kubenswrapper[4786]: I0313 16:16:05.043196 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v25d9\" (UniqueName: \"kubernetes.io/projected/d8a1a461-9ca2-4056-8458-47a00554a130-kube-api-access-v25d9\") on node \"crc\" DevicePath \"\""
Mar 13 16:16:05 crc kubenswrapper[4786]: I0313 16:16:05.566310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556976-fvp9f" event={"ID":"d8a1a461-9ca2-4056-8458-47a00554a130","Type":"ContainerDied","Data":"712be36901cbecb5c85125c02ca3bcf8d618e41ee0abd67eaf237d0f1fc28106"}
Mar 13 16:16:05 crc kubenswrapper[4786]: I0313 16:16:05.566681 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="712be36901cbecb5c85125c02ca3bcf8d618e41ee0abd67eaf237d0f1fc28106"
Mar 13 16:16:05 crc kubenswrapper[4786]: I0313 16:16:05.566372 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556976-fvp9f"
Mar 13 16:16:05 crc kubenswrapper[4786]: I0313 16:16:05.619964 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556970-r49rz"]
Mar 13 16:16:05 crc kubenswrapper[4786]: I0313 16:16:05.626496 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556970-r49rz"]
Mar 13 16:16:06 crc kubenswrapper[4786]: I0313 16:16:06.562542 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfec5e6-cd20-44ac-abd4-aa7f70352ba9" path="/var/lib/kubelet/pods/8cfec5e6-cd20-44ac-abd4-aa7f70352ba9/volumes"
Mar 13 16:16:07 crc kubenswrapper[4786]: I0313 16:16:07.869276 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:16:07 crc kubenswrapper[4786]: I0313 16:16:07.869371 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:16:12 crc kubenswrapper[4786]: I0313 16:16:12.869264 4786 scope.go:117] "RemoveContainer" containerID="e3ee3ff945ab513ffeeb0e13fb18af312e8019b96686936ccefebe9de49cbde7"
Mar 13 16:16:37 crc kubenswrapper[4786]: I0313 16:16:37.869159 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:16:37 crc kubenswrapper[4786]: I0313 16:16:37.869796 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:16:37 crc kubenswrapper[4786]: I0313 16:16:37.869884 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 16:16:37 crc kubenswrapper[4786]: I0313 16:16:37.870821 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c584aa7af64cdbfd616633a49b3c66f7a8d9bf9e37408ccff7e664544ab8091"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 16:16:37 crc kubenswrapper[4786]: I0313 16:16:37.870962 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://1c584aa7af64cdbfd616633a49b3c66f7a8d9bf9e37408ccff7e664544ab8091" gracePeriod=600
Mar 13 16:16:38 crc kubenswrapper[4786]: I0313 16:16:38.866105 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="1c584aa7af64cdbfd616633a49b3c66f7a8d9bf9e37408ccff7e664544ab8091" exitCode=0
Mar 13 16:16:38 crc kubenswrapper[4786]: I0313 16:16:38.866477 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"1c584aa7af64cdbfd616633a49b3c66f7a8d9bf9e37408ccff7e664544ab8091"}
Mar 13 16:16:38 crc kubenswrapper[4786]: I0313 16:16:38.866508 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"}
Mar 13 16:16:38 crc kubenswrapper[4786]: I0313 16:16:38.866528 4786 scope.go:117] "RemoveContainer" containerID="33d4c1ea350e020c5de1680f08434dd475fa0f3f5c337e27c1df50b7dfbc0d69"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.220192 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kjgrq"]
Mar 13 16:16:45 crc kubenswrapper[4786]: E0313 16:16:45.228244 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a1a461-9ca2-4056-8458-47a00554a130" containerName="oc"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.228295 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a1a461-9ca2-4056-8458-47a00554a130" containerName="oc"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.228618 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a1a461-9ca2-4056-8458-47a00554a130" containerName="oc"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.230024 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.232590 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjgrq"]
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.268919 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-catalog-content\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.269065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnn5w\" (UniqueName: \"kubernetes.io/projected/c7df20b8-03e0-41d5-80f2-af0de19827ae-kube-api-access-tnn5w\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.269260 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-utilities\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.370697 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-catalog-content\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.371045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnn5w\" (UniqueName: \"kubernetes.io/projected/c7df20b8-03e0-41d5-80f2-af0de19827ae-kube-api-access-tnn5w\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.371206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-catalog-content\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.371214 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-utilities\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.371744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-utilities\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.389661 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnn5w\" (UniqueName: \"kubernetes.io/projected/c7df20b8-03e0-41d5-80f2-af0de19827ae-kube-api-access-tnn5w\") pod \"redhat-operators-kjgrq\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:45 crc kubenswrapper[4786]: I0313 16:16:45.554119 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjgrq"
Mar 13 16:16:46 crc kubenswrapper[4786]: I0313 16:16:46.040118 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjgrq"]
Mar 13 16:16:46 crc kubenswrapper[4786]: W0313 16:16:46.049001 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7df20b8_03e0_41d5_80f2_af0de19827ae.slice/crio-67ee752c23ef188e7282700abcc9781b1a3921573a7c95832afa9489f9592927 WatchSource:0}: Error finding container 67ee752c23ef188e7282700abcc9781b1a3921573a7c95832afa9489f9592927: Status 404 returned error can't find the container with id 67ee752c23ef188e7282700abcc9781b1a3921573a7c95832afa9489f9592927
Mar 13 16:16:46 crc kubenswrapper[4786]: I0313 16:16:46.958715 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerID="ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b" exitCode=0
Mar 13 16:16:46 crc kubenswrapper[4786]: I0313 16:16:46.958790 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerDied","Data":"ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b"}
Mar 13 16:16:46 crc kubenswrapper[4786]: I0313 16:16:46.959040 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerStarted","Data":"67ee752c23ef188e7282700abcc9781b1a3921573a7c95832afa9489f9592927"}
Mar 13 16:16:48 crc kubenswrapper[4786]: I0313 16:16:48.981014 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerStarted","Data":"2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866"}
Mar 13 16:16:49 crc kubenswrapper[4786]: I0313 16:16:49.997698 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerID="2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866" exitCode=0
Mar 13 16:16:49 crc kubenswrapper[4786]: I0313 16:16:49.997765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerDied","Data":"2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866"}
Mar 13 16:16:51 crc kubenswrapper[4786]: I0313 16:16:51.009676 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerStarted","Data":"1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767"}
Mar 13 16:16:51 crc kubenswrapper[4786]: I0313 16:16:51.041796 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kjgrq" podStartSLOduration=2.576587444 podStartE2EDuration="6.041775797s" podCreationTimestamp="2026-03-13 16:16:45 +0000 UTC" firstStartedPulling="2026-03-13 16:16:46.960732906 +0000 UTC m=+4437.123944717" lastFinishedPulling="2026-03-13 16:16:50.425921259 +0000 UTC m=+4440.589133070" observedRunningTime="2026-03-13 16:16:51.035811116 +0000 UTC m=+4441.199022967" watchObservedRunningTime="2026-03-13 16:16:51.041775797 +0000 UTC m=+4441.204987648"
Mar 13 16:16:55 crc kubenswrapper[4786]: I0313 16:16:55.555178 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kjgrq" Mar 13 16:16:55 crc kubenswrapper[4786]: I0313 16:16:55.555758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kjgrq" Mar 13 16:16:56 crc kubenswrapper[4786]: I0313 16:16:56.594618 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kjgrq" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="registry-server" probeResult="failure" output=< Mar 13 16:16:56 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:16:56 crc kubenswrapper[4786]: > Mar 13 16:17:05 crc kubenswrapper[4786]: I0313 16:17:05.617422 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kjgrq" Mar 13 16:17:05 crc kubenswrapper[4786]: I0313 16:17:05.684653 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kjgrq" Mar 13 16:17:05 crc kubenswrapper[4786]: I0313 16:17:05.865788 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjgrq"] Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.144246 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kjgrq" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="registry-server" containerID="cri-o://1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767" gracePeriod=2 Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.532623 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjgrq" Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.713696 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-utilities\") pod \"c7df20b8-03e0-41d5-80f2-af0de19827ae\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.713993 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-catalog-content\") pod \"c7df20b8-03e0-41d5-80f2-af0de19827ae\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.714025 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnn5w\" (UniqueName: \"kubernetes.io/projected/c7df20b8-03e0-41d5-80f2-af0de19827ae-kube-api-access-tnn5w\") pod \"c7df20b8-03e0-41d5-80f2-af0de19827ae\" (UID: \"c7df20b8-03e0-41d5-80f2-af0de19827ae\") " Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.715180 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-utilities" (OuterVolumeSpecName: "utilities") pod "c7df20b8-03e0-41d5-80f2-af0de19827ae" (UID: "c7df20b8-03e0-41d5-80f2-af0de19827ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.728303 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7df20b8-03e0-41d5-80f2-af0de19827ae-kube-api-access-tnn5w" (OuterVolumeSpecName: "kube-api-access-tnn5w") pod "c7df20b8-03e0-41d5-80f2-af0de19827ae" (UID: "c7df20b8-03e0-41d5-80f2-af0de19827ae"). InnerVolumeSpecName "kube-api-access-tnn5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.816727 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnn5w\" (UniqueName: \"kubernetes.io/projected/c7df20b8-03e0-41d5-80f2-af0de19827ae-kube-api-access-tnn5w\") on node \"crc\" DevicePath \"\"" Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.817494 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:17:07 crc kubenswrapper[4786]: I0313 16:17:07.935330 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7df20b8-03e0-41d5-80f2-af0de19827ae" (UID: "c7df20b8-03e0-41d5-80f2-af0de19827ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.020784 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7df20b8-03e0-41d5-80f2-af0de19827ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.155190 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerID="1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767" exitCode=0 Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.155237 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerDied","Data":"1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767"} Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.155263 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kjgrq" event={"ID":"c7df20b8-03e0-41d5-80f2-af0de19827ae","Type":"ContainerDied","Data":"67ee752c23ef188e7282700abcc9781b1a3921573a7c95832afa9489f9592927"} Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.155273 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjgrq" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.155281 4786 scope.go:117] "RemoveContainer" containerID="1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.175576 4786 scope.go:117] "RemoveContainer" containerID="2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.200100 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjgrq"] Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.209374 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kjgrq"] Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.212447 4786 scope.go:117] "RemoveContainer" containerID="ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.235768 4786 scope.go:117] "RemoveContainer" containerID="1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767" Mar 13 16:17:08 crc kubenswrapper[4786]: E0313 16:17:08.236869 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767\": container with ID starting with 1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767 not found: ID does not exist" containerID="1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.237000 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767"} err="failed to get container status \"1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767\": rpc error: code = NotFound desc = could not find container \"1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767\": container with ID starting with 1823777e6676049335edfea9c478b3f0f1919b4d4027f4d22de3e2eae9090767 not found: ID does not exist" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.237094 4786 scope.go:117] "RemoveContainer" containerID="2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866" Mar 13 16:17:08 crc kubenswrapper[4786]: E0313 16:17:08.238038 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866\": container with ID starting with 2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866 not found: ID does not exist" containerID="2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.238085 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866"} err="failed to get container status \"2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866\": rpc error: code = NotFound desc = could not find container \"2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866\": container with ID starting with 2fb16404441154f6cbf773d3242b538cf5c7be836b43732cbe6b5a564a27a866 not found: ID does not exist" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.238119 4786 scope.go:117] "RemoveContainer" containerID="ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b" Mar 13 16:17:08 crc kubenswrapper[4786]: E0313 
16:17:08.238419 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b\": container with ID starting with ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b not found: ID does not exist" containerID="ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.238528 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b"} err="failed to get container status \"ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b\": rpc error: code = NotFound desc = could not find container \"ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b\": container with ID starting with ca78270a9a87625b5f448da5c4a14e775e01ba7f194bd43b28a0b872c5251e1b not found: ID does not exist" Mar 13 16:17:08 crc kubenswrapper[4786]: I0313 16:17:08.568692 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" path="/var/lib/kubelet/pods/c7df20b8-03e0-41d5-80f2-af0de19827ae/volumes" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.155127 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556978-d94pf"] Mar 13 16:18:00 crc kubenswrapper[4786]: E0313 16:18:00.156027 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="extract-utilities" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.156046 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="extract-utilities" Mar 13 16:18:00 crc kubenswrapper[4786]: E0313 16:18:00.156071 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="extract-content" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.156080 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="extract-content" Mar 13 16:18:00 crc kubenswrapper[4786]: E0313 16:18:00.156095 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="registry-server" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.156105 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="registry-server" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.156262 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7df20b8-03e0-41d5-80f2-af0de19827ae" containerName="registry-server" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.156869 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.161987 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.162029 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.162154 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.167647 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556978-d94pf"] Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.343315 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblph\" (UniqueName: 
\"kubernetes.io/projected/d7448469-5320-42fb-95c5-78029309e512-kube-api-access-fblph\") pod \"auto-csr-approver-29556978-d94pf\" (UID: \"d7448469-5320-42fb-95c5-78029309e512\") " pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.445512 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fblph\" (UniqueName: \"kubernetes.io/projected/d7448469-5320-42fb-95c5-78029309e512-kube-api-access-fblph\") pod \"auto-csr-approver-29556978-d94pf\" (UID: \"d7448469-5320-42fb-95c5-78029309e512\") " pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.480217 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblph\" (UniqueName: \"kubernetes.io/projected/d7448469-5320-42fb-95c5-78029309e512-kube-api-access-fblph\") pod \"auto-csr-approver-29556978-d94pf\" (UID: \"d7448469-5320-42fb-95c5-78029309e512\") " pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:00 crc kubenswrapper[4786]: I0313 16:18:00.775783 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:01 crc kubenswrapper[4786]: I0313 16:18:01.259979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556978-d94pf"] Mar 13 16:18:01 crc kubenswrapper[4786]: I0313 16:18:01.629898 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556978-d94pf" event={"ID":"d7448469-5320-42fb-95c5-78029309e512","Type":"ContainerStarted","Data":"9efe96a438ceb077c8c3e671c17285eae0b1975cac3bb34c85ca42826609039a"} Mar 13 16:18:02 crc kubenswrapper[4786]: I0313 16:18:02.642175 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556978-d94pf" event={"ID":"d7448469-5320-42fb-95c5-78029309e512","Type":"ContainerStarted","Data":"26d2e81eb5387085f7f62f93098fa8705801382366e9ae62968160576a35e5bf"} Mar 13 16:18:02 crc kubenswrapper[4786]: I0313 16:18:02.665694 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556978-d94pf" podStartSLOduration=1.71103794 podStartE2EDuration="2.665670918s" podCreationTimestamp="2026-03-13 16:18:00 +0000 UTC" firstStartedPulling="2026-03-13 16:18:01.274534518 +0000 UTC m=+4511.437746329" lastFinishedPulling="2026-03-13 16:18:02.229167486 +0000 UTC m=+4512.392379307" observedRunningTime="2026-03-13 16:18:02.65865348 +0000 UTC m=+4512.821865301" watchObservedRunningTime="2026-03-13 16:18:02.665670918 +0000 UTC m=+4512.828882749" Mar 13 16:18:03 crc kubenswrapper[4786]: I0313 16:18:03.653542 4786 generic.go:334] "Generic (PLEG): container finished" podID="d7448469-5320-42fb-95c5-78029309e512" containerID="26d2e81eb5387085f7f62f93098fa8705801382366e9ae62968160576a35e5bf" exitCode=0 Mar 13 16:18:03 crc kubenswrapper[4786]: I0313 16:18:03.653665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556978-d94pf" 
event={"ID":"d7448469-5320-42fb-95c5-78029309e512","Type":"ContainerDied","Data":"26d2e81eb5387085f7f62f93098fa8705801382366e9ae62968160576a35e5bf"} Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.007144 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.111921 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fblph\" (UniqueName: \"kubernetes.io/projected/d7448469-5320-42fb-95c5-78029309e512-kube-api-access-fblph\") pod \"d7448469-5320-42fb-95c5-78029309e512\" (UID: \"d7448469-5320-42fb-95c5-78029309e512\") " Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.116917 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7448469-5320-42fb-95c5-78029309e512-kube-api-access-fblph" (OuterVolumeSpecName: "kube-api-access-fblph") pod "d7448469-5320-42fb-95c5-78029309e512" (UID: "d7448469-5320-42fb-95c5-78029309e512"). InnerVolumeSpecName "kube-api-access-fblph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.213687 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fblph\" (UniqueName: \"kubernetes.io/projected/d7448469-5320-42fb-95c5-78029309e512-kube-api-access-fblph\") on node \"crc\" DevicePath \"\"" Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.673046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556978-d94pf" event={"ID":"d7448469-5320-42fb-95c5-78029309e512","Type":"ContainerDied","Data":"9efe96a438ceb077c8c3e671c17285eae0b1975cac3bb34c85ca42826609039a"} Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.673184 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efe96a438ceb077c8c3e671c17285eae0b1975cac3bb34c85ca42826609039a" Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.673253 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556978-d94pf" Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.752490 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556972-92g9c"] Mar 13 16:18:05 crc kubenswrapper[4786]: I0313 16:18:05.765286 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556972-92g9c"] Mar 13 16:18:06 crc kubenswrapper[4786]: I0313 16:18:06.561658 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5519dba9-403d-424b-b6c9-edcca47f98c4" path="/var/lib/kubelet/pods/5519dba9-403d-424b-b6c9-edcca47f98c4/volumes" Mar 13 16:18:12 crc kubenswrapper[4786]: I0313 16:18:12.969552 4786 scope.go:117] "RemoveContainer" containerID="75ea79ffb36aee6d7704ce8f64168c018a388a256495d868c67a9d929f55f849" Mar 13 16:19:07 crc kubenswrapper[4786]: I0313 16:19:07.868595 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:19:07 crc kubenswrapper[4786]: I0313 16:19:07.869229 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:19:37 crc kubenswrapper[4786]: I0313 16:19:37.868807 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:19:37 crc kubenswrapper[4786]: I0313 16:19:37.869424 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.147368 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556980-z2tc7"] Mar 13 16:20:00 crc kubenswrapper[4786]: E0313 16:20:00.150427 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7448469-5320-42fb-95c5-78029309e512" containerName="oc" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.150621 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7448469-5320-42fb-95c5-78029309e512" containerName="oc" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.151055 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d7448469-5320-42fb-95c5-78029309e512" containerName="oc" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.153141 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.156333 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.157035 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.157749 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.161286 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556980-z2tc7"] Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.165145 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5n5n\" (UniqueName: \"kubernetes.io/projected/27cdcb18-a9c0-47ca-9acf-033a2179028c-kube-api-access-t5n5n\") pod \"auto-csr-approver-29556980-z2tc7\" (UID: \"27cdcb18-a9c0-47ca-9acf-033a2179028c\") " pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.266265 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5n5n\" (UniqueName: \"kubernetes.io/projected/27cdcb18-a9c0-47ca-9acf-033a2179028c-kube-api-access-t5n5n\") pod \"auto-csr-approver-29556980-z2tc7\" (UID: \"27cdcb18-a9c0-47ca-9acf-033a2179028c\") " pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.291172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t5n5n\" (UniqueName: \"kubernetes.io/projected/27cdcb18-a9c0-47ca-9acf-033a2179028c-kube-api-access-t5n5n\") pod \"auto-csr-approver-29556980-z2tc7\" (UID: \"27cdcb18-a9c0-47ca-9acf-033a2179028c\") " pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.490420 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:00 crc kubenswrapper[4786]: I0313 16:20:00.951547 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556980-z2tc7"] Mar 13 16:20:01 crc kubenswrapper[4786]: I0313 16:20:01.576906 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" event={"ID":"27cdcb18-a9c0-47ca-9acf-033a2179028c","Type":"ContainerStarted","Data":"4297af45ae063cf9df442f4ecc1787bf2edb27ced763eb411b877125da16fcb6"} Mar 13 16:20:02 crc kubenswrapper[4786]: I0313 16:20:02.591430 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" event={"ID":"27cdcb18-a9c0-47ca-9acf-033a2179028c","Type":"ContainerStarted","Data":"c4f30524c4d22216359df1afd72ea5947821b395e91ce6b08ae68c95dcf48bb5"} Mar 13 16:20:02 crc kubenswrapper[4786]: I0313 16:20:02.617172 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" podStartSLOduration=1.356607869 podStartE2EDuration="2.617147185s" podCreationTimestamp="2026-03-13 16:20:00 +0000 UTC" firstStartedPulling="2026-03-13 16:20:00.974108573 +0000 UTC m=+4631.137320384" lastFinishedPulling="2026-03-13 16:20:02.234647849 +0000 UTC m=+4632.397859700" observedRunningTime="2026-03-13 16:20:02.605034618 +0000 UTC m=+4632.768246439" watchObservedRunningTime="2026-03-13 16:20:02.617147185 +0000 UTC m=+4632.780359016" Mar 13 16:20:03 crc kubenswrapper[4786]: I0313 16:20:03.603563 4786 
generic.go:334] "Generic (PLEG): container finished" podID="27cdcb18-a9c0-47ca-9acf-033a2179028c" containerID="c4f30524c4d22216359df1afd72ea5947821b395e91ce6b08ae68c95dcf48bb5" exitCode=0 Mar 13 16:20:03 crc kubenswrapper[4786]: I0313 16:20:03.603643 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" event={"ID":"27cdcb18-a9c0-47ca-9acf-033a2179028c","Type":"ContainerDied","Data":"c4f30524c4d22216359df1afd72ea5947821b395e91ce6b08ae68c95dcf48bb5"} Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.210078 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.339227 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5n5n\" (UniqueName: \"kubernetes.io/projected/27cdcb18-a9c0-47ca-9acf-033a2179028c-kube-api-access-t5n5n\") pod \"27cdcb18-a9c0-47ca-9acf-033a2179028c\" (UID: \"27cdcb18-a9c0-47ca-9acf-033a2179028c\") " Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.344465 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cdcb18-a9c0-47ca-9acf-033a2179028c-kube-api-access-t5n5n" (OuterVolumeSpecName: "kube-api-access-t5n5n") pod "27cdcb18-a9c0-47ca-9acf-033a2179028c" (UID: "27cdcb18-a9c0-47ca-9acf-033a2179028c"). InnerVolumeSpecName "kube-api-access-t5n5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.441228 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5n5n\" (UniqueName: \"kubernetes.io/projected/27cdcb18-a9c0-47ca-9acf-033a2179028c-kube-api-access-t5n5n\") on node \"crc\" DevicePath \"\"" Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.621154 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" event={"ID":"27cdcb18-a9c0-47ca-9acf-033a2179028c","Type":"ContainerDied","Data":"4297af45ae063cf9df442f4ecc1787bf2edb27ced763eb411b877125da16fcb6"} Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.621199 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4297af45ae063cf9df442f4ecc1787bf2edb27ced763eb411b877125da16fcb6" Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.621244 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556980-z2tc7" Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.686787 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556974-mzrb9"] Mar 13 16:20:05 crc kubenswrapper[4786]: I0313 16:20:05.691679 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556974-mzrb9"] Mar 13 16:20:06 crc kubenswrapper[4786]: I0313 16:20:06.563275 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a6a60d6-24bc-4c1e-bfb4-efb2d983b422" path="/var/lib/kubelet/pods/3a6a60d6-24bc-4c1e-bfb4-efb2d983b422/volumes" Mar 13 16:20:07 crc kubenswrapper[4786]: I0313 16:20:07.868953 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 16:20:07 crc kubenswrapper[4786]: I0313 16:20:07.869810 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:20:07 crc kubenswrapper[4786]: I0313 16:20:07.870062 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:20:07 crc kubenswrapper[4786]: I0313 16:20:07.871029 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:20:07 crc kubenswrapper[4786]: I0313 16:20:07.871255 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" gracePeriod=600 Mar 13 16:20:08 crc kubenswrapper[4786]: E0313 16:20:08.643064 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:20:08 crc kubenswrapper[4786]: 
I0313 16:20:08.644670 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" exitCode=0 Mar 13 16:20:08 crc kubenswrapper[4786]: I0313 16:20:08.644710 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"} Mar 13 16:20:08 crc kubenswrapper[4786]: I0313 16:20:08.644744 4786 scope.go:117] "RemoveContainer" containerID="1c584aa7af64cdbfd616633a49b3c66f7a8d9bf9e37408ccff7e664544ab8091" Mar 13 16:20:09 crc kubenswrapper[4786]: I0313 16:20:09.656086 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:20:09 crc kubenswrapper[4786]: E0313 16:20:09.656499 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:20:13 crc kubenswrapper[4786]: I0313 16:20:13.102249 4786 scope.go:117] "RemoveContainer" containerID="208b81c33dd69dc0ae0f70ee403742bc844eaaa413b08a20ce34e9f9e3631c66" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.299745 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6b9f"] Mar 13 16:20:20 crc kubenswrapper[4786]: E0313 16:20:20.304283 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cdcb18-a9c0-47ca-9acf-033a2179028c" containerName="oc" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 
16:20:20.304325 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cdcb18-a9c0-47ca-9acf-033a2179028c" containerName="oc" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.304614 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cdcb18-a9c0-47ca-9acf-033a2179028c" containerName="oc" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.321238 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.344291 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6b9f"] Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.380958 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-utilities\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.381036 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkg4d\" (UniqueName: \"kubernetes.io/projected/58362d6a-1557-41f0-b4da-aec380482852-kube-api-access-kkg4d\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.381139 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-catalog-content\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.482121 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-utilities\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.482194 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkg4d\" (UniqueName: \"kubernetes.io/projected/58362d6a-1557-41f0-b4da-aec380482852-kube-api-access-kkg4d\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.482248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-catalog-content\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.482641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-utilities\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.482680 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-catalog-content\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.512703 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kkg4d\" (UniqueName: \"kubernetes.io/projected/58362d6a-1557-41f0-b4da-aec380482852-kube-api-access-kkg4d\") pod \"certified-operators-l6b9f\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:20 crc kubenswrapper[4786]: I0313 16:20:20.697988 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:21 crc kubenswrapper[4786]: I0313 16:20:21.196929 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6b9f"] Mar 13 16:20:21 crc kubenswrapper[4786]: I0313 16:20:21.769386 4786 generic.go:334] "Generic (PLEG): container finished" podID="58362d6a-1557-41f0-b4da-aec380482852" containerID="3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357" exitCode=0 Mar 13 16:20:21 crc kubenswrapper[4786]: I0313 16:20:21.769447 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerDied","Data":"3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357"} Mar 13 16:20:21 crc kubenswrapper[4786]: I0313 16:20:21.770065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerStarted","Data":"b8952afe2f2c62932f55f23db40049921d437f456e29dda645c1bb9475392297"} Mar 13 16:20:22 crc kubenswrapper[4786]: I0313 16:20:22.781822 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerStarted","Data":"3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d"} Mar 13 16:20:23 crc kubenswrapper[4786]: I0313 16:20:23.792825 4786 generic.go:334] "Generic (PLEG): 
container finished" podID="58362d6a-1557-41f0-b4da-aec380482852" containerID="3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d" exitCode=0 Mar 13 16:20:23 crc kubenswrapper[4786]: I0313 16:20:23.792947 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerDied","Data":"3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d"} Mar 13 16:20:24 crc kubenswrapper[4786]: I0313 16:20:24.551607 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:20:24 crc kubenswrapper[4786]: E0313 16:20:24.551962 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:20:24 crc kubenswrapper[4786]: I0313 16:20:24.804150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerStarted","Data":"7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734"} Mar 13 16:20:24 crc kubenswrapper[4786]: I0313 16:20:24.839630 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6b9f" podStartSLOduration=2.170647427 podStartE2EDuration="4.839594848s" podCreationTimestamp="2026-03-13 16:20:20 +0000 UTC" firstStartedPulling="2026-03-13 16:20:21.777769369 +0000 UTC m=+4651.940981220" lastFinishedPulling="2026-03-13 16:20:24.44671679 +0000 UTC m=+4654.609928641" observedRunningTime="2026-03-13 16:20:24.829060572 +0000 
UTC m=+4654.992272403" watchObservedRunningTime="2026-03-13 16:20:24.839594848 +0000 UTC m=+4655.002806669" Mar 13 16:20:30 crc kubenswrapper[4786]: I0313 16:20:30.698113 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:30 crc kubenswrapper[4786]: I0313 16:20:30.698506 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:30 crc kubenswrapper[4786]: I0313 16:20:30.764054 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:30 crc kubenswrapper[4786]: I0313 16:20:30.918935 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:31 crc kubenswrapper[4786]: I0313 16:20:31.011925 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6b9f"] Mar 13 16:20:32 crc kubenswrapper[4786]: I0313 16:20:32.877432 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l6b9f" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="registry-server" containerID="cri-o://7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734" gracePeriod=2 Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.285610 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.479350 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-catalog-content\") pod \"58362d6a-1557-41f0-b4da-aec380482852\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.479478 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-utilities\") pod \"58362d6a-1557-41f0-b4da-aec380482852\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.479550 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkg4d\" (UniqueName: \"kubernetes.io/projected/58362d6a-1557-41f0-b4da-aec380482852-kube-api-access-kkg4d\") pod \"58362d6a-1557-41f0-b4da-aec380482852\" (UID: \"58362d6a-1557-41f0-b4da-aec380482852\") " Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.481300 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-utilities" (OuterVolumeSpecName: "utilities") pod "58362d6a-1557-41f0-b4da-aec380482852" (UID: "58362d6a-1557-41f0-b4da-aec380482852"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.485518 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58362d6a-1557-41f0-b4da-aec380482852-kube-api-access-kkg4d" (OuterVolumeSpecName: "kube-api-access-kkg4d") pod "58362d6a-1557-41f0-b4da-aec380482852" (UID: "58362d6a-1557-41f0-b4da-aec380482852"). InnerVolumeSpecName "kube-api-access-kkg4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.581192 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.581330 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkg4d\" (UniqueName: \"kubernetes.io/projected/58362d6a-1557-41f0-b4da-aec380482852-kube-api-access-kkg4d\") on node \"crc\" DevicePath \"\"" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.899432 4786 generic.go:334] "Generic (PLEG): container finished" podID="58362d6a-1557-41f0-b4da-aec380482852" containerID="7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734" exitCode=0 Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.899513 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerDied","Data":"7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734"} Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.899575 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6b9f" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.899596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6b9f" event={"ID":"58362d6a-1557-41f0-b4da-aec380482852","Type":"ContainerDied","Data":"b8952afe2f2c62932f55f23db40049921d437f456e29dda645c1bb9475392297"} Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.899632 4786 scope.go:117] "RemoveContainer" containerID="7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.939253 4786 scope.go:117] "RemoveContainer" containerID="3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d" Mar 13 16:20:33 crc kubenswrapper[4786]: I0313 16:20:33.972738 4786 scope.go:117] "RemoveContainer" containerID="3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.001448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58362d6a-1557-41f0-b4da-aec380482852" (UID: "58362d6a-1557-41f0-b4da-aec380482852"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.006144 4786 scope.go:117] "RemoveContainer" containerID="7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734" Mar 13 16:20:34 crc kubenswrapper[4786]: E0313 16:20:34.006675 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734\": container with ID starting with 7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734 not found: ID does not exist" containerID="7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.006752 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734"} err="failed to get container status \"7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734\": rpc error: code = NotFound desc = could not find container \"7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734\": container with ID starting with 7385b1146d657a1fe75fb95dd2d809951f489a2ffa7d7c732f68f0ac1db35734 not found: ID does not exist" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.006794 4786 scope.go:117] "RemoveContainer" containerID="3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d" Mar 13 16:20:34 crc kubenswrapper[4786]: E0313 16:20:34.007432 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d\": container with ID starting with 3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d not found: ID does not exist" containerID="3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.007486 
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d"} err="failed to get container status \"3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d\": rpc error: code = NotFound desc = could not find container \"3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d\": container with ID starting with 3cc63976f138a7ad692743f41e750b3a5337720175cf4146ea7404ce94f2e13d not found: ID does not exist" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.007526 4786 scope.go:117] "RemoveContainer" containerID="3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357" Mar 13 16:20:34 crc kubenswrapper[4786]: E0313 16:20:34.007844 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357\": container with ID starting with 3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357 not found: ID does not exist" containerID="3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.007909 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357"} err="failed to get container status \"3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357\": rpc error: code = NotFound desc = could not find container \"3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357\": container with ID starting with 3d242fd40d7724eac93d9299e483a9b70315e8b4f6602cf7dc6b984db291e357 not found: ID does not exist" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.094594 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/58362d6a-1557-41f0-b4da-aec380482852-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.253875 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6b9f"] Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.268233 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6b9f"] Mar 13 16:20:34 crc kubenswrapper[4786]: I0313 16:20:34.567358 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58362d6a-1557-41f0-b4da-aec380482852" path="/var/lib/kubelet/pods/58362d6a-1557-41f0-b4da-aec380482852/volumes" Mar 13 16:20:37 crc kubenswrapper[4786]: I0313 16:20:37.551769 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:20:37 crc kubenswrapper[4786]: E0313 16:20:37.552417 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:20:51 crc kubenswrapper[4786]: I0313 16:20:51.552700 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:20:51 crc kubenswrapper[4786]: E0313 16:20:51.554456 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.084910 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qcdh"] Mar 13 16:20:56 crc kubenswrapper[4786]: E0313 16:20:56.085988 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="extract-content" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.086011 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="extract-content" Mar 13 16:20:56 crc kubenswrapper[4786]: E0313 16:20:56.086039 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="extract-utilities" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.086050 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="extract-utilities" Mar 13 16:20:56 crc kubenswrapper[4786]: E0313 16:20:56.086068 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="registry-server" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.086078 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="registry-server" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.086316 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="58362d6a-1557-41f0-b4da-aec380482852" containerName="registry-server" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.087790 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.109311 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qcdh"] Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.142487 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-utilities\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.142536 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb7jg\" (UniqueName: \"kubernetes.io/projected/22532062-c47d-4fed-8ad8-6157e07c9ee3-kube-api-access-tb7jg\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.142584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-catalog-content\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.243704 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-catalog-content\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.243817 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-utilities\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.243850 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb7jg\" (UniqueName: \"kubernetes.io/projected/22532062-c47d-4fed-8ad8-6157e07c9ee3-kube-api-access-tb7jg\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.244267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-catalog-content\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.244402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-utilities\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.265405 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb7jg\" (UniqueName: \"kubernetes.io/projected/22532062-c47d-4fed-8ad8-6157e07c9ee3-kube-api-access-tb7jg\") pod \"redhat-marketplace-4qcdh\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") " pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.426305 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qcdh" Mar 13 16:20:56 crc kubenswrapper[4786]: I0313 16:20:56.962716 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qcdh"] Mar 13 16:20:56 crc kubenswrapper[4786]: W0313 16:20:56.973045 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22532062_c47d_4fed_8ad8_6157e07c9ee3.slice/crio-9dc3d9b1cfde371dd667d17296b0d8fbca803b08da034eed9575d34432ced84b WatchSource:0}: Error finding container 9dc3d9b1cfde371dd667d17296b0d8fbca803b08da034eed9575d34432ced84b: Status 404 returned error can't find the container with id 9dc3d9b1cfde371dd667d17296b0d8fbca803b08da034eed9575d34432ced84b Mar 13 16:20:57 crc kubenswrapper[4786]: I0313 16:20:57.127034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerStarted","Data":"9dc3d9b1cfde371dd667d17296b0d8fbca803b08da034eed9575d34432ced84b"} Mar 13 16:20:58 crc kubenswrapper[4786]: I0313 16:20:58.139151 4786 generic.go:334] "Generic (PLEG): container finished" podID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerID="6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7" exitCode=0 Mar 13 16:20:58 crc kubenswrapper[4786]: I0313 16:20:58.139238 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerDied","Data":"6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7"} Mar 13 16:20:59 crc kubenswrapper[4786]: I0313 16:20:59.156170 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" 
event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerStarted","Data":"5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460"}
Mar 13 16:21:00 crc kubenswrapper[4786]: I0313 16:21:00.164613 4786 generic.go:334] "Generic (PLEG): container finished" podID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerID="5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460" exitCode=0
Mar 13 16:21:00 crc kubenswrapper[4786]: I0313 16:21:00.164665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerDied","Data":"5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460"}
Mar 13 16:21:01 crc kubenswrapper[4786]: I0313 16:21:01.172834 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerStarted","Data":"7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9"}
Mar 13 16:21:01 crc kubenswrapper[4786]: I0313 16:21:01.199810 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4qcdh" podStartSLOduration=2.575052817 podStartE2EDuration="5.19979085s" podCreationTimestamp="2026-03-13 16:20:56 +0000 UTC" firstStartedPulling="2026-03-13 16:20:58.142533486 +0000 UTC m=+4688.305745297" lastFinishedPulling="2026-03-13 16:21:00.767271519 +0000 UTC m=+4690.930483330" observedRunningTime="2026-03-13 16:21:01.19424567 +0000 UTC m=+4691.357457481" watchObservedRunningTime="2026-03-13 16:21:01.19979085 +0000 UTC m=+4691.363002671"
Mar 13 16:21:03 crc kubenswrapper[4786]: I0313 16:21:03.553628 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:21:03 crc kubenswrapper[4786]: E0313 16:21:03.554399 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:21:06 crc kubenswrapper[4786]: I0313 16:21:06.426481 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qcdh"
Mar 13 16:21:06 crc kubenswrapper[4786]: I0313 16:21:06.426831 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qcdh"
Mar 13 16:21:06 crc kubenswrapper[4786]: I0313 16:21:06.504302 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qcdh"
Mar 13 16:21:07 crc kubenswrapper[4786]: I0313 16:21:07.304405 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qcdh"
Mar 13 16:21:07 crc kubenswrapper[4786]: I0313 16:21:07.372086 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qcdh"]
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.255724 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4qcdh" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="registry-server" containerID="cri-o://7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9" gracePeriod=2
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.680869 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qcdh"
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.775396 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-catalog-content\") pod \"22532062-c47d-4fed-8ad8-6157e07c9ee3\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") "
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.775502 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-utilities\") pod \"22532062-c47d-4fed-8ad8-6157e07c9ee3\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") "
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.775584 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb7jg\" (UniqueName: \"kubernetes.io/projected/22532062-c47d-4fed-8ad8-6157e07c9ee3-kube-api-access-tb7jg\") pod \"22532062-c47d-4fed-8ad8-6157e07c9ee3\" (UID: \"22532062-c47d-4fed-8ad8-6157e07c9ee3\") "
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.777601 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-utilities" (OuterVolumeSpecName: "utilities") pod "22532062-c47d-4fed-8ad8-6157e07c9ee3" (UID: "22532062-c47d-4fed-8ad8-6157e07c9ee3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.784003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22532062-c47d-4fed-8ad8-6157e07c9ee3-kube-api-access-tb7jg" (OuterVolumeSpecName: "kube-api-access-tb7jg") pod "22532062-c47d-4fed-8ad8-6157e07c9ee3" (UID: "22532062-c47d-4fed-8ad8-6157e07c9ee3"). InnerVolumeSpecName "kube-api-access-tb7jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.804738 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22532062-c47d-4fed-8ad8-6157e07c9ee3" (UID: "22532062-c47d-4fed-8ad8-6157e07c9ee3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.876707 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.876762 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb7jg\" (UniqueName: \"kubernetes.io/projected/22532062-c47d-4fed-8ad8-6157e07c9ee3-kube-api-access-tb7jg\") on node \"crc\" DevicePath \"\""
Mar 13 16:21:09 crc kubenswrapper[4786]: I0313 16:21:09.876783 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22532062-c47d-4fed-8ad8-6157e07c9ee3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.265547 4786 generic.go:334] "Generic (PLEG): container finished" podID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerID="7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9" exitCode=0
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.265609 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerDied","Data":"7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9"}
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.265658 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qcdh" event={"ID":"22532062-c47d-4fed-8ad8-6157e07c9ee3","Type":"ContainerDied","Data":"9dc3d9b1cfde371dd667d17296b0d8fbca803b08da034eed9575d34432ced84b"}
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.265686 4786 scope.go:117] "RemoveContainer" containerID="7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.265614 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qcdh"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.314667 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qcdh"]
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.317663 4786 scope.go:117] "RemoveContainer" containerID="5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.324908 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qcdh"]
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.337468 4786 scope.go:117] "RemoveContainer" containerID="6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.369043 4786 scope.go:117] "RemoveContainer" containerID="7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9"
Mar 13 16:21:10 crc kubenswrapper[4786]: E0313 16:21:10.369677 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9\": container with ID starting with 7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9 not found: ID does not exist" containerID="7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.369742 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9"} err="failed to get container status \"7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9\": rpc error: code = NotFound desc = could not find container \"7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9\": container with ID starting with 7e6f4bbd483c1121198d0d877cb3ef52a8bba3900e9c408249c4033cab4c43d9 not found: ID does not exist"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.369788 4786 scope.go:117] "RemoveContainer" containerID="5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460"
Mar 13 16:21:10 crc kubenswrapper[4786]: E0313 16:21:10.370305 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460\": container with ID starting with 5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460 not found: ID does not exist" containerID="5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.370348 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460"} err="failed to get container status \"5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460\": rpc error: code = NotFound desc = could not find container \"5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460\": container with ID starting with 5c2a3354932e943dca198b0cc55d455e391417be103da11cc5608d7bf5373460 not found: ID does not exist"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.370397 4786 scope.go:117] "RemoveContainer" containerID="6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7"
Mar 13 16:21:10 crc kubenswrapper[4786]: E0313 16:21:10.370907 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7\": container with ID starting with 6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7 not found: ID does not exist" containerID="6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.370966 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7"} err="failed to get container status \"6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7\": rpc error: code = NotFound desc = could not find container \"6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7\": container with ID starting with 6daaf694845692cca046fd84f6ede1f60c53fe3a0880ae50eb613b5edc79e9d7 not found: ID does not exist"
Mar 13 16:21:10 crc kubenswrapper[4786]: I0313 16:21:10.566461 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" path="/var/lib/kubelet/pods/22532062-c47d-4fed-8ad8-6157e07c9ee3/volumes"
Mar 13 16:21:15 crc kubenswrapper[4786]: I0313 16:21:15.552474 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:21:15 crc kubenswrapper[4786]: E0313 16:21:15.553201 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:21:29 crc kubenswrapper[4786]: I0313 16:21:29.552361 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:21:29 crc kubenswrapper[4786]: E0313 16:21:29.553570 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:21:43 crc kubenswrapper[4786]: I0313 16:21:43.552401 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:21:43 crc kubenswrapper[4786]: E0313 16:21:43.553216 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:21:58 crc kubenswrapper[4786]: I0313 16:21:58.552561 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:21:58 crc kubenswrapper[4786]: E0313 16:21:58.553676 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.149653 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556982-ds5kh"]
Mar 13 16:22:00 crc kubenswrapper[4786]: E0313 16:22:00.150009 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="registry-server"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.150022 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="registry-server"
Mar 13 16:22:00 crc kubenswrapper[4786]: E0313 16:22:00.150036 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="extract-utilities"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.150042 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="extract-utilities"
Mar 13 16:22:00 crc kubenswrapper[4786]: E0313 16:22:00.150056 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="extract-content"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.150062 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="extract-content"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.150193 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="22532062-c47d-4fed-8ad8-6157e07c9ee3" containerName="registry-server"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.150659 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.152665 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.153732 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.153788 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.163467 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556982-ds5kh"]
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.269848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzlgm\" (UniqueName: \"kubernetes.io/projected/3ce8871c-daa4-4243-95fa-b6fffee541f1-kube-api-access-zzlgm\") pod \"auto-csr-approver-29556982-ds5kh\" (UID: \"3ce8871c-daa4-4243-95fa-b6fffee541f1\") " pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.371457 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzlgm\" (UniqueName: \"kubernetes.io/projected/3ce8871c-daa4-4243-95fa-b6fffee541f1-kube-api-access-zzlgm\") pod \"auto-csr-approver-29556982-ds5kh\" (UID: \"3ce8871c-daa4-4243-95fa-b6fffee541f1\") " pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.395786 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzlgm\" (UniqueName: \"kubernetes.io/projected/3ce8871c-daa4-4243-95fa-b6fffee541f1-kube-api-access-zzlgm\") pod \"auto-csr-approver-29556982-ds5kh\" (UID: \"3ce8871c-daa4-4243-95fa-b6fffee541f1\") " pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.472044 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.963751 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556982-ds5kh"]
Mar 13 16:22:00 crc kubenswrapper[4786]: I0313 16:22:00.975676 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 16:22:01 crc kubenswrapper[4786]: I0313 16:22:01.739976 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556982-ds5kh" event={"ID":"3ce8871c-daa4-4243-95fa-b6fffee541f1","Type":"ContainerStarted","Data":"b36a47a13f7441300fe72525541cbda408092c6cc1fbb37f84b91879d4aab9d5"}
Mar 13 16:22:02 crc kubenswrapper[4786]: I0313 16:22:02.752127 4786 generic.go:334] "Generic (PLEG): container finished" podID="3ce8871c-daa4-4243-95fa-b6fffee541f1" containerID="b9649dbc4c54fc54b5fd7fcaafbc6ad4e91320bb7f85c8bc0102a3ddc2e6497f" exitCode=0
Mar 13 16:22:02 crc kubenswrapper[4786]: I0313 16:22:02.752190 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556982-ds5kh" event={"ID":"3ce8871c-daa4-4243-95fa-b6fffee541f1","Type":"ContainerDied","Data":"b9649dbc4c54fc54b5fd7fcaafbc6ad4e91320bb7f85c8bc0102a3ddc2e6497f"}
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.120642 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.230545 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzlgm\" (UniqueName: \"kubernetes.io/projected/3ce8871c-daa4-4243-95fa-b6fffee541f1-kube-api-access-zzlgm\") pod \"3ce8871c-daa4-4243-95fa-b6fffee541f1\" (UID: \"3ce8871c-daa4-4243-95fa-b6fffee541f1\") "
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.248556 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce8871c-daa4-4243-95fa-b6fffee541f1-kube-api-access-zzlgm" (OuterVolumeSpecName: "kube-api-access-zzlgm") pod "3ce8871c-daa4-4243-95fa-b6fffee541f1" (UID: "3ce8871c-daa4-4243-95fa-b6fffee541f1"). InnerVolumeSpecName "kube-api-access-zzlgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.332280 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzlgm\" (UniqueName: \"kubernetes.io/projected/3ce8871c-daa4-4243-95fa-b6fffee541f1-kube-api-access-zzlgm\") on node \"crc\" DevicePath \"\""
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.774249 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556982-ds5kh" event={"ID":"3ce8871c-daa4-4243-95fa-b6fffee541f1","Type":"ContainerDied","Data":"b36a47a13f7441300fe72525541cbda408092c6cc1fbb37f84b91879d4aab9d5"}
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.774305 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36a47a13f7441300fe72525541cbda408092c6cc1fbb37f84b91879d4aab9d5"
Mar 13 16:22:04 crc kubenswrapper[4786]: I0313 16:22:04.774379 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556982-ds5kh"
Mar 13 16:22:05 crc kubenswrapper[4786]: I0313 16:22:05.212042 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556976-fvp9f"]
Mar 13 16:22:05 crc kubenswrapper[4786]: I0313 16:22:05.221632 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556976-fvp9f"]
Mar 13 16:22:06 crc kubenswrapper[4786]: I0313 16:22:06.570722 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a1a461-9ca2-4056-8458-47a00554a130" path="/var/lib/kubelet/pods/d8a1a461-9ca2-4056-8458-47a00554a130/volumes"
Mar 13 16:22:13 crc kubenswrapper[4786]: I0313 16:22:13.230432 4786 scope.go:117] "RemoveContainer" containerID="7c281f93ce959efc5c6809b73c7c9e3a6bde7b53e6a222112d2aa508428ebec3"
Mar 13 16:22:13 crc kubenswrapper[4786]: I0313 16:22:13.552450 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:22:13 crc kubenswrapper[4786]: E0313 16:22:13.553241 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:22:28 crc kubenswrapper[4786]: I0313 16:22:28.552785 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:22:28 crc kubenswrapper[4786]: E0313 16:22:28.554843 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:22:40 crc kubenswrapper[4786]: I0313 16:22:40.563634 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:22:40 crc kubenswrapper[4786]: E0313 16:22:40.565842 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:22:51 crc kubenswrapper[4786]: I0313 16:22:51.552461 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:22:51 crc kubenswrapper[4786]: E0313 16:22:51.553433 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:23:04 crc kubenswrapper[4786]: I0313 16:23:04.553491 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:23:04 crc kubenswrapper[4786]: E0313 16:23:04.554682 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:23:16 crc kubenswrapper[4786]: I0313 16:23:16.551999 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:23:16 crc kubenswrapper[4786]: E0313 16:23:16.552934 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:23:27 crc kubenswrapper[4786]: I0313 16:23:27.552588 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:23:27 crc kubenswrapper[4786]: E0313 16:23:27.553640 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:23:40 crc kubenswrapper[4786]: I0313 16:23:40.562468 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:23:40 crc kubenswrapper[4786]: E0313 16:23:40.565401 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.888413 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8xbv7"]
Mar 13 16:23:41 crc kubenswrapper[4786]: E0313 16:23:41.888831 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce8871c-daa4-4243-95fa-b6fffee541f1" containerName="oc"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.888851 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce8871c-daa4-4243-95fa-b6fffee541f1" containerName="oc"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.889099 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce8871c-daa4-4243-95fa-b6fffee541f1" containerName="oc"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.893774 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.914083 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xbv7"]
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.927813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-catalog-content\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.927901 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbcg4\" (UniqueName: \"kubernetes.io/projected/08c80932-5699-4922-9459-52ed0f4665cb-kube-api-access-xbcg4\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:41 crc kubenswrapper[4786]: I0313 16:23:41.927940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-utilities\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.028545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-catalog-content\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.028615 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbcg4\" (UniqueName: \"kubernetes.io/projected/08c80932-5699-4922-9459-52ed0f4665cb-kube-api-access-xbcg4\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.028652 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-utilities\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.029121 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-utilities\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.029359 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-catalog-content\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.050998 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbcg4\" (UniqueName: \"kubernetes.io/projected/08c80932-5699-4922-9459-52ed0f4665cb-kube-api-access-xbcg4\") pod \"community-operators-8xbv7\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") " pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.218504 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:42 crc kubenswrapper[4786]: I0313 16:23:42.694419 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xbv7"]
Mar 13 16:23:43 crc kubenswrapper[4786]: I0313 16:23:43.577324 4786 generic.go:334] "Generic (PLEG): container finished" podID="08c80932-5699-4922-9459-52ed0f4665cb" containerID="37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e" exitCode=0
Mar 13 16:23:43 crc kubenswrapper[4786]: I0313 16:23:43.577369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerDied","Data":"37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e"}
Mar 13 16:23:43 crc kubenswrapper[4786]: I0313 16:23:43.577400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerStarted","Data":"31b8fe4a7525aaab16bb22b625af3adbc03a729c40f3c4ca25aa6f09df7ee189"}
Mar 13 16:23:44 crc kubenswrapper[4786]: I0313 16:23:44.583901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerStarted","Data":"3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18"}
Mar 13 16:23:45 crc kubenswrapper[4786]: I0313 16:23:45.595398 4786 generic.go:334] "Generic (PLEG): container finished" podID="08c80932-5699-4922-9459-52ed0f4665cb" containerID="3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18" exitCode=0
Mar 13 16:23:45 crc kubenswrapper[4786]: I0313 16:23:45.595522 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerDied","Data":"3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18"}
Mar 13 16:23:46 crc kubenswrapper[4786]: I0313 16:23:46.606358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerStarted","Data":"7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0"}
Mar 13 16:23:46 crc kubenswrapper[4786]: I0313 16:23:46.627075 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8xbv7" podStartSLOduration=3.149909636 podStartE2EDuration="5.627048136s" podCreationTimestamp="2026-03-13 16:23:41 +0000 UTC" firstStartedPulling="2026-03-13 16:23:43.579815195 +0000 UTC m=+4853.743027016" lastFinishedPulling="2026-03-13 16:23:46.056953665 +0000 UTC m=+4856.220165516" observedRunningTime="2026-03-13 16:23:46.623824194 +0000 UTC m=+4856.787036005" watchObservedRunningTime="2026-03-13 16:23:46.627048136 +0000 UTC m=+4856.790259977"
Mar 13 16:23:52 crc kubenswrapper[4786]: I0313 16:23:52.219719 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:52 crc kubenswrapper[4786]: I0313 16:23:52.220538 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:52 crc kubenswrapper[4786]: I0313 16:23:52.277524 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:52 crc kubenswrapper[4786]: I0313 16:23:52.725211 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:52 crc kubenswrapper[4786]: I0313 16:23:52.797562 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xbv7"]
Mar 13 16:23:54 crc kubenswrapper[4786]: I0313 16:23:54.677539 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8xbv7" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="registry-server" containerID="cri-o://7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0" gracePeriod=2
Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.287612 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbv7"
Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.432802 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-utilities\") pod \"08c80932-5699-4922-9459-52ed0f4665cb\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") "
Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.432997 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbcg4\" (UniqueName: \"kubernetes.io/projected/08c80932-5699-4922-9459-52ed0f4665cb-kube-api-access-xbcg4\") pod \"08c80932-5699-4922-9459-52ed0f4665cb\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") "
Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.433118 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-catalog-content\") pod \"08c80932-5699-4922-9459-52ed0f4665cb\" (UID: \"08c80932-5699-4922-9459-52ed0f4665cb\") "
Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.435137 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-utilities" (OuterVolumeSpecName: "utilities") pod "08c80932-5699-4922-9459-52ed0f4665cb" (UID:
"08c80932-5699-4922-9459-52ed0f4665cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.441414 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c80932-5699-4922-9459-52ed0f4665cb-kube-api-access-xbcg4" (OuterVolumeSpecName: "kube-api-access-xbcg4") pod "08c80932-5699-4922-9459-52ed0f4665cb" (UID: "08c80932-5699-4922-9459-52ed0f4665cb"). InnerVolumeSpecName "kube-api-access-xbcg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.535146 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.535228 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbcg4\" (UniqueName: \"kubernetes.io/projected/08c80932-5699-4922-9459-52ed0f4665cb-kube-api-access-xbcg4\") on node \"crc\" DevicePath \"\"" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.553313 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:23:55 crc kubenswrapper[4786]: E0313 16:23:55.553492 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.687471 4786 generic.go:334] "Generic (PLEG): container finished" podID="08c80932-5699-4922-9459-52ed0f4665cb" 
containerID="7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0" exitCode=0 Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.687531 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerDied","Data":"7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0"} Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.687551 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xbv7" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.687571 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xbv7" event={"ID":"08c80932-5699-4922-9459-52ed0f4665cb","Type":"ContainerDied","Data":"31b8fe4a7525aaab16bb22b625af3adbc03a729c40f3c4ca25aa6f09df7ee189"} Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.687585 4786 scope.go:117] "RemoveContainer" containerID="7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.715279 4786 scope.go:117] "RemoveContainer" containerID="3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.740141 4786 scope.go:117] "RemoveContainer" containerID="37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.746727 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08c80932-5699-4922-9459-52ed0f4665cb" (UID: "08c80932-5699-4922-9459-52ed0f4665cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.787932 4786 scope.go:117] "RemoveContainer" containerID="7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0" Mar 13 16:23:55 crc kubenswrapper[4786]: E0313 16:23:55.788512 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0\": container with ID starting with 7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0 not found: ID does not exist" containerID="7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.788713 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0"} err="failed to get container status \"7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0\": rpc error: code = NotFound desc = could not find container \"7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0\": container with ID starting with 7cb6de23a6b400443a33b83d32c3f930a2d2249913515ed6554e4c9f06bbb3b0 not found: ID does not exist" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.788838 4786 scope.go:117] "RemoveContainer" containerID="3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18" Mar 13 16:23:55 crc kubenswrapper[4786]: E0313 16:23:55.789487 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18\": container with ID starting with 3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18 not found: ID does not exist" containerID="3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.789526 
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18"} err="failed to get container status \"3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18\": rpc error: code = NotFound desc = could not find container \"3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18\": container with ID starting with 3e28a1512e12dd17632f2691717fd9c315865204b243ae3f41a02f1b0ed17a18 not found: ID does not exist" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.789552 4786 scope.go:117] "RemoveContainer" containerID="37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e" Mar 13 16:23:55 crc kubenswrapper[4786]: E0313 16:23:55.789975 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e\": container with ID starting with 37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e not found: ID does not exist" containerID="37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.790029 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e"} err="failed to get container status \"37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e\": rpc error: code = NotFound desc = could not find container \"37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e\": container with ID starting with 37e47f8efcaa53bf789fd525c120793282919e9dd23c5445aff3c62d2958a08e not found: ID does not exist" Mar 13 16:23:55 crc kubenswrapper[4786]: I0313 16:23:55.840366 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/08c80932-5699-4922-9459-52ed0f4665cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:23:56 crc kubenswrapper[4786]: I0313 16:23:56.062299 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xbv7"] Mar 13 16:23:56 crc kubenswrapper[4786]: I0313 16:23:56.078108 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8xbv7"] Mar 13 16:23:56 crc kubenswrapper[4786]: I0313 16:23:56.566000 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c80932-5699-4922-9459-52ed0f4665cb" path="/var/lib/kubelet/pods/08c80932-5699-4922-9459-52ed0f4665cb/volumes" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.150938 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556984-jvhl9"] Mar 13 16:24:00 crc kubenswrapper[4786]: E0313 16:24:00.151354 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="extract-utilities" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.151370 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="extract-utilities" Mar 13 16:24:00 crc kubenswrapper[4786]: E0313 16:24:00.151385 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="registry-server" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.151390 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="registry-server" Mar 13 16:24:00 crc kubenswrapper[4786]: E0313 16:24:00.151411 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="extract-content" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.151417 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="extract-content" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.151556 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c80932-5699-4922-9459-52ed0f4665cb" containerName="registry-server" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.152006 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.156335 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.156526 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.157109 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556984-jvhl9"] Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.157414 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.305501 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8nr\" (UniqueName: \"kubernetes.io/projected/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6-kube-api-access-5f8nr\") pod \"auto-csr-approver-29556984-jvhl9\" (UID: \"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6\") " pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.407005 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8nr\" (UniqueName: \"kubernetes.io/projected/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6-kube-api-access-5f8nr\") pod \"auto-csr-approver-29556984-jvhl9\" (UID: \"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6\") " 
pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.426663 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8nr\" (UniqueName: \"kubernetes.io/projected/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6-kube-api-access-5f8nr\") pod \"auto-csr-approver-29556984-jvhl9\" (UID: \"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6\") " pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.465833 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:00 crc kubenswrapper[4786]: I0313 16:24:00.879104 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556984-jvhl9"] Mar 13 16:24:01 crc kubenswrapper[4786]: I0313 16:24:01.737256 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" event={"ID":"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6","Type":"ContainerStarted","Data":"782ea97eb1fc2c732a395a6393c66a246ea690d46f7e4fc8d4565507802b7bd3"} Mar 13 16:24:02 crc kubenswrapper[4786]: I0313 16:24:02.745243 4786 generic.go:334] "Generic (PLEG): container finished" podID="41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6" containerID="aa7dc5b89ece73d30d472b54962324b243e0a7117127d7a9478dee3377c6f350" exitCode=0 Mar 13 16:24:02 crc kubenswrapper[4786]: I0313 16:24:02.745325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" event={"ID":"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6","Type":"ContainerDied","Data":"aa7dc5b89ece73d30d472b54962324b243e0a7117127d7a9478dee3377c6f350"} Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.552494 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.566995 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f8nr\" (UniqueName: \"kubernetes.io/projected/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6-kube-api-access-5f8nr\") pod \"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6\" (UID: \"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6\") " Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.580547 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6-kube-api-access-5f8nr" (OuterVolumeSpecName: "kube-api-access-5f8nr") pod "41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6" (UID: "41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6"). InnerVolumeSpecName "kube-api-access-5f8nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.668776 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f8nr\" (UniqueName: \"kubernetes.io/projected/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6-kube-api-access-5f8nr\") on node \"crc\" DevicePath \"\"" Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.762105 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" event={"ID":"41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6","Type":"ContainerDied","Data":"782ea97eb1fc2c732a395a6393c66a246ea690d46f7e4fc8d4565507802b7bd3"} Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.762146 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="782ea97eb1fc2c732a395a6393c66a246ea690d46f7e4fc8d4565507802b7bd3" Mar 13 16:24:04 crc kubenswrapper[4786]: I0313 16:24:04.762176 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556984-jvhl9" Mar 13 16:24:05 crc kubenswrapper[4786]: I0313 16:24:05.631311 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556978-d94pf"] Mar 13 16:24:05 crc kubenswrapper[4786]: I0313 16:24:05.637806 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556978-d94pf"] Mar 13 16:24:06 crc kubenswrapper[4786]: I0313 16:24:06.567306 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7448469-5320-42fb-95c5-78029309e512" path="/var/lib/kubelet/pods/d7448469-5320-42fb-95c5-78029309e512/volumes" Mar 13 16:24:10 crc kubenswrapper[4786]: I0313 16:24:10.556263 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:24:10 crc kubenswrapper[4786]: E0313 16:24:10.556954 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:24:13 crc kubenswrapper[4786]: I0313 16:24:13.334923 4786 scope.go:117] "RemoveContainer" containerID="26d2e81eb5387085f7f62f93098fa8705801382366e9ae62968160576a35e5bf" Mar 13 16:24:24 crc kubenswrapper[4786]: I0313 16:24:24.553125 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:24:24 crc kubenswrapper[4786]: E0313 16:24:24.554393 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:24:36 crc kubenswrapper[4786]: I0313 16:24:36.552445 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:24:36 crc kubenswrapper[4786]: E0313 16:24:36.554420 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:24:51 crc kubenswrapper[4786]: I0313 16:24:51.552791 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:24:51 crc kubenswrapper[4786]: E0313 16:24:51.554026 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:25:02 crc kubenswrapper[4786]: I0313 16:25:02.552160 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:25:02 crc kubenswrapper[4786]: E0313 16:25:02.553823 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:25:13 crc kubenswrapper[4786]: I0313 16:25:13.552535 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739" Mar 13 16:25:14 crc kubenswrapper[4786]: I0313 16:25:14.346339 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"dc7c313ae26eaf0288c8be9988aeae6dd1a6013e849e3a00ef45160c50254dc2"} Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.166614 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556986-m4gx6"] Mar 13 16:26:00 crc kubenswrapper[4786]: E0313 16:26:00.167749 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6" containerName="oc" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.167772 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6" containerName="oc" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.168093 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6" containerName="oc" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.168994 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.172185 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.172647 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.175231 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.180629 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556986-m4gx6"] Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.337039 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5pm\" (UniqueName: \"kubernetes.io/projected/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4-kube-api-access-mh5pm\") pod \"auto-csr-approver-29556986-m4gx6\" (UID: \"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4\") " pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.438981 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh5pm\" (UniqueName: \"kubernetes.io/projected/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4-kube-api-access-mh5pm\") pod \"auto-csr-approver-29556986-m4gx6\" (UID: \"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4\") " pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.474427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5pm\" (UniqueName: \"kubernetes.io/projected/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4-kube-api-access-mh5pm\") pod \"auto-csr-approver-29556986-m4gx6\" (UID: \"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4\") " 
pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.495812 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:00 crc kubenswrapper[4786]: I0313 16:26:00.798681 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556986-m4gx6"] Mar 13 16:26:00 crc kubenswrapper[4786]: W0313 16:26:00.804035 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b49e165_c3c0_4ddc_ae7b_5f731910e1d4.slice/crio-5f559fe2d49a84ccfddb3bea2593ef0b1264b65e26ccd19430b7eae43d1d0710 WatchSource:0}: Error finding container 5f559fe2d49a84ccfddb3bea2593ef0b1264b65e26ccd19430b7eae43d1d0710: Status 404 returned error can't find the container with id 5f559fe2d49a84ccfddb3bea2593ef0b1264b65e26ccd19430b7eae43d1d0710 Mar 13 16:26:01 crc kubenswrapper[4786]: I0313 16:26:01.766093 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" event={"ID":"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4","Type":"ContainerStarted","Data":"5f559fe2d49a84ccfddb3bea2593ef0b1264b65e26ccd19430b7eae43d1d0710"} Mar 13 16:26:02 crc kubenswrapper[4786]: I0313 16:26:02.779119 4786 generic.go:334] "Generic (PLEG): container finished" podID="8b49e165-c3c0-4ddc-ae7b-5f731910e1d4" containerID="8d0009dc6988b2cd872114ab818d61293fa2515b0ff464035a7a38837f3f5cda" exitCode=0 Mar 13 16:26:02 crc kubenswrapper[4786]: I0313 16:26:02.779178 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" event={"ID":"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4","Type":"ContainerDied","Data":"8d0009dc6988b2cd872114ab818d61293fa2515b0ff464035a7a38837f3f5cda"} Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.165785 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.305960 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh5pm\" (UniqueName: \"kubernetes.io/projected/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4-kube-api-access-mh5pm\") pod \"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4\" (UID: \"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4\") " Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.312470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4-kube-api-access-mh5pm" (OuterVolumeSpecName: "kube-api-access-mh5pm") pod "8b49e165-c3c0-4ddc-ae7b-5f731910e1d4" (UID: "8b49e165-c3c0-4ddc-ae7b-5f731910e1d4"). InnerVolumeSpecName "kube-api-access-mh5pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.407784 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh5pm\" (UniqueName: \"kubernetes.io/projected/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4-kube-api-access-mh5pm\") on node \"crc\" DevicePath \"\"" Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.811714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" event={"ID":"8b49e165-c3c0-4ddc-ae7b-5f731910e1d4","Type":"ContainerDied","Data":"5f559fe2d49a84ccfddb3bea2593ef0b1264b65e26ccd19430b7eae43d1d0710"} Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.811748 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f559fe2d49a84ccfddb3bea2593ef0b1264b65e26ccd19430b7eae43d1d0710" Mar 13 16:26:04 crc kubenswrapper[4786]: I0313 16:26:04.811889 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556986-m4gx6" Mar 13 16:26:05 crc kubenswrapper[4786]: I0313 16:26:05.247263 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556980-z2tc7"] Mar 13 16:26:05 crc kubenswrapper[4786]: I0313 16:26:05.253940 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556980-z2tc7"] Mar 13 16:26:06 crc kubenswrapper[4786]: I0313 16:26:06.568774 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cdcb18-a9c0-47ca-9acf-033a2179028c" path="/var/lib/kubelet/pods/27cdcb18-a9c0-47ca-9acf-033a2179028c/volumes" Mar 13 16:26:13 crc kubenswrapper[4786]: I0313 16:26:13.461883 4786 scope.go:117] "RemoveContainer" containerID="c4f30524c4d22216359df1afd72ea5947821b395e91ce6b08ae68c95dcf48bb5" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.408623 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rbfhc"] Mar 13 16:27:15 crc kubenswrapper[4786]: E0313 16:27:15.410124 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b49e165-c3c0-4ddc-ae7b-5f731910e1d4" containerName="oc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.410155 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b49e165-c3c0-4ddc-ae7b-5f731910e1d4" containerName="oc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.410541 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b49e165-c3c0-4ddc-ae7b-5f731910e1d4" containerName="oc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.412230 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.431546 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rbfhc"] Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.476371 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9767\" (UniqueName: \"kubernetes.io/projected/2a86ad9a-be23-43aa-90ae-dda1003af63f-kube-api-access-c9767\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.476446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-catalog-content\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.476520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-utilities\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.577645 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9767\" (UniqueName: \"kubernetes.io/projected/2a86ad9a-be23-43aa-90ae-dda1003af63f-kube-api-access-c9767\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.577704 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-catalog-content\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.577757 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-utilities\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.578256 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-catalog-content\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.578462 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-utilities\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.596526 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9767\" (UniqueName: \"kubernetes.io/projected/2a86ad9a-be23-43aa-90ae-dda1003af63f-kube-api-access-c9767\") pod \"redhat-operators-rbfhc\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:15 crc kubenswrapper[4786]: I0313 16:27:15.741215 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:16 crc kubenswrapper[4786]: I0313 16:27:16.182185 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rbfhc"] Mar 13 16:27:16 crc kubenswrapper[4786]: I0313 16:27:16.950464 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerID="32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7" exitCode=0 Mar 13 16:27:16 crc kubenswrapper[4786]: I0313 16:27:16.950850 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerDied","Data":"32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7"} Mar 13 16:27:16 crc kubenswrapper[4786]: I0313 16:27:16.950972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerStarted","Data":"28ac9ecbf6e73c1520e05e36b4f20c7471e30d1f99076fa7a6e4f25a1dfe02eb"} Mar 13 16:27:16 crc kubenswrapper[4786]: I0313 16:27:16.952840 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 16:27:17 crc kubenswrapper[4786]: I0313 16:27:17.962070 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerStarted","Data":"efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25"} Mar 13 16:27:18 crc kubenswrapper[4786]: I0313 16:27:18.971806 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerID="efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25" exitCode=0 Mar 13 16:27:18 crc kubenswrapper[4786]: I0313 16:27:18.971931 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerDied","Data":"efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25"} Mar 13 16:27:19 crc kubenswrapper[4786]: I0313 16:27:19.983249 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerStarted","Data":"46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9"} Mar 13 16:27:20 crc kubenswrapper[4786]: I0313 16:27:20.011154 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rbfhc" podStartSLOduration=2.392447985 podStartE2EDuration="5.011134906s" podCreationTimestamp="2026-03-13 16:27:15 +0000 UTC" firstStartedPulling="2026-03-13 16:27:16.952612615 +0000 UTC m=+5067.115824426" lastFinishedPulling="2026-03-13 16:27:19.571299536 +0000 UTC m=+5069.734511347" observedRunningTime="2026-03-13 16:27:20.007599899 +0000 UTC m=+5070.170811720" watchObservedRunningTime="2026-03-13 16:27:20.011134906 +0000 UTC m=+5070.174346717" Mar 13 16:27:25 crc kubenswrapper[4786]: I0313 16:27:25.741568 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:25 crc kubenswrapper[4786]: I0313 16:27:25.742373 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:26 crc kubenswrapper[4786]: I0313 16:27:26.808627 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rbfhc" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="registry-server" probeResult="failure" output=< Mar 13 16:27:26 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:27:26 crc kubenswrapper[4786]: > Mar 13 16:27:35 crc kubenswrapper[4786]: I0313 
16:27:35.815271 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:35 crc kubenswrapper[4786]: I0313 16:27:35.892986 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:36 crc kubenswrapper[4786]: I0313 16:27:36.053240 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rbfhc"] Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.124911 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rbfhc" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="registry-server" containerID="cri-o://46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9" gracePeriod=2 Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.547349 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.623031 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-catalog-content\") pod \"2a86ad9a-be23-43aa-90ae-dda1003af63f\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.623162 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-utilities\") pod \"2a86ad9a-be23-43aa-90ae-dda1003af63f\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.623267 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9767\" (UniqueName: 
\"kubernetes.io/projected/2a86ad9a-be23-43aa-90ae-dda1003af63f-kube-api-access-c9767\") pod \"2a86ad9a-be23-43aa-90ae-dda1003af63f\" (UID: \"2a86ad9a-be23-43aa-90ae-dda1003af63f\") " Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.643019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-utilities" (OuterVolumeSpecName: "utilities") pod "2a86ad9a-be23-43aa-90ae-dda1003af63f" (UID: "2a86ad9a-be23-43aa-90ae-dda1003af63f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.665065 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a86ad9a-be23-43aa-90ae-dda1003af63f-kube-api-access-c9767" (OuterVolumeSpecName: "kube-api-access-c9767") pod "2a86ad9a-be23-43aa-90ae-dda1003af63f" (UID: "2a86ad9a-be23-43aa-90ae-dda1003af63f"). InnerVolumeSpecName "kube-api-access-c9767". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.725744 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9767\" (UniqueName: \"kubernetes.io/projected/2a86ad9a-be23-43aa-90ae-dda1003af63f-kube-api-access-c9767\") on node \"crc\" DevicePath \"\"" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.725782 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.811269 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a86ad9a-be23-43aa-90ae-dda1003af63f" (UID: "2a86ad9a-be23-43aa-90ae-dda1003af63f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.827402 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a86ad9a-be23-43aa-90ae-dda1003af63f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.868626 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:27:37 crc kubenswrapper[4786]: I0313 16:27:37.868680 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.136284 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerID="46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9" exitCode=0 Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.136325 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rbfhc" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.136349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerDied","Data":"46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9"} Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.136389 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rbfhc" event={"ID":"2a86ad9a-be23-43aa-90ae-dda1003af63f","Type":"ContainerDied","Data":"28ac9ecbf6e73c1520e05e36b4f20c7471e30d1f99076fa7a6e4f25a1dfe02eb"} Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.136417 4786 scope.go:117] "RemoveContainer" containerID="46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.159842 4786 scope.go:117] "RemoveContainer" containerID="efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.180094 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rbfhc"] Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.187942 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rbfhc"] Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.198052 4786 scope.go:117] "RemoveContainer" containerID="32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.223585 4786 scope.go:117] "RemoveContainer" containerID="46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9" Mar 13 16:27:38 crc kubenswrapper[4786]: E0313 16:27:38.224241 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9\": container with ID starting with 46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9 not found: ID does not exist" containerID="46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.224286 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9"} err="failed to get container status \"46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9\": rpc error: code = NotFound desc = could not find container \"46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9\": container with ID starting with 46c5e0bf9df5511baf78635354de319297b52eca2bf0743314f166abfd361ee9 not found: ID does not exist" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.224315 4786 scope.go:117] "RemoveContainer" containerID="efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25" Mar 13 16:27:38 crc kubenswrapper[4786]: E0313 16:27:38.224713 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25\": container with ID starting with efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25 not found: ID does not exist" containerID="efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.224779 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25"} err="failed to get container status \"efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25\": rpc error: code = NotFound desc = could not find container \"efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25\": container with ID 
starting with efb02a24070411edcf98402dc8ece6a47cfcbb2b8bf9e3534190c9b669405f25 not found: ID does not exist" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.224819 4786 scope.go:117] "RemoveContainer" containerID="32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7" Mar 13 16:27:38 crc kubenswrapper[4786]: E0313 16:27:38.225295 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7\": container with ID starting with 32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7 not found: ID does not exist" containerID="32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.225339 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7"} err="failed to get container status \"32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7\": rpc error: code = NotFound desc = could not find container \"32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7\": container with ID starting with 32c5ac5f3df7ab04a0bdaecffd54bef865c8a138ccd53be3457795c1b31343f7 not found: ID does not exist" Mar 13 16:27:38 crc kubenswrapper[4786]: I0313 16:27:38.561636 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" path="/var/lib/kubelet/pods/2a86ad9a-be23-43aa-90ae-dda1003af63f/volumes" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.032741 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jbw4r"] Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.041378 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jbw4r"] Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.136544 4786 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-zwhbc"] Mar 13 16:27:45 crc kubenswrapper[4786]: E0313 16:27:45.136998 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="extract-utilities" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.137022 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="extract-utilities" Mar 13 16:27:45 crc kubenswrapper[4786]: E0313 16:27:45.137049 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="registry-server" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.137057 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="registry-server" Mar 13 16:27:45 crc kubenswrapper[4786]: E0313 16:27:45.137085 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="extract-content" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.137092 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="extract-content" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.137250 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a86ad9a-be23-43aa-90ae-dda1003af63f" containerName="registry-server" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.137900 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.140110 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.141496 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.142729 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.144902 4786 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-kbgsj" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.155807 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zwhbc"] Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.239210 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjlmx\" (UniqueName: \"kubernetes.io/projected/0fa927f4-a658-4daa-acf1-6c00400d8ee1-kube-api-access-kjlmx\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.239286 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0fa927f4-a658-4daa-acf1-6c00400d8ee1-node-mnt\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.239344 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0fa927f4-a658-4daa-acf1-6c00400d8ee1-crc-storage\") pod \"crc-storage-crc-zwhbc\" (UID: 
\"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.340423 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjlmx\" (UniqueName: \"kubernetes.io/projected/0fa927f4-a658-4daa-acf1-6c00400d8ee1-kube-api-access-kjlmx\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.340485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0fa927f4-a658-4daa-acf1-6c00400d8ee1-node-mnt\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.340533 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0fa927f4-a658-4daa-acf1-6c00400d8ee1-crc-storage\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.341348 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0fa927f4-a658-4daa-acf1-6c00400d8ee1-crc-storage\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.341460 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0fa927f4-a658-4daa-acf1-6c00400d8ee1-node-mnt\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.362495 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjlmx\" (UniqueName: \"kubernetes.io/projected/0fa927f4-a658-4daa-acf1-6c00400d8ee1-kube-api-access-kjlmx\") pod \"crc-storage-crc-zwhbc\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.515551 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:45 crc kubenswrapper[4786]: I0313 16:27:45.960455 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zwhbc"] Mar 13 16:27:46 crc kubenswrapper[4786]: I0313 16:27:46.198819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zwhbc" event={"ID":"0fa927f4-a658-4daa-acf1-6c00400d8ee1","Type":"ContainerStarted","Data":"e38871dbf49b2ab6371bc7471ad903eb94fc91ae32d9e692a3ff6914adf782e5"} Mar 13 16:27:46 crc kubenswrapper[4786]: I0313 16:27:46.566968 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c1f7e2a-a7a0-463c-ad77-07674d37c7a2" path="/var/lib/kubelet/pods/5c1f7e2a-a7a0-463c-ad77-07674d37c7a2/volumes" Mar 13 16:27:47 crc kubenswrapper[4786]: I0313 16:27:47.207285 4786 generic.go:334] "Generic (PLEG): container finished" podID="0fa927f4-a658-4daa-acf1-6c00400d8ee1" containerID="90cf092f6a267c5756f3b5141d6b6fbbcb7dee79f877e660242006e224fc0109" exitCode=0 Mar 13 16:27:47 crc kubenswrapper[4786]: I0313 16:27:47.207452 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zwhbc" event={"ID":"0fa927f4-a658-4daa-acf1-6c00400d8ee1","Type":"ContainerDied","Data":"90cf092f6a267c5756f3b5141d6b6fbbcb7dee79f877e660242006e224fc0109"} Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.493442 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.592682 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0fa927f4-a658-4daa-acf1-6c00400d8ee1-crc-storage\") pod \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.592737 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjlmx\" (UniqueName: \"kubernetes.io/projected/0fa927f4-a658-4daa-acf1-6c00400d8ee1-kube-api-access-kjlmx\") pod \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.592794 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0fa927f4-a658-4daa-acf1-6c00400d8ee1-node-mnt\") pod \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\" (UID: \"0fa927f4-a658-4daa-acf1-6c00400d8ee1\") " Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.592955 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fa927f4-a658-4daa-acf1-6c00400d8ee1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0fa927f4-a658-4daa-acf1-6c00400d8ee1" (UID: "0fa927f4-a658-4daa-acf1-6c00400d8ee1"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.593115 4786 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0fa927f4-a658-4daa-acf1-6c00400d8ee1-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.597493 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa927f4-a658-4daa-acf1-6c00400d8ee1-kube-api-access-kjlmx" (OuterVolumeSpecName: "kube-api-access-kjlmx") pod "0fa927f4-a658-4daa-acf1-6c00400d8ee1" (UID: "0fa927f4-a658-4daa-acf1-6c00400d8ee1"). InnerVolumeSpecName "kube-api-access-kjlmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.610214 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa927f4-a658-4daa-acf1-6c00400d8ee1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0fa927f4-a658-4daa-acf1-6c00400d8ee1" (UID: "0fa927f4-a658-4daa-acf1-6c00400d8ee1"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.694672 4786 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0fa927f4-a658-4daa-acf1-6c00400d8ee1-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 13 16:27:48 crc kubenswrapper[4786]: I0313 16:27:48.694707 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjlmx\" (UniqueName: \"kubernetes.io/projected/0fa927f4-a658-4daa-acf1-6c00400d8ee1-kube-api-access-kjlmx\") on node \"crc\" DevicePath \"\"" Mar 13 16:27:49 crc kubenswrapper[4786]: I0313 16:27:49.226789 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zwhbc" event={"ID":"0fa927f4-a658-4daa-acf1-6c00400d8ee1","Type":"ContainerDied","Data":"e38871dbf49b2ab6371bc7471ad903eb94fc91ae32d9e692a3ff6914adf782e5"} Mar 13 16:27:49 crc kubenswrapper[4786]: I0313 16:27:49.226842 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zwhbc" Mar 13 16:27:49 crc kubenswrapper[4786]: I0313 16:27:49.226899 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38871dbf49b2ab6371bc7471ad903eb94fc91ae32d9e692a3ff6914adf782e5" Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.591007 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-zwhbc"] Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.598039 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-zwhbc"] Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.734049 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bpvn5"] Mar 13 16:27:50 crc kubenswrapper[4786]: E0313 16:27:50.734381 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa927f4-a658-4daa-acf1-6c00400d8ee1" containerName="storage" Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.734396 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa927f4-a658-4daa-acf1-6c00400d8ee1" containerName="storage" Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.734535 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa927f4-a658-4daa-acf1-6c00400d8ee1" containerName="storage" Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.734988 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.739570 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.739612 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.739583 4786 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-kbgsj"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.739885 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.742963 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bpvn5"]
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.935823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/727c5931-2f5e-42fc-b607-b2141595fd03-node-mnt\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.935921 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/727c5931-2f5e-42fc-b607-b2141595fd03-crc-storage\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:50 crc kubenswrapper[4786]: I0313 16:27:50.935995 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjphm\" (UniqueName: \"kubernetes.io/projected/727c5931-2f5e-42fc-b607-b2141595fd03-kube-api-access-tjphm\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.037563 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/727c5931-2f5e-42fc-b607-b2141595fd03-node-mnt\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.037889 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/727c5931-2f5e-42fc-b607-b2141595fd03-node-mnt\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.038296 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/727c5931-2f5e-42fc-b607-b2141595fd03-crc-storage\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.038418 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjphm\" (UniqueName: \"kubernetes.io/projected/727c5931-2f5e-42fc-b607-b2141595fd03-kube-api-access-tjphm\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.039171 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/727c5931-2f5e-42fc-b607-b2141595fd03-crc-storage\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.068266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjphm\" (UniqueName: \"kubernetes.io/projected/727c5931-2f5e-42fc-b607-b2141595fd03-kube-api-access-tjphm\") pod \"crc-storage-crc-bpvn5\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") " pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.353013 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:51 crc kubenswrapper[4786]: I0313 16:27:51.789563 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bpvn5"]
Mar 13 16:27:52 crc kubenswrapper[4786]: I0313 16:27:52.262900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bpvn5" event={"ID":"727c5931-2f5e-42fc-b607-b2141595fd03","Type":"ContainerStarted","Data":"3b2e0aa9f56d711b834abb4cc8b8760239b5d862fbfb47f8507fd5e3460b2e4e"}
Mar 13 16:27:52 crc kubenswrapper[4786]: I0313 16:27:52.569447 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa927f4-a658-4daa-acf1-6c00400d8ee1" path="/var/lib/kubelet/pods/0fa927f4-a658-4daa-acf1-6c00400d8ee1/volumes"
Mar 13 16:27:53 crc kubenswrapper[4786]: I0313 16:27:53.271991 4786 generic.go:334] "Generic (PLEG): container finished" podID="727c5931-2f5e-42fc-b607-b2141595fd03" containerID="75345fccbe324dd65aa941c3ac60e11e9123d7b445e13b576097586b856d2eb9" exitCode=0
Mar 13 16:27:53 crc kubenswrapper[4786]: I0313 16:27:53.272028 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bpvn5" event={"ID":"727c5931-2f5e-42fc-b607-b2141595fd03","Type":"ContainerDied","Data":"75345fccbe324dd65aa941c3ac60e11e9123d7b445e13b576097586b856d2eb9"}
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.589828 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.662773 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/727c5931-2f5e-42fc-b607-b2141595fd03-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "727c5931-2f5e-42fc-b607-b2141595fd03" (UID: "727c5931-2f5e-42fc-b607-b2141595fd03"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.662391 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/727c5931-2f5e-42fc-b607-b2141595fd03-node-mnt\") pod \"727c5931-2f5e-42fc-b607-b2141595fd03\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") "
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.663371 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjphm\" (UniqueName: \"kubernetes.io/projected/727c5931-2f5e-42fc-b607-b2141595fd03-kube-api-access-tjphm\") pod \"727c5931-2f5e-42fc-b607-b2141595fd03\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") "
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.663524 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/727c5931-2f5e-42fc-b607-b2141595fd03-crc-storage\") pod \"727c5931-2f5e-42fc-b607-b2141595fd03\" (UID: \"727c5931-2f5e-42fc-b607-b2141595fd03\") "
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.664040 4786 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/727c5931-2f5e-42fc-b607-b2141595fd03-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.668626 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727c5931-2f5e-42fc-b607-b2141595fd03-kube-api-access-tjphm" (OuterVolumeSpecName: "kube-api-access-tjphm") pod "727c5931-2f5e-42fc-b607-b2141595fd03" (UID: "727c5931-2f5e-42fc-b607-b2141595fd03"). InnerVolumeSpecName "kube-api-access-tjphm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.681611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/727c5931-2f5e-42fc-b607-b2141595fd03-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "727c5931-2f5e-42fc-b607-b2141595fd03" (UID: "727c5931-2f5e-42fc-b607-b2141595fd03"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.765483 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjphm\" (UniqueName: \"kubernetes.io/projected/727c5931-2f5e-42fc-b607-b2141595fd03-kube-api-access-tjphm\") on node \"crc\" DevicePath \"\""
Mar 13 16:27:54 crc kubenswrapper[4786]: I0313 16:27:54.765838 4786 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/727c5931-2f5e-42fc-b607-b2141595fd03-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 13 16:27:55 crc kubenswrapper[4786]: I0313 16:27:55.286396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bpvn5" event={"ID":"727c5931-2f5e-42fc-b607-b2141595fd03","Type":"ContainerDied","Data":"3b2e0aa9f56d711b834abb4cc8b8760239b5d862fbfb47f8507fd5e3460b2e4e"}
Mar 13 16:27:55 crc kubenswrapper[4786]: I0313 16:27:55.286443 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b2e0aa9f56d711b834abb4cc8b8760239b5d862fbfb47f8507fd5e3460b2e4e"
Mar 13 16:27:55 crc kubenswrapper[4786]: I0313 16:27:55.286503 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bpvn5"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.155788 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556988-6msng"]
Mar 13 16:28:00 crc kubenswrapper[4786]: E0313 16:28:00.156703 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727c5931-2f5e-42fc-b607-b2141595fd03" containerName="storage"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.156715 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="727c5931-2f5e-42fc-b607-b2141595fd03" containerName="storage"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.156839 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="727c5931-2f5e-42fc-b607-b2141595fd03" containerName="storage"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.157383 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.161665 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.161847 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.161996 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.194756 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556988-6msng"]
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.246592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8hft\" (UniqueName: \"kubernetes.io/projected/309022bf-f87f-42c2-8cdb-f03ddf74bbd9-kube-api-access-z8hft\") pod \"auto-csr-approver-29556988-6msng\" (UID: \"309022bf-f87f-42c2-8cdb-f03ddf74bbd9\") " pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.349035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8hft\" (UniqueName: \"kubernetes.io/projected/309022bf-f87f-42c2-8cdb-f03ddf74bbd9-kube-api-access-z8hft\") pod \"auto-csr-approver-29556988-6msng\" (UID: \"309022bf-f87f-42c2-8cdb-f03ddf74bbd9\") " pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.382447 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8hft\" (UniqueName: \"kubernetes.io/projected/309022bf-f87f-42c2-8cdb-f03ddf74bbd9-kube-api-access-z8hft\") pod \"auto-csr-approver-29556988-6msng\" (UID: \"309022bf-f87f-42c2-8cdb-f03ddf74bbd9\") " pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.488255 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:00 crc kubenswrapper[4786]: I0313 16:28:00.943772 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556988-6msng"]
Mar 13 16:28:01 crc kubenswrapper[4786]: I0313 16:28:01.350404 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556988-6msng" event={"ID":"309022bf-f87f-42c2-8cdb-f03ddf74bbd9","Type":"ContainerStarted","Data":"df339f26f69f2524d0a94440a074387d5126c3cba98fd42e636a94b1d92ab092"}
Mar 13 16:28:02 crc kubenswrapper[4786]: I0313 16:28:02.365309 4786 generic.go:334] "Generic (PLEG): container finished" podID="309022bf-f87f-42c2-8cdb-f03ddf74bbd9" containerID="ea01c12cf711381e618eeae990725a65856bae6645910485b11e86d9d87cacde" exitCode=0
Mar 13 16:28:02 crc kubenswrapper[4786]: I0313 16:28:02.365401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556988-6msng" event={"ID":"309022bf-f87f-42c2-8cdb-f03ddf74bbd9","Type":"ContainerDied","Data":"ea01c12cf711381e618eeae990725a65856bae6645910485b11e86d9d87cacde"}
Mar 13 16:28:03 crc kubenswrapper[4786]: I0313 16:28:03.686188 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:03 crc kubenswrapper[4786]: I0313 16:28:03.809355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8hft\" (UniqueName: \"kubernetes.io/projected/309022bf-f87f-42c2-8cdb-f03ddf74bbd9-kube-api-access-z8hft\") pod \"309022bf-f87f-42c2-8cdb-f03ddf74bbd9\" (UID: \"309022bf-f87f-42c2-8cdb-f03ddf74bbd9\") "
Mar 13 16:28:03 crc kubenswrapper[4786]: I0313 16:28:03.814639 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309022bf-f87f-42c2-8cdb-f03ddf74bbd9-kube-api-access-z8hft" (OuterVolumeSpecName: "kube-api-access-z8hft") pod "309022bf-f87f-42c2-8cdb-f03ddf74bbd9" (UID: "309022bf-f87f-42c2-8cdb-f03ddf74bbd9"). InnerVolumeSpecName "kube-api-access-z8hft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:28:03 crc kubenswrapper[4786]: I0313 16:28:03.911018 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8hft\" (UniqueName: \"kubernetes.io/projected/309022bf-f87f-42c2-8cdb-f03ddf74bbd9-kube-api-access-z8hft\") on node \"crc\" DevicePath \"\""
Mar 13 16:28:04 crc kubenswrapper[4786]: I0313 16:28:04.386111 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556988-6msng" event={"ID":"309022bf-f87f-42c2-8cdb-f03ddf74bbd9","Type":"ContainerDied","Data":"df339f26f69f2524d0a94440a074387d5126c3cba98fd42e636a94b1d92ab092"}
Mar 13 16:28:04 crc kubenswrapper[4786]: I0313 16:28:04.386171 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556988-6msng"
Mar 13 16:28:04 crc kubenswrapper[4786]: I0313 16:28:04.386170 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df339f26f69f2524d0a94440a074387d5126c3cba98fd42e636a94b1d92ab092"
Mar 13 16:28:04 crc kubenswrapper[4786]: I0313 16:28:04.745560 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556982-ds5kh"]
Mar 13 16:28:04 crc kubenswrapper[4786]: I0313 16:28:04.759977 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556982-ds5kh"]
Mar 13 16:28:06 crc kubenswrapper[4786]: I0313 16:28:06.564351 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce8871c-daa4-4243-95fa-b6fffee541f1" path="/var/lib/kubelet/pods/3ce8871c-daa4-4243-95fa-b6fffee541f1/volumes"
Mar 13 16:28:07 crc kubenswrapper[4786]: I0313 16:28:07.869003 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:28:07 crc kubenswrapper[4786]: I0313 16:28:07.869099 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:28:13 crc kubenswrapper[4786]: I0313 16:28:13.567759 4786 scope.go:117] "RemoveContainer" containerID="7ff3ba39c44e7b17afc354eac43d6001c39d03f38c1740a77085c938ee044094"
Mar 13 16:28:13 crc kubenswrapper[4786]: I0313 16:28:13.602608 4786 scope.go:117] "RemoveContainer" containerID="b9649dbc4c54fc54b5fd7fcaafbc6ad4e91320bb7f85c8bc0102a3ddc2e6497f"
Mar 13 16:28:37 crc kubenswrapper[4786]: I0313 16:28:37.868329 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:28:37 crc kubenswrapper[4786]: I0313 16:28:37.868996 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:28:37 crc kubenswrapper[4786]: I0313 16:28:37.869050 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 16:28:37 crc kubenswrapper[4786]: I0313 16:28:37.869709 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc7c313ae26eaf0288c8be9988aeae6dd1a6013e849e3a00ef45160c50254dc2"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 16:28:37 crc kubenswrapper[4786]: I0313 16:28:37.869761 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://dc7c313ae26eaf0288c8be9988aeae6dd1a6013e849e3a00ef45160c50254dc2" gracePeriod=600
Mar 13 16:28:38 crc kubenswrapper[4786]: I0313 16:28:38.736659 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="dc7c313ae26eaf0288c8be9988aeae6dd1a6013e849e3a00ef45160c50254dc2" exitCode=0
Mar 13 16:28:38 crc kubenswrapper[4786]: I0313 16:28:38.737124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"dc7c313ae26eaf0288c8be9988aeae6dd1a6013e849e3a00ef45160c50254dc2"}
Mar 13 16:28:38 crc kubenswrapper[4786]: I0313 16:28:38.737153 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"}
Mar 13 16:28:38 crc kubenswrapper[4786]: I0313 16:28:38.737170 4786 scope.go:117] "RemoveContainer" containerID="32f470196a21743aba7777f35634c31802d102d760f783ee77ce2eba77cfc739"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.984069 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-54wfz"]
Mar 13 16:29:53 crc kubenswrapper[4786]: E0313 16:29:53.985060 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309022bf-f87f-42c2-8cdb-f03ddf74bbd9" containerName="oc"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.985078 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="309022bf-f87f-42c2-8cdb-f03ddf74bbd9" containerName="oc"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.985265 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="309022bf-f87f-42c2-8cdb-f03ddf74bbd9" containerName="oc"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.986202 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.990003 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-lml94"]
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.990078 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-g9ldg"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.990663 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.990999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.991710 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 13 16:29:53 crc kubenswrapper[4786]: I0313 16:29:53.991887 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.007234 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.008160 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-54wfz"]
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.018338 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-lml94"]
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.175460 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxb9z\" (UniqueName: \"kubernetes.io/projected/1b3d663b-e569-40cf-a091-c64ac0cc07cb-kube-api-access-mxb9z\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.175538 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-dns-svc\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.175608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8606f4dd-6856-4815-a09b-99f43ba18a08-config\") pod \"dnsmasq-dns-78dcc4d9b5-lml94\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.175645 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-config\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.175691 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vth8m\" (UniqueName: \"kubernetes.io/projected/8606f4dd-6856-4815-a09b-99f43ba18a08-kube-api-access-vth8m\") pod \"dnsmasq-dns-78dcc4d9b5-lml94\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.276650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-dns-svc\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.276702 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8606f4dd-6856-4815-a09b-99f43ba18a08-config\") pod \"dnsmasq-dns-78dcc4d9b5-lml94\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.276720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-config\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.276737 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vth8m\" (UniqueName: \"kubernetes.io/projected/8606f4dd-6856-4815-a09b-99f43ba18a08-kube-api-access-vth8m\") pod \"dnsmasq-dns-78dcc4d9b5-lml94\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.276804 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxb9z\" (UniqueName: \"kubernetes.io/projected/1b3d663b-e569-40cf-a091-c64ac0cc07cb-kube-api-access-mxb9z\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.277456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-dns-svc\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.277607 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8606f4dd-6856-4815-a09b-99f43ba18a08-config\") pod \"dnsmasq-dns-78dcc4d9b5-lml94\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.277613 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-config\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.295596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxb9z\" (UniqueName: \"kubernetes.io/projected/1b3d663b-e569-40cf-a091-c64ac0cc07cb-kube-api-access-mxb9z\") pod \"dnsmasq-dns-76f4889f87-54wfz\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.298513 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vth8m\" (UniqueName: \"kubernetes.io/projected/8606f4dd-6856-4815-a09b-99f43ba18a08-kube-api-access-vth8m\") pod \"dnsmasq-dns-78dcc4d9b5-lml94\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") " pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.309353 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-54wfz"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.315930 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.770529 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-54wfz"]
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.781467 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-lml94"]
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.822230 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc7c884dc-pppkc"]
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.823646 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.846751 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc7c884dc-pppkc"]
Mar 13 16:29:54 crc kubenswrapper[4786]: W0313 16:29:54.869531 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b3d663b_e569_40cf_a091_c64ac0cc07cb.slice/crio-e16933924686ad14a9b268ae3737d2f8eae3fdd97864fbe173b010107b030996 WatchSource:0}: Error finding container e16933924686ad14a9b268ae3737d2f8eae3fdd97864fbe173b010107b030996: Status 404 returned error can't find the container with id e16933924686ad14a9b268ae3737d2f8eae3fdd97864fbe173b010107b030996
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.870631 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-54wfz"]
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.984267 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-config\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.984378 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-dns-svc\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:54 crc kubenswrapper[4786]: I0313 16:29:54.984530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2v5q\" (UniqueName: \"kubernetes.io/projected/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-kube-api-access-f2v5q\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.086569 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2v5q\" (UniqueName: \"kubernetes.io/projected/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-kube-api-access-f2v5q\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.086653 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-config\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.086747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-dns-svc\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.088374 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-dns-svc\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.088722 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-config\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.093925 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-lml94"]
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.107225 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2v5q\" (UniqueName: \"kubernetes.io/projected/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-kube-api-access-f2v5q\") pod \"dnsmasq-dns-7fc7c884dc-pppkc\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.125741 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-wglkz"]
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.132331 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.140379 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-wglkz"]
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.163785 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.305823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnkk\" (UniqueName: \"kubernetes.io/projected/9906cf23-b6ae-4b47-a6f2-011885c9954f-kube-api-access-nqnkk\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.306460 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-config\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.306502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.333676 4786 generic.go:334] "Generic (PLEG): container finished" podID="1b3d663b-e569-40cf-a091-c64ac0cc07cb" containerID="98c357346b6ba6699cda6eb6d6a2cf8234155956d100eb6713cd6db6ed3b1792" exitCode=0
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.333747 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f4889f87-54wfz" event={"ID":"1b3d663b-e569-40cf-a091-c64ac0cc07cb","Type":"ContainerDied","Data":"98c357346b6ba6699cda6eb6d6a2cf8234155956d100eb6713cd6db6ed3b1792"}
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.333777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f4889f87-54wfz" event={"ID":"1b3d663b-e569-40cf-a091-c64ac0cc07cb","Type":"ContainerStarted","Data":"e16933924686ad14a9b268ae3737d2f8eae3fdd97864fbe173b010107b030996"}
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.336075 4786 generic.go:334] "Generic (PLEG): container finished" podID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerID="91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340" exitCode=0
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.336120 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" event={"ID":"8606f4dd-6856-4815-a09b-99f43ba18a08","Type":"ContainerDied","Data":"91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340"}
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.336149 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" event={"ID":"8606f4dd-6856-4815-a09b-99f43ba18a08","Type":"ContainerStarted","Data":"6dfc211aa6b96d0e57648b14083d9835109ed5ae814c926d45294632f5d73ed1"}
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.408772 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-config\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.409165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.409333 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnkk\"
(UniqueName: \"kubernetes.io/projected/9906cf23-b6ae-4b47-a6f2-011885c9954f-kube-api-access-nqnkk\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.419504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-config\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.420471 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-dns-svc\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.470935 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnkk\" (UniqueName: \"kubernetes.io/projected/9906cf23-b6ae-4b47-a6f2-011885c9954f-kube-api-access-nqnkk\") pod \"dnsmasq-dns-7c95686bd5-wglkz\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.493204 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.732750 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc7c884dc-pppkc"] Mar 13 16:29:55 crc kubenswrapper[4786]: W0313 16:29:55.735850 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9bf9bfb_1ed4_4530_ba20_7432a1c7dcfa.slice/crio-764ce41695c7bd067f5514b7354506e4bc4992243deff5a1d575b6eb28143a5d WatchSource:0}: Error finding container 764ce41695c7bd067f5514b7354506e4bc4992243deff5a1d575b6eb28143a5d: Status 404 returned error can't find the container with id 764ce41695c7bd067f5514b7354506e4bc4992243deff5a1d575b6eb28143a5d Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.949820 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.952911 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.956427 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.959685 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.959929 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.960062 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v9ps9" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.960189 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.960280 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.961202 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 16:29:55 crc kubenswrapper[4786]: I0313 16:29:55.967951 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.017622 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-54wfz" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.049107 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-wglkz"] Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123332 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxb9z\" (UniqueName: \"kubernetes.io/projected/1b3d663b-e569-40cf-a091-c64ac0cc07cb-kube-api-access-mxb9z\") pod \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123497 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-dns-svc\") pod \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123519 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-config\") pod \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\" (UID: \"1b3d663b-e569-40cf-a091-c64ac0cc07cb\") " Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123727 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod 
\"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123760 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123805 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123910 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-config-data\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123961 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a594219d-8f84-4426-83e2-c73fd8bcace7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.123979 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d96vw\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-kube-api-access-d96vw\") pod \"rabbitmq-server-0\" (UID: 
\"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.124018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.124035 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a594219d-8f84-4426-83e2-c73fd8bcace7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.124087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.124136 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.126672 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3d663b-e569-40cf-a091-c64ac0cc07cb-kube-api-access-mxb9z" (OuterVolumeSpecName: "kube-api-access-mxb9z") pod "1b3d663b-e569-40cf-a091-c64ac0cc07cb" (UID: "1b3d663b-e569-40cf-a091-c64ac0cc07cb"). 
InnerVolumeSpecName "kube-api-access-mxb9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.138049 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b3d663b-e569-40cf-a091-c64ac0cc07cb" (UID: "1b3d663b-e569-40cf-a091-c64ac0cc07cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.138102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-config" (OuterVolumeSpecName: "config") pod "1b3d663b-e569-40cf-a091-c64ac0cc07cb" (UID: "1b3d663b-e569-40cf-a091-c64ac0cc07cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225365 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d96vw\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-kube-api-access-d96vw\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225416 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225443 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a594219d-8f84-4426-83e2-c73fd8bcace7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225488 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225533 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225582 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225615 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225657 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc 
kubenswrapper[4786]: I0313 16:29:56.225680 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225707 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-config-data\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a594219d-8f84-4426-83e2-c73fd8bcace7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225792 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxb9z\" (UniqueName: \"kubernetes.io/projected/1b3d663b-e569-40cf-a091-c64ac0cc07cb-kube-api-access-mxb9z\") on node \"crc\" DevicePath \"\"" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225808 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.225819 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d663b-e569-40cf-a091-c64ac0cc07cb-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.228893 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.228918 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.229039 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.229319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-config-data\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.230219 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.232743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a594219d-8f84-4426-83e2-c73fd8bcace7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " 
pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.233238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.233701 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a594219d-8f84-4426-83e2-c73fd8bcace7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.233898 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.233921 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/19750967caecaff8646851bef3ed6448fecfa64d79db9ee79d940f03448699a2/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.234223 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.249054 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-d96vw\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-kube-api-access-d96vw\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.251167 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:29:56 crc kubenswrapper[4786]: E0313 16:29:56.251481 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3d663b-e569-40cf-a091-c64ac0cc07cb" containerName="init" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.251494 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3d663b-e569-40cf-a091-c64ac0cc07cb" containerName="init" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.251654 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3d663b-e569-40cf-a091-c64ac0cc07cb" containerName="init" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.252413 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.255616 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.256090 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d8khs" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.256669 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.256697 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.256943 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.257051 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.256951 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.268536 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.269516 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.310398 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.346095 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f4889f87-54wfz" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.346102 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f4889f87-54wfz" event={"ID":"1b3d663b-e569-40cf-a091-c64ac0cc07cb","Type":"ContainerDied","Data":"e16933924686ad14a9b268ae3737d2f8eae3fdd97864fbe173b010107b030996"} Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.346455 4786 scope.go:117] "RemoveContainer" containerID="98c357346b6ba6699cda6eb6d6a2cf8234155956d100eb6713cd6db6ed3b1792" Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.347622 4786 generic.go:334] "Generic (PLEG): container finished" podID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerID="85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4" exitCode=0 Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.347678 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" event={"ID":"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa","Type":"ContainerDied","Data":"85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4"} Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.347712 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" event={"ID":"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa","Type":"ContainerStarted","Data":"764ce41695c7bd067f5514b7354506e4bc4992243deff5a1d575b6eb28143a5d"} Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.350035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" event={"ID":"8606f4dd-6856-4815-a09b-99f43ba18a08","Type":"ContainerStarted","Data":"dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951"} Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 
16:29:56.350174 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerName="dnsmasq-dns" containerID="cri-o://dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951" gracePeriod=10
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.350273 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.354417 4786 generic.go:334] "Generic (PLEG): container finished" podID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerID="6c0eaa75172523cd9c647552ed48986377ed64e1e9cee433e3966677d4387752" exitCode=0
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.354465 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" event={"ID":"9906cf23-b6ae-4b47-a6f2-011885c9954f","Type":"ContainerDied","Data":"6c0eaa75172523cd9c647552ed48986377ed64e1e9cee433e3966677d4387752"}
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.354492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" event={"ID":"9906cf23-b6ae-4b47-a6f2-011885c9954f","Type":"ContainerStarted","Data":"bd7c138a0620552b34b221a15ed74ba8ad910c58a4a068a1fcd43f41affb887a"}
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.389947 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" podStartSLOduration=3.389925582 podStartE2EDuration="3.389925582s" podCreationTimestamp="2026-03-13 16:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:29:56.382393857 +0000 UTC m=+5226.545605668" watchObservedRunningTime="2026-03-13 16:29:56.389925582 +0000 UTC m=+5226.553137403"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0de3055a-6888-4e3a-9863-10f758c5e71a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428420 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428461 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428634 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6d5\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-kube-api-access-ls6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428757 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428814 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0de3055a-6888-4e3a-9863-10f758c5e71a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.428838 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.529675 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.529733 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0de3055a-6888-4e3a-9863-10f758c5e71a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.529757 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.529793 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0de3055a-6888-4e3a-9863-10f758c5e71a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.529822 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.529847 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.530008 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.530047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.530105 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.531289 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.531333 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls6d5\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-kube-api-access-ls6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.531372 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.532121 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.532428 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.532573 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.533360 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.534629 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0de3055a-6888-4e3a-9863-10f758c5e71a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.536661 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0de3055a-6888-4e3a-9863-10f758c5e71a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.539345 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.539467 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b95abe7429486fe8f4c3539aad0cb48cc6c6a3ebabd5c537e177b1d54a9deebf/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.543196 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.543749 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.547949 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6d5\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-kube-api-access-ls6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.586174 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-54wfz"]
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.604065 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f4889f87-54wfz"]
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.615202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: W0313 16:29:56.793350 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda594219d_8f84_4426_83e2_c73fd8bcace7.slice/crio-d228ad8e7ce98e321cae1c878f9f03726b042a18810ad62d1bccc672799a3a01 WatchSource:0}: Error finding container d228ad8e7ce98e321cae1c878f9f03726b042a18810ad62d1bccc672799a3a01: Status 404 returned error can't find the container with id d228ad8e7ce98e321cae1c878f9f03726b042a18810ad62d1bccc672799a3a01
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.797739 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.823038 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.878154 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.936971 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8606f4dd-6856-4815-a09b-99f43ba18a08-config\") pod \"8606f4dd-6856-4815-a09b-99f43ba18a08\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") "
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.937235 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vth8m\" (UniqueName: \"kubernetes.io/projected/8606f4dd-6856-4815-a09b-99f43ba18a08-kube-api-access-vth8m\") pod \"8606f4dd-6856-4815-a09b-99f43ba18a08\" (UID: \"8606f4dd-6856-4815-a09b-99f43ba18a08\") "
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.943034 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8606f4dd-6856-4815-a09b-99f43ba18a08-kube-api-access-vth8m" (OuterVolumeSpecName: "kube-api-access-vth8m") pod "8606f4dd-6856-4815-a09b-99f43ba18a08" (UID: "8606f4dd-6856-4815-a09b-99f43ba18a08"). InnerVolumeSpecName "kube-api-access-vth8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.984259 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 13 16:29:56 crc kubenswrapper[4786]: E0313 16:29:56.984887 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerName="init"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.984900 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerName="init"
Mar 13 16:29:56 crc kubenswrapper[4786]: E0313 16:29:56.984936 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerName="dnsmasq-dns"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.984943 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerName="dnsmasq-dns"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.985110 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerName="dnsmasq-dns"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.985930 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.991529 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.991545 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8606f4dd-6856-4815-a09b-99f43ba18a08-config" (OuterVolumeSpecName: "config") pod "8606f4dd-6856-4815-a09b-99f43ba18a08" (UID: "8606f4dd-6856-4815-a09b-99f43ba18a08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.991821 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.992199 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.992501 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-62nq7"
Mar 13 16:29:56 crc kubenswrapper[4786]: I0313 16:29:56.999222 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.019303 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.039488 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8606f4dd-6856-4815-a09b-99f43ba18a08-config\") on node \"crc\" DevicePath \"\""
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.039529 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vth8m\" (UniqueName: \"kubernetes.io/projected/8606f4dd-6856-4815-a09b-99f43ba18a08-kube-api-access-vth8m\") on node \"crc\" DevicePath \"\""
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142259 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-config-data-default\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142325 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142364 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142582 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142692 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmtn\" (UniqueName: \"kubernetes.io/projected/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-kube-api-access-kvmtn\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.142715 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-kolla-config\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.244516 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.244558 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-config-data-default\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.244575 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.245607 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-config-data-default\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.246491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.246921 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.247316 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.247378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.247400 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmtn\" (UniqueName: \"kubernetes.io/projected/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-kube-api-access-kvmtn\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.247418 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-kolla-config\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.247801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.248991 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-kolla-config\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.249021 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.253895 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.253934 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1ce8455911f3fd4266a975735ec0d78645e6f16c214fe34308ebf9533b6a1e2/globalmount\"" pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.254603 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.272987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmtn\" (UniqueName: \"kubernetes.io/projected/4fe673d3-c989-4a19-9e4b-c06e0cc21ebc-kube-api-access-kvmtn\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.305476 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e4340ac-c234-4ea0-b8d2-69bef67f859e\") pod \"openstack-galera-0\" (UID: \"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc\") " pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.363553 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.384727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" event={"ID":"9906cf23-b6ae-4b47-a6f2-011885c9954f","Type":"ContainerStarted","Data":"624ab215ed4598c9fc43a1198be2f3293fffa58a2510f13ae0715d6741e2e414"}
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.384922 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.388455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0de3055a-6888-4e3a-9863-10f758c5e71a","Type":"ContainerStarted","Data":"1130096510e7844c756167c7b3c65d321813bd3bcd1d90f953312ffa3fef6d3b"}
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.391037 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" event={"ID":"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa","Type":"ContainerStarted","Data":"2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a"}
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.391900 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.393235 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a594219d-8f84-4426-83e2-c73fd8bcace7","Type":"ContainerStarted","Data":"d228ad8e7ce98e321cae1c878f9f03726b042a18810ad62d1bccc672799a3a01"}
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.399323 4786 generic.go:334] "Generic (PLEG): container finished" podID="8606f4dd-6856-4815-a09b-99f43ba18a08" containerID="dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951" exitCode=0
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.399370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" event={"ID":"8606f4dd-6856-4815-a09b-99f43ba18a08","Type":"ContainerDied","Data":"dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951"}
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.399396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94" event={"ID":"8606f4dd-6856-4815-a09b-99f43ba18a08","Type":"ContainerDied","Data":"6dfc211aa6b96d0e57648b14083d9835109ed5ae814c926d45294632f5d73ed1"}
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.399414 4786 scope.go:117] "RemoveContainer" containerID="dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.399546 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dcc4d9b5-lml94"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.412947 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" podStartSLOduration=2.412931011 podStartE2EDuration="2.412931011s" podCreationTimestamp="2026-03-13 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:29:57.411184398 +0000 UTC m=+5227.574396209" watchObservedRunningTime="2026-03-13 16:29:57.412931011 +0000 UTC m=+5227.576142822"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.432072 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" podStartSLOduration=3.432052492 podStartE2EDuration="3.432052492s" podCreationTimestamp="2026-03-13 16:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:29:57.430148255 +0000 UTC m=+5227.593360066" watchObservedRunningTime="2026-03-13 16:29:57.432052492 +0000 UTC m=+5227.595264293"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.439376 4786 scope.go:117] "RemoveContainer" containerID="91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.561576 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-lml94"]
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.568329 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dcc4d9b5-lml94"]
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.605588 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.635510 4786 scope.go:117] "RemoveContainer" containerID="dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951"
Mar 13 16:29:57 crc kubenswrapper[4786]: E0313 16:29:57.636141 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951\": container with ID starting with dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951 not found: ID does not exist" containerID="dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.636175 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951"} err="failed to get container status \"dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951\": rpc error: code = NotFound desc = could not find container \"dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951\": container with ID starting with dba424cddb0795842ff1e114d8ff85311aeae907b56a5718d483de549bcc7951 not found: ID does not exist"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.636195 4786 scope.go:117] "RemoveContainer" containerID="91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340"
Mar 13 16:29:57 crc kubenswrapper[4786]: E0313 16:29:57.637132 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340\": container with ID starting with 91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340 not found: ID does not exist" containerID="91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340"
Mar 13 16:29:57 crc kubenswrapper[4786]: I0313 16:29:57.637156 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340"} err="failed to get container status \"91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340\": rpc error: code = NotFound desc = could not find container \"91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340\": container with ID starting with 91522d8128e6990c1b818f25f10ae0b6ad2eda4f259734f6613f325f91840340 not found: ID does not exist"
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.065040 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 13 16:29:58 crc kubenswrapper[4786]: W0313 16:29:58.068236 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe673d3_c989_4a19_9e4b_c06e0cc21ebc.slice/crio-f067ffc5115898dcb60ab147c239fabec046b238d2a02f7a4373185d9c291edf WatchSource:0}: Error finding container f067ffc5115898dcb60ab147c239fabec046b238d2a02f7a4373185d9c291edf: Status 404 returned error can't find the container with id f067ffc5115898dcb60ab147c239fabec046b238d2a02f7a4373185d9c291edf
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.409284 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.411392 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.413468 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nhqxd"
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.415345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc","Type":"ContainerStarted","Data":"7e3bec8707fdc7390646d0034b254e28498c01723e2e979edd04e749d0a0a9c3"}
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.415393 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc","Type":"ContainerStarted","Data":"f067ffc5115898dcb60ab147c239fabec046b238d2a02f7a4373185d9c291edf"}
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.416897 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.417090 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.417165 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.418134 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a594219d-8f84-4426-83e2-c73fd8bcace7","Type":"ContainerStarted","Data":"2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e"}
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.437106 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313
16:29:58.563752 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3d663b-e569-40cf-a091-c64ac0cc07cb" path="/var/lib/kubelet/pods/1b3d663b-e569-40cf-a091-c64ac0cc07cb/volumes" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.564971 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8606f4dd-6856-4815-a09b-99f43ba18a08" path="/var/lib/kubelet/pods/8606f4dd-6856-4815-a09b-99f43ba18a08/volumes" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.568816 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce37391d-6853-4058-81bd-53d5009f12fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.568899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d36f14ec-e358-473a-a377-1bbf764cad04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d36f14ec-e358-473a-a377-1bbf764cad04\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.568988 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce37391d-6853-4058-81bd-53d5009f12fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.569043 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.569208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqp2b\" (UniqueName: \"kubernetes.io/projected/ce37391d-6853-4058-81bd-53d5009f12fc-kube-api-access-xqp2b\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.569270 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.569336 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce37391d-6853-4058-81bd-53d5009f12fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.569380 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.670886 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.670965 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce37391d-6853-4058-81bd-53d5009f12fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.670987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.671032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce37391d-6853-4058-81bd-53d5009f12fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.671054 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d36f14ec-e358-473a-a377-1bbf764cad04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d36f14ec-e358-473a-a377-1bbf764cad04\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.671091 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce37391d-6853-4058-81bd-53d5009f12fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.671111 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.671151 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqp2b\" (UniqueName: \"kubernetes.io/projected/ce37391d-6853-4058-81bd-53d5009f12fc-kube-api-access-xqp2b\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.671773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.672149 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce37391d-6853-4058-81bd-53d5009f12fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.673022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.673412 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce37391d-6853-4058-81bd-53d5009f12fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.678434 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce37391d-6853-4058-81bd-53d5009f12fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.682815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce37391d-6853-4058-81bd-53d5009f12fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.683654 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.683700 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d36f14ec-e358-473a-a377-1bbf764cad04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d36f14ec-e358-473a-a377-1bbf764cad04\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/edce5d73fc12dbb00aeaef4a8be77eabbc568a5b8b11856ae82a2a2a3a7c7e21/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.700600 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqp2b\" (UniqueName: \"kubernetes.io/projected/ce37391d-6853-4058-81bd-53d5009f12fc-kube-api-access-xqp2b\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.717014 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d36f14ec-e358-473a-a377-1bbf764cad04\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d36f14ec-e358-473a-a377-1bbf764cad04\") pod \"openstack-cell1-galera-0\" (UID: \"ce37391d-6853-4058-81bd-53d5009f12fc\") " pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.732919 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.821468 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.822402 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.825710 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gncdt" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.825932 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.826272 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.842416 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.975753 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/947a28af-b2fb-41bb-8be0-8b6723bb630e-config-data\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.976161 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pflx\" (UniqueName: \"kubernetes.io/projected/947a28af-b2fb-41bb-8be0-8b6723bb630e-kube-api-access-8pflx\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.976186 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/947a28af-b2fb-41bb-8be0-8b6723bb630e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.976202 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947a28af-b2fb-41bb-8be0-8b6723bb630e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:58 crc kubenswrapper[4786]: I0313 16:29:58.976298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/947a28af-b2fb-41bb-8be0-8b6723bb630e-kolla-config\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.077647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/947a28af-b2fb-41bb-8be0-8b6723bb630e-config-data\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.077722 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pflx\" (UniqueName: \"kubernetes.io/projected/947a28af-b2fb-41bb-8be0-8b6723bb630e-kube-api-access-8pflx\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.077755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/947a28af-b2fb-41bb-8be0-8b6723bb630e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.077775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947a28af-b2fb-41bb-8be0-8b6723bb630e-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.077878 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/947a28af-b2fb-41bb-8be0-8b6723bb630e-kolla-config\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.078508 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/947a28af-b2fb-41bb-8be0-8b6723bb630e-config-data\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.078773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/947a28af-b2fb-41bb-8be0-8b6723bb630e-kolla-config\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.082609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/947a28af-b2fb-41bb-8be0-8b6723bb630e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.084187 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947a28af-b2fb-41bb-8be0-8b6723bb630e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.098161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pflx\" (UniqueName: 
\"kubernetes.io/projected/947a28af-b2fb-41bb-8be0-8b6723bb630e-kube-api-access-8pflx\") pod \"memcached-0\" (UID: \"947a28af-b2fb-41bb-8be0-8b6723bb630e\") " pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.141897 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.188197 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 16:29:59 crc kubenswrapper[4786]: W0313 16:29:59.192541 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce37391d_6853_4058_81bd_53d5009f12fc.slice/crio-eaa2487e9482d98e0da33d695bfe54e887873d667aacafc814fcd2ffd7bf6c43 WatchSource:0}: Error finding container eaa2487e9482d98e0da33d695bfe54e887873d667aacafc814fcd2ffd7bf6c43: Status 404 returned error can't find the container with id eaa2487e9482d98e0da33d695bfe54e887873d667aacafc814fcd2ffd7bf6c43 Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.440586 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce37391d-6853-4058-81bd-53d5009f12fc","Type":"ContainerStarted","Data":"18ceb43c37edfa1985a3118b9e3d350c52b8a2885365f10b55dab6978d33bfd0"} Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.441016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce37391d-6853-4058-81bd-53d5009f12fc","Type":"ContainerStarted","Data":"eaa2487e9482d98e0da33d695bfe54e887873d667aacafc814fcd2ffd7bf6c43"} Mar 13 16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.444668 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0de3055a-6888-4e3a-9863-10f758c5e71a","Type":"ContainerStarted","Data":"d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6"} Mar 13 
16:29:59 crc kubenswrapper[4786]: I0313 16:29:59.579073 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.140267 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556990-w5rpx"] Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.142620 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.148184 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.148819 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.149146 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.153094 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt"] Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.154294 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.156485 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.158438 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.164735 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556990-w5rpx"] Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.178902 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt"] Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.197641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8044d25b-e896-4109-b222-e6e2e849cdbf-secret-volume\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.197711 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgs5h\" (UniqueName: \"kubernetes.io/projected/8044d25b-e896-4109-b222-e6e2e849cdbf-kube-api-access-dgs5h\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.197741 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch99h\" (UniqueName: 
\"kubernetes.io/projected/6e6fa2cb-69af-420e-a158-9ebfded4c3e0-kube-api-access-ch99h\") pod \"auto-csr-approver-29556990-w5rpx\" (UID: \"6e6fa2cb-69af-420e-a158-9ebfded4c3e0\") " pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.197761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8044d25b-e896-4109-b222-e6e2e849cdbf-config-volume\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.299000 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8044d25b-e896-4109-b222-e6e2e849cdbf-secret-volume\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.299068 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgs5h\" (UniqueName: \"kubernetes.io/projected/8044d25b-e896-4109-b222-e6e2e849cdbf-kube-api-access-dgs5h\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.299100 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch99h\" (UniqueName: \"kubernetes.io/projected/6e6fa2cb-69af-420e-a158-9ebfded4c3e0-kube-api-access-ch99h\") pod \"auto-csr-approver-29556990-w5rpx\" (UID: \"6e6fa2cb-69af-420e-a158-9ebfded4c3e0\") " pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 
16:30:00.299121 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8044d25b-e896-4109-b222-e6e2e849cdbf-config-volume\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.299843 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8044d25b-e896-4109-b222-e6e2e849cdbf-config-volume\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.305417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8044d25b-e896-4109-b222-e6e2e849cdbf-secret-volume\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.315591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgs5h\" (UniqueName: \"kubernetes.io/projected/8044d25b-e896-4109-b222-e6e2e849cdbf-kube-api-access-dgs5h\") pod \"collect-profiles-29556990-6whkt\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.317232 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch99h\" (UniqueName: \"kubernetes.io/projected/6e6fa2cb-69af-420e-a158-9ebfded4c3e0-kube-api-access-ch99h\") pod \"auto-csr-approver-29556990-w5rpx\" (UID: \"6e6fa2cb-69af-420e-a158-9ebfded4c3e0\") " 
pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.458946 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"947a28af-b2fb-41bb-8be0-8b6723bb630e","Type":"ContainerStarted","Data":"18db1b1f19e00fc2b24f499017b10712175ee4f7c072ad0bf7eb5ecfcd2c39f2"} Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.459058 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"947a28af-b2fb-41bb-8be0-8b6723bb630e","Type":"ContainerStarted","Data":"08955f828cd23ec79b0f6f4cfa7b4543e98bb34c4d2cf13b4d082c79dfc682eb"} Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.460731 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.470549 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.486547 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.490070 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.49004963 podStartE2EDuration="2.49004963s" podCreationTimestamp="2026-03-13 16:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:30:00.479816158 +0000 UTC m=+5230.643027969" watchObservedRunningTime="2026-03-13 16:30:00.49004963 +0000 UTC m=+5230.653261441" Mar 13 16:30:00 crc kubenswrapper[4786]: I0313 16:30:00.982647 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556990-w5rpx"] Mar 13 16:30:01 crc kubenswrapper[4786]: I0313 16:30:01.066368 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt"] Mar 13 16:30:01 crc kubenswrapper[4786]: W0313 16:30:01.069291 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8044d25b_e896_4109_b222_e6e2e849cdbf.slice/crio-8a043184b17a58366198f5a1d4d7a08530473362f14c4a4d5afbfc04429accce WatchSource:0}: Error finding container 8a043184b17a58366198f5a1d4d7a08530473362f14c4a4d5afbfc04429accce: Status 404 returned error can't find the container with id 8a043184b17a58366198f5a1d4d7a08530473362f14c4a4d5afbfc04429accce Mar 13 16:30:01 crc kubenswrapper[4786]: I0313 16:30:01.470562 4786 generic.go:334] "Generic (PLEG): container finished" podID="8044d25b-e896-4109-b222-e6e2e849cdbf" containerID="881cc4efb6ef0ae64ac9fd7d0e9eaf036fb34c8e070082a2788b725d42716201" exitCode=0 Mar 13 16:30:01 crc kubenswrapper[4786]: I0313 16:30:01.470642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" event={"ID":"8044d25b-e896-4109-b222-e6e2e849cdbf","Type":"ContainerDied","Data":"881cc4efb6ef0ae64ac9fd7d0e9eaf036fb34c8e070082a2788b725d42716201"} Mar 13 16:30:01 crc kubenswrapper[4786]: I0313 16:30:01.470881 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" event={"ID":"8044d25b-e896-4109-b222-e6e2e849cdbf","Type":"ContainerStarted","Data":"8a043184b17a58366198f5a1d4d7a08530473362f14c4a4d5afbfc04429accce"} Mar 13 16:30:01 crc kubenswrapper[4786]: I0313 16:30:01.473997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" event={"ID":"6e6fa2cb-69af-420e-a158-9ebfded4c3e0","Type":"ContainerStarted","Data":"6d1388d80159d88a0e30d196fe23f28136a43f999777aa06196167785e2f7daa"} Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.482648 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fe673d3-c989-4a19-9e4b-c06e0cc21ebc" containerID="7e3bec8707fdc7390646d0034b254e28498c01723e2e979edd04e749d0a0a9c3" exitCode=0 Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.482759 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc","Type":"ContainerDied","Data":"7e3bec8707fdc7390646d0034b254e28498c01723e2e979edd04e749d0a0a9c3"} Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.832468 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.948048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8044d25b-e896-4109-b222-e6e2e849cdbf-secret-volume\") pod \"8044d25b-e896-4109-b222-e6e2e849cdbf\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.948406 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8044d25b-e896-4109-b222-e6e2e849cdbf-config-volume\") pod \"8044d25b-e896-4109-b222-e6e2e849cdbf\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.948439 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgs5h\" (UniqueName: \"kubernetes.io/projected/8044d25b-e896-4109-b222-e6e2e849cdbf-kube-api-access-dgs5h\") pod \"8044d25b-e896-4109-b222-e6e2e849cdbf\" (UID: \"8044d25b-e896-4109-b222-e6e2e849cdbf\") " Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.949110 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8044d25b-e896-4109-b222-e6e2e849cdbf-config-volume" (OuterVolumeSpecName: "config-volume") pod "8044d25b-e896-4109-b222-e6e2e849cdbf" (UID: "8044d25b-e896-4109-b222-e6e2e849cdbf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.954030 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8044d25b-e896-4109-b222-e6e2e849cdbf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8044d25b-e896-4109-b222-e6e2e849cdbf" (UID: "8044d25b-e896-4109-b222-e6e2e849cdbf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:30:02 crc kubenswrapper[4786]: I0313 16:30:02.954108 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8044d25b-e896-4109-b222-e6e2e849cdbf-kube-api-access-dgs5h" (OuterVolumeSpecName: "kube-api-access-dgs5h") pod "8044d25b-e896-4109-b222-e6e2e849cdbf" (UID: "8044d25b-e896-4109-b222-e6e2e849cdbf"). InnerVolumeSpecName "kube-api-access-dgs5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.049992 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgs5h\" (UniqueName: \"kubernetes.io/projected/8044d25b-e896-4109-b222-e6e2e849cdbf-kube-api-access-dgs5h\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.050032 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8044d25b-e896-4109-b222-e6e2e849cdbf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.050048 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8044d25b-e896-4109-b222-e6e2e849cdbf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.510405 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4fe673d3-c989-4a19-9e4b-c06e0cc21ebc","Type":"ContainerStarted","Data":"b36dfc9b0c3ac89907bea42d99a8923bc0cc788a232eea1810e6e9c3851e16cf"} Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.514286 4786 generic.go:334] "Generic (PLEG): container finished" podID="6e6fa2cb-69af-420e-a158-9ebfded4c3e0" containerID="b676824333dc9f6a07480748522760ddccca6263a0189ec1784b3afd9078c0d2" exitCode=0 Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.514405 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" event={"ID":"6e6fa2cb-69af-420e-a158-9ebfded4c3e0","Type":"ContainerDied","Data":"b676824333dc9f6a07480748522760ddccca6263a0189ec1784b3afd9078c0d2"} Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.517483 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.518207 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556990-6whkt" event={"ID":"8044d25b-e896-4109-b222-e6e2e849cdbf","Type":"ContainerDied","Data":"8a043184b17a58366198f5a1d4d7a08530473362f14c4a4d5afbfc04429accce"} Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.518256 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a043184b17a58366198f5a1d4d7a08530473362f14c4a4d5afbfc04429accce" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.520043 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce37391d-6853-4058-81bd-53d5009f12fc" containerID="18ceb43c37edfa1985a3118b9e3d350c52b8a2885365f10b55dab6978d33bfd0" exitCode=0 Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.520083 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce37391d-6853-4058-81bd-53d5009f12fc","Type":"ContainerDied","Data":"18ceb43c37edfa1985a3118b9e3d350c52b8a2885365f10b55dab6978d33bfd0"} Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.546393 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.546373169 podStartE2EDuration="8.546373169s" podCreationTimestamp="2026-03-13 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
16:30:03.540990766 +0000 UTC m=+5233.704202587" watchObservedRunningTime="2026-03-13 16:30:03.546373169 +0000 UTC m=+5233.709584990" Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.920830 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg"] Mar 13 16:30:03 crc kubenswrapper[4786]: I0313 16:30:03.929993 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556945-gwfpg"] Mar 13 16:30:04 crc kubenswrapper[4786]: I0313 16:30:04.143127 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 16:30:04 crc kubenswrapper[4786]: I0313 16:30:04.528236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce37391d-6853-4058-81bd-53d5009f12fc","Type":"ContainerStarted","Data":"2a376716d5b1fdf271da7b396ee85757c8fee01caddeb0f3b9dc5273e762812c"} Mar 13 16:30:04 crc kubenswrapper[4786]: I0313 16:30:04.549081 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.549062958 podStartE2EDuration="7.549062958s" podCreationTimestamp="2026-03-13 16:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:30:04.547265063 +0000 UTC m=+5234.710476884" watchObservedRunningTime="2026-03-13 16:30:04.549062958 +0000 UTC m=+5234.712274769" Mar 13 16:30:04 crc kubenswrapper[4786]: I0313 16:30:04.561275 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaccb485-07a2-43c1-b974-1268fcf7a5ee" path="/var/lib/kubelet/pods/aaccb485-07a2-43c1-b974-1268fcf7a5ee/volumes" Mar 13 16:30:04 crc kubenswrapper[4786]: I0313 16:30:04.895366 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.080776 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch99h\" (UniqueName: \"kubernetes.io/projected/6e6fa2cb-69af-420e-a158-9ebfded4c3e0-kube-api-access-ch99h\") pod \"6e6fa2cb-69af-420e-a158-9ebfded4c3e0\" (UID: \"6e6fa2cb-69af-420e-a158-9ebfded4c3e0\") " Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.085712 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6fa2cb-69af-420e-a158-9ebfded4c3e0-kube-api-access-ch99h" (OuterVolumeSpecName: "kube-api-access-ch99h") pod "6e6fa2cb-69af-420e-a158-9ebfded4c3e0" (UID: "6e6fa2cb-69af-420e-a158-9ebfded4c3e0"). InnerVolumeSpecName "kube-api-access-ch99h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.165049 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.182910 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch99h\" (UniqueName: \"kubernetes.io/projected/6e6fa2cb-69af-420e-a158-9ebfded4c3e0-kube-api-access-ch99h\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.494973 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.536060 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" event={"ID":"6e6fa2cb-69af-420e-a158-9ebfded4c3e0","Type":"ContainerDied","Data":"6d1388d80159d88a0e30d196fe23f28136a43f999777aa06196167785e2f7daa"} Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.536099 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556990-w5rpx" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.536099 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1388d80159d88a0e30d196fe23f28136a43f999777aa06196167785e2f7daa" Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.576174 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc7c884dc-pppkc"] Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.576390 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerName="dnsmasq-dns" containerID="cri-o://2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a" gracePeriod=10 Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.955795 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556984-jvhl9"] Mar 13 16:30:05 crc kubenswrapper[4786]: I0313 16:30:05.983297 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556984-jvhl9"] Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.184447 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.299764 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2v5q\" (UniqueName: \"kubernetes.io/projected/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-kube-api-access-f2v5q\") pod \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.299910 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-config\") pod \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.299955 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-dns-svc\") pod \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\" (UID: \"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa\") " Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.314634 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-kube-api-access-f2v5q" (OuterVolumeSpecName: "kube-api-access-f2v5q") pod "f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" (UID: "f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa"). InnerVolumeSpecName "kube-api-access-f2v5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.335011 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-config" (OuterVolumeSpecName: "config") pod "f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" (UID: "f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.341298 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" (UID: "f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.402019 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2v5q\" (UniqueName: \"kubernetes.io/projected/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-kube-api-access-f2v5q\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.402071 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.402084 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.547108 4786 generic.go:334] "Generic (PLEG): container finished" podID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerID="2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a" exitCode=0 Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.547171 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" event={"ID":"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa","Type":"ContainerDied","Data":"2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a"} Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.547214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" 
event={"ID":"f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa","Type":"ContainerDied","Data":"764ce41695c7bd067f5514b7354506e4bc4992243deff5a1d575b6eb28143a5d"} Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.547242 4786 scope.go:117] "RemoveContainer" containerID="2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.547429 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc7c884dc-pppkc" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.593679 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6" path="/var/lib/kubelet/pods/41c8721d-b9e2-44f4-a5a4-dc0d5d7a82b6/volumes" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.607688 4786 scope.go:117] "RemoveContainer" containerID="85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.617133 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc7c884dc-pppkc"] Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.618257 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc7c884dc-pppkc"] Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.682370 4786 scope.go:117] "RemoveContainer" containerID="2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a" Mar 13 16:30:06 crc kubenswrapper[4786]: E0313 16:30:06.682779 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a\": container with ID starting with 2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a not found: ID does not exist" containerID="2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.682830 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a"} err="failed to get container status \"2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a\": rpc error: code = NotFound desc = could not find container \"2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a\": container with ID starting with 2076c1bbaf7b27fc943795308c08a5641379fe9f323882c3fde9eb639630790a not found: ID does not exist" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.682883 4786 scope.go:117] "RemoveContainer" containerID="85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4" Mar 13 16:30:06 crc kubenswrapper[4786]: E0313 16:30:06.683261 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4\": container with ID starting with 85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4 not found: ID does not exist" containerID="85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4" Mar 13 16:30:06 crc kubenswrapper[4786]: I0313 16:30:06.683291 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4"} err="failed to get container status \"85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4\": rpc error: code = NotFound desc = could not find container \"85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4\": container with ID starting with 85678341582b3f266a19d376c5228c0b91012453ad9380ee9b0dc71e650debf4 not found: ID does not exist" Mar 13 16:30:07 crc kubenswrapper[4786]: I0313 16:30:07.606650 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 16:30:07 crc kubenswrapper[4786]: I0313 16:30:07.607749 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 16:30:08 crc kubenswrapper[4786]: I0313 16:30:08.562850 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" path="/var/lib/kubelet/pods/f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa/volumes" Mar 13 16:30:08 crc kubenswrapper[4786]: I0313 16:30:08.733822 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 16:30:08 crc kubenswrapper[4786]: I0313 16:30:08.734462 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 16:30:09 crc kubenswrapper[4786]: I0313 16:30:09.894716 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 16:30:10 crc kubenswrapper[4786]: I0313 16:30:10.006314 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 16:30:11 crc kubenswrapper[4786]: I0313 16:30:11.026275 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 16:30:11 crc kubenswrapper[4786]: I0313 16:30:11.133316 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 16:30:13 crc kubenswrapper[4786]: I0313 16:30:13.723349 4786 scope.go:117] "RemoveContainer" containerID="aa7dc5b89ece73d30d472b54962324b243e0a7117127d7a9478dee3377c6f350" Mar 13 16:30:13 crc kubenswrapper[4786]: I0313 16:30:13.772274 4786 scope.go:117] "RemoveContainer" containerID="747ca890b9753a681878dde3cf8bc7a85a2b846d7fbc4a8c3b20d6a44c6d1010" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.972426 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vx5kl"] Mar 13 16:30:15 crc kubenswrapper[4786]: E0313 16:30:15.973147 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8044d25b-e896-4109-b222-e6e2e849cdbf" containerName="collect-profiles" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973163 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8044d25b-e896-4109-b222-e6e2e849cdbf" containerName="collect-profiles" Mar 13 16:30:15 crc kubenswrapper[4786]: E0313 16:30:15.973182 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6fa2cb-69af-420e-a158-9ebfded4c3e0" containerName="oc" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973192 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6fa2cb-69af-420e-a158-9ebfded4c3e0" containerName="oc" Mar 13 16:30:15 crc kubenswrapper[4786]: E0313 16:30:15.973209 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerName="init" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973218 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerName="init" Mar 13 16:30:15 crc kubenswrapper[4786]: E0313 16:30:15.973240 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerName="dnsmasq-dns" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973249 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerName="dnsmasq-dns" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973424 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8044d25b-e896-4109-b222-e6e2e849cdbf" containerName="collect-profiles" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973436 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bf9bfb-1ed4-4530-ba20-7432a1c7dcfa" containerName="dnsmasq-dns" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.973457 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6e6fa2cb-69af-420e-a158-9ebfded4c3e0" containerName="oc" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.974176 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:15 crc kubenswrapper[4786]: I0313 16:30:15.978660 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.003895 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vx5kl"] Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.057352 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a85931-4d37-4eb6-94db-233eda006041-operator-scripts\") pod \"root-account-create-update-vx5kl\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.057495 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdltv\" (UniqueName: \"kubernetes.io/projected/d7a85931-4d37-4eb6-94db-233eda006041-kube-api-access-jdltv\") pod \"root-account-create-update-vx5kl\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.159123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdltv\" (UniqueName: \"kubernetes.io/projected/d7a85931-4d37-4eb6-94db-233eda006041-kube-api-access-jdltv\") pod \"root-account-create-update-vx5kl\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.159308 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a85931-4d37-4eb6-94db-233eda006041-operator-scripts\") pod \"root-account-create-update-vx5kl\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.160545 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a85931-4d37-4eb6-94db-233eda006041-operator-scripts\") pod \"root-account-create-update-vx5kl\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.182458 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdltv\" (UniqueName: \"kubernetes.io/projected/d7a85931-4d37-4eb6-94db-233eda006041-kube-api-access-jdltv\") pod \"root-account-create-update-vx5kl\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.298581 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:16 crc kubenswrapper[4786]: I0313 16:30:16.777748 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vx5kl"] Mar 13 16:30:17 crc kubenswrapper[4786]: I0313 16:30:17.629305 4786 generic.go:334] "Generic (PLEG): container finished" podID="d7a85931-4d37-4eb6-94db-233eda006041" containerID="794477728eded07d73ed0fdd30b7ab2a4dc285d047896b7e45c9b348b0629cc6" exitCode=0 Mar 13 16:30:17 crc kubenswrapper[4786]: I0313 16:30:17.629343 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vx5kl" event={"ID":"d7a85931-4d37-4eb6-94db-233eda006041","Type":"ContainerDied","Data":"794477728eded07d73ed0fdd30b7ab2a4dc285d047896b7e45c9b348b0629cc6"} Mar 13 16:30:17 crc kubenswrapper[4786]: I0313 16:30:17.629369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vx5kl" event={"ID":"d7a85931-4d37-4eb6-94db-233eda006041","Type":"ContainerStarted","Data":"45b744fcd57267bb7a06cf7fbdabfb527e16e7d0e25a904efb583908a04d08ae"} Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.029395 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.110491 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdltv\" (UniqueName: \"kubernetes.io/projected/d7a85931-4d37-4eb6-94db-233eda006041-kube-api-access-jdltv\") pod \"d7a85931-4d37-4eb6-94db-233eda006041\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.110698 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a85931-4d37-4eb6-94db-233eda006041-operator-scripts\") pod \"d7a85931-4d37-4eb6-94db-233eda006041\" (UID: \"d7a85931-4d37-4eb6-94db-233eda006041\") " Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.111551 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a85931-4d37-4eb6-94db-233eda006041-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7a85931-4d37-4eb6-94db-233eda006041" (UID: "d7a85931-4d37-4eb6-94db-233eda006041"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.115287 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a85931-4d37-4eb6-94db-233eda006041-kube-api-access-jdltv" (OuterVolumeSpecName: "kube-api-access-jdltv") pod "d7a85931-4d37-4eb6-94db-233eda006041" (UID: "d7a85931-4d37-4eb6-94db-233eda006041"). InnerVolumeSpecName "kube-api-access-jdltv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.213068 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a85931-4d37-4eb6-94db-233eda006041-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.213122 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdltv\" (UniqueName: \"kubernetes.io/projected/d7a85931-4d37-4eb6-94db-233eda006041-kube-api-access-jdltv\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.651607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vx5kl" event={"ID":"d7a85931-4d37-4eb6-94db-233eda006041","Type":"ContainerDied","Data":"45b744fcd57267bb7a06cf7fbdabfb527e16e7d0e25a904efb583908a04d08ae"} Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.651659 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45b744fcd57267bb7a06cf7fbdabfb527e16e7d0e25a904efb583908a04d08ae" Mar 13 16:30:19 crc kubenswrapper[4786]: I0313 16:30:19.651698 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vx5kl" Mar 13 16:30:22 crc kubenswrapper[4786]: I0313 16:30:22.342071 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vx5kl"] Mar 13 16:30:22 crc kubenswrapper[4786]: I0313 16:30:22.350331 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vx5kl"] Mar 13 16:30:22 crc kubenswrapper[4786]: I0313 16:30:22.562225 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a85931-4d37-4eb6-94db-233eda006041" path="/var/lib/kubelet/pods/d7a85931-4d37-4eb6-94db-233eda006041/volumes" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.363151 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5b9pv"] Mar 13 16:30:27 crc kubenswrapper[4786]: E0313 16:30:27.363732 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a85931-4d37-4eb6-94db-233eda006041" containerName="mariadb-account-create-update" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.363747 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a85931-4d37-4eb6-94db-233eda006041" containerName="mariadb-account-create-update" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.363973 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a85931-4d37-4eb6-94db-233eda006041" containerName="mariadb-account-create-update" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.364538 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.366729 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.375655 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5b9pv"] Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.451305 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgvc\" (UniqueName: \"kubernetes.io/projected/d3935245-a5fb-49b8-8b34-d474d92ba6fe-kube-api-access-txgvc\") pod \"root-account-create-update-5b9pv\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.451383 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3935245-a5fb-49b8-8b34-d474d92ba6fe-operator-scripts\") pod \"root-account-create-update-5b9pv\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.553570 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgvc\" (UniqueName: \"kubernetes.io/projected/d3935245-a5fb-49b8-8b34-d474d92ba6fe-kube-api-access-txgvc\") pod \"root-account-create-update-5b9pv\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.553729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3935245-a5fb-49b8-8b34-d474d92ba6fe-operator-scripts\") pod \"root-account-create-update-5b9pv\" (UID: 
\"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.555061 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3935245-a5fb-49b8-8b34-d474d92ba6fe-operator-scripts\") pod \"root-account-create-update-5b9pv\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.584620 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgvc\" (UniqueName: \"kubernetes.io/projected/d3935245-a5fb-49b8-8b34-d474d92ba6fe-kube-api-access-txgvc\") pod \"root-account-create-update-5b9pv\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:27 crc kubenswrapper[4786]: I0313 16:30:27.690014 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:28 crc kubenswrapper[4786]: I0313 16:30:28.155361 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5b9pv"] Mar 13 16:30:28 crc kubenswrapper[4786]: I0313 16:30:28.726846 4786 generic.go:334] "Generic (PLEG): container finished" podID="d3935245-a5fb-49b8-8b34-d474d92ba6fe" containerID="dbeee43237c6aa5b8bf163df05c1d1a8f45a082222fe2a8a798c4a14327c53d6" exitCode=0 Mar 13 16:30:28 crc kubenswrapper[4786]: I0313 16:30:28.726975 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5b9pv" event={"ID":"d3935245-a5fb-49b8-8b34-d474d92ba6fe","Type":"ContainerDied","Data":"dbeee43237c6aa5b8bf163df05c1d1a8f45a082222fe2a8a798c4a14327c53d6"} Mar 13 16:30:28 crc kubenswrapper[4786]: I0313 16:30:28.727010 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5b9pv" event={"ID":"d3935245-a5fb-49b8-8b34-d474d92ba6fe","Type":"ContainerStarted","Data":"484c3fc0e06c10120d3165c30b4ed3c046faf0175b70e9cc574002c8d3817fcb"} Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.123218 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.208682 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txgvc\" (UniqueName: \"kubernetes.io/projected/d3935245-a5fb-49b8-8b34-d474d92ba6fe-kube-api-access-txgvc\") pod \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.209029 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3935245-a5fb-49b8-8b34-d474d92ba6fe-operator-scripts\") pod \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\" (UID: \"d3935245-a5fb-49b8-8b34-d474d92ba6fe\") " Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.210138 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3935245-a5fb-49b8-8b34-d474d92ba6fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3935245-a5fb-49b8-8b34-d474d92ba6fe" (UID: "d3935245-a5fb-49b8-8b34-d474d92ba6fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.214047 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3935245-a5fb-49b8-8b34-d474d92ba6fe-kube-api-access-txgvc" (OuterVolumeSpecName: "kube-api-access-txgvc") pod "d3935245-a5fb-49b8-8b34-d474d92ba6fe" (UID: "d3935245-a5fb-49b8-8b34-d474d92ba6fe"). InnerVolumeSpecName "kube-api-access-txgvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.311037 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3935245-a5fb-49b8-8b34-d474d92ba6fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.311171 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txgvc\" (UniqueName: \"kubernetes.io/projected/d3935245-a5fb-49b8-8b34-d474d92ba6fe-kube-api-access-txgvc\") on node \"crc\" DevicePath \"\"" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.744767 4786 generic.go:334] "Generic (PLEG): container finished" podID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerID="2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e" exitCode=0 Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.744842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a594219d-8f84-4426-83e2-c73fd8bcace7","Type":"ContainerDied","Data":"2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e"} Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.747712 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5b9pv" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.748185 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5b9pv" event={"ID":"d3935245-a5fb-49b8-8b34-d474d92ba6fe","Type":"ContainerDied","Data":"484c3fc0e06c10120d3165c30b4ed3c046faf0175b70e9cc574002c8d3817fcb"} Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.748251 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484c3fc0e06c10120d3165c30b4ed3c046faf0175b70e9cc574002c8d3817fcb" Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.750458 4786 generic.go:334] "Generic (PLEG): container finished" podID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerID="d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6" exitCode=0 Mar 13 16:30:30 crc kubenswrapper[4786]: I0313 16:30:30.750504 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0de3055a-6888-4e3a-9863-10f758c5e71a","Type":"ContainerDied","Data":"d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6"} Mar 13 16:30:31 crc kubenswrapper[4786]: I0313 16:30:31.761523 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0de3055a-6888-4e3a-9863-10f758c5e71a","Type":"ContainerStarted","Data":"d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08"} Mar 13 16:30:31 crc kubenswrapper[4786]: I0313 16:30:31.761983 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:30:31 crc kubenswrapper[4786]: I0313 16:30:31.764211 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a594219d-8f84-4426-83e2-c73fd8bcace7","Type":"ContainerStarted","Data":"e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d"} Mar 13 16:30:31 crc kubenswrapper[4786]: 
I0313 16:30:31.764701 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 16:30:31 crc kubenswrapper[4786]: I0313 16:30:31.801845 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.801826435 podStartE2EDuration="36.801826435s" podCreationTimestamp="2026-03-13 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:30:31.792464504 +0000 UTC m=+5261.955676315" watchObservedRunningTime="2026-03-13 16:30:31.801826435 +0000 UTC m=+5261.965038246" Mar 13 16:30:31 crc kubenswrapper[4786]: I0313 16:30:31.835995 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.835976636 podStartE2EDuration="37.835976636s" podCreationTimestamp="2026-03-13 16:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:30:31.83370197 +0000 UTC m=+5261.996913781" watchObservedRunningTime="2026-03-13 16:30:31.835976636 +0000 UTC m=+5261.999188447" Mar 13 16:30:46 crc kubenswrapper[4786]: I0313 16:30:46.315272 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 16:30:46 crc kubenswrapper[4786]: I0313 16:30:46.881049 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.170524 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-bsvld"] Mar 13 16:30:55 crc kubenswrapper[4786]: E0313 16:30:55.171339 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3935245-a5fb-49b8-8b34-d474d92ba6fe" containerName="mariadb-account-create-update" Mar 13 16:30:55 crc 
kubenswrapper[4786]: I0313 16:30:55.171358 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3935245-a5fb-49b8-8b34-d474d92ba6fe" containerName="mariadb-account-create-update" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.171638 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3935245-a5fb-49b8-8b34-d474d92ba6fe" containerName="mariadb-account-create-update" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.172555 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.194489 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-bsvld"] Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.277142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-config\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.277304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-dns-svc\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.277369 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbhmm\" (UniqueName: \"kubernetes.io/projected/d61bb448-910e-4e1c-a869-c053109566fd-kube-api-access-tbhmm\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: 
I0313 16:30:55.379309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-config\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.379476 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-dns-svc\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.379537 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbhmm\" (UniqueName: \"kubernetes.io/projected/d61bb448-910e-4e1c-a869-c053109566fd-kube-api-access-tbhmm\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.381547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-config\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.381547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-dns-svc\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.407470 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tbhmm\" (UniqueName: \"kubernetes.io/projected/d61bb448-910e-4e1c-a869-c053109566fd-kube-api-access-tbhmm\") pod \"dnsmasq-dns-684c864bc9-bsvld\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.494407 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:55 crc kubenswrapper[4786]: I0313 16:30:55.980174 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-bsvld"] Mar 13 16:30:56 crc kubenswrapper[4786]: I0313 16:30:56.447344 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 16:30:56 crc kubenswrapper[4786]: I0313 16:30:56.993462 4786 generic.go:334] "Generic (PLEG): container finished" podID="d61bb448-910e-4e1c-a869-c053109566fd" containerID="c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d" exitCode=0 Mar 13 16:30:56 crc kubenswrapper[4786]: I0313 16:30:56.993525 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" event={"ID":"d61bb448-910e-4e1c-a869-c053109566fd","Type":"ContainerDied","Data":"c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d"} Mar 13 16:30:56 crc kubenswrapper[4786]: I0313 16:30:56.993565 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" event={"ID":"d61bb448-910e-4e1c-a869-c053109566fd","Type":"ContainerStarted","Data":"99602dfef1c06d537dc47312731993359f4d64f0b49f0d7c36ebad702ccd925e"} Mar 13 16:30:57 crc kubenswrapper[4786]: I0313 16:30:57.384150 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:30:58 crc kubenswrapper[4786]: I0313 16:30:58.003786 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" 
event={"ID":"d61bb448-910e-4e1c-a869-c053109566fd","Type":"ContainerStarted","Data":"a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376"} Mar 13 16:30:58 crc kubenswrapper[4786]: I0313 16:30:58.004155 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:30:58 crc kubenswrapper[4786]: I0313 16:30:58.027252 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" podStartSLOduration=3.027234602 podStartE2EDuration="3.027234602s" podCreationTimestamp="2026-03-13 16:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:30:58.024589567 +0000 UTC m=+5288.187801388" watchObservedRunningTime="2026-03-13 16:30:58.027234602 +0000 UTC m=+5288.190446423" Mar 13 16:31:00 crc kubenswrapper[4786]: I0313 16:31:00.157513 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="rabbitmq" containerID="cri-o://e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d" gracePeriod=604797 Mar 13 16:31:01 crc kubenswrapper[4786]: I0313 16:31:01.306067 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="rabbitmq" containerID="cri-o://d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08" gracePeriod=604797 Mar 13 16:31:05 crc kubenswrapper[4786]: I0313 16:31:05.496105 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:31:05 crc kubenswrapper[4786]: I0313 16:31:05.562895 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-wglkz"] Mar 13 16:31:05 crc kubenswrapper[4786]: I0313 
16:31:05.563939 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerName="dnsmasq-dns" containerID="cri-o://624ab215ed4598c9fc43a1198be2f3293fffa58a2510f13ae0715d6741e2e414" gracePeriod=10 Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.086399 4786 generic.go:334] "Generic (PLEG): container finished" podID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerID="624ab215ed4598c9fc43a1198be2f3293fffa58a2510f13ae0715d6741e2e414" exitCode=0 Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.086457 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" event={"ID":"9906cf23-b6ae-4b47-a6f2-011885c9954f","Type":"ContainerDied","Data":"624ab215ed4598c9fc43a1198be2f3293fffa58a2510f13ae0715d6741e2e414"} Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.086483 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" event={"ID":"9906cf23-b6ae-4b47-a6f2-011885c9954f","Type":"ContainerDied","Data":"bd7c138a0620552b34b221a15ed74ba8ad910c58a4a068a1fcd43f41affb887a"} Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.086495 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7c138a0620552b34b221a15ed74ba8ad910c58a4a068a1fcd43f41affb887a" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.140039 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.159912 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-dns-svc\") pod \"9906cf23-b6ae-4b47-a6f2-011885c9954f\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.160240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-config\") pod \"9906cf23-b6ae-4b47-a6f2-011885c9954f\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.160429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqnkk\" (UniqueName: \"kubernetes.io/projected/9906cf23-b6ae-4b47-a6f2-011885c9954f-kube-api-access-nqnkk\") pod \"9906cf23-b6ae-4b47-a6f2-011885c9954f\" (UID: \"9906cf23-b6ae-4b47-a6f2-011885c9954f\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.165784 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9906cf23-b6ae-4b47-a6f2-011885c9954f-kube-api-access-nqnkk" (OuterVolumeSpecName: "kube-api-access-nqnkk") pod "9906cf23-b6ae-4b47-a6f2-011885c9954f" (UID: "9906cf23-b6ae-4b47-a6f2-011885c9954f"). InnerVolumeSpecName "kube-api-access-nqnkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.195478 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9906cf23-b6ae-4b47-a6f2-011885c9954f" (UID: "9906cf23-b6ae-4b47-a6f2-011885c9954f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.212027 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-config" (OuterVolumeSpecName: "config") pod "9906cf23-b6ae-4b47-a6f2-011885c9954f" (UID: "9906cf23-b6ae-4b47-a6f2-011885c9954f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.262073 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqnkk\" (UniqueName: \"kubernetes.io/projected/9906cf23-b6ae-4b47-a6f2-011885c9954f-kube-api-access-nqnkk\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.262115 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.262124 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9906cf23-b6ae-4b47-a6f2-011885c9954f-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.311301 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.29:5671: connect: connection refused" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.675571 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.768751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-erlang-cookie\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.768901 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.768923 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-config-data\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.768968 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-plugins-conf\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.768987 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-tls\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769013 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-plugins\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769036 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a594219d-8f84-4426-83e2-c73fd8bcace7-pod-info\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769062 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a594219d-8f84-4426-83e2-c73fd8bcace7-erlang-cookie-secret\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769090 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-confd\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-server-conf\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d96vw\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-kube-api-access-d96vw\") pod \"a594219d-8f84-4426-83e2-c73fd8bcace7\" (UID: \"a594219d-8f84-4426-83e2-c73fd8bcace7\") " Mar 13 
16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.769970 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.772992 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a594219d-8f84-4426-83e2-c73fd8bcace7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.776086 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.776517 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.777092 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-kube-api-access-d96vw" (OuterVolumeSpecName: "kube-api-access-d96vw") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "kube-api-access-d96vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.777912 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a594219d-8f84-4426-83e2-c73fd8bcace7-pod-info" (OuterVolumeSpecName: "pod-info") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.800932 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d" (OuterVolumeSpecName: "persistence") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.801077 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-config-data" (OuterVolumeSpecName: "config-data") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.825516 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-server-conf" (OuterVolumeSpecName: "server-conf") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.870975 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871017 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871030 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871043 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a594219d-8f84-4426-83e2-c73fd8bcace7-pod-info\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871054 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a594219d-8f84-4426-83e2-c73fd8bcace7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871065 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-server-conf\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871085 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d96vw\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-kube-api-access-d96vw\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871098 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871140 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") on node \"crc\" "
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.871152 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a594219d-8f84-4426-83e2-c73fd8bcace7-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.879481 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.30:5671: connect: connection refused"
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.886814 4786 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.886994 4786 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d") on node "crc"
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.889115 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a594219d-8f84-4426-83e2-c73fd8bcace7" (UID: "a594219d-8f84-4426-83e2-c73fd8bcace7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.971935 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a594219d-8f84-4426-83e2-c73fd8bcace7-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:06 crc kubenswrapper[4786]: I0313 16:31:06.971980 4786 reconciler_common.go:293] "Volume detached for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") on node \"crc\" DevicePath \"\""
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.098325 4786 generic.go:334] "Generic (PLEG): container finished" podID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerID="e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d" exitCode=0
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.098419 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a594219d-8f84-4426-83e2-c73fd8bcace7","Type":"ContainerDied","Data":"e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d"}
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.098489 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c95686bd5-wglkz"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.098505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a594219d-8f84-4426-83e2-c73fd8bcace7","Type":"ContainerDied","Data":"d228ad8e7ce98e321cae1c878f9f03726b042a18810ad62d1bccc672799a3a01"}
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.098549 4786 scope.go:117] "RemoveContainer" containerID="e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.099108 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.127452 4786 scope.go:117] "RemoveContainer" containerID="2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.133927 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-wglkz"]
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.150118 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c95686bd5-wglkz"]
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.156876 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.162758 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.169977 4786 scope.go:117] "RemoveContainer" containerID="e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d"
Mar 13 16:31:07 crc kubenswrapper[4786]: E0313 16:31:07.170562 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d\": container with ID starting with e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d not found: ID does not exist" containerID="e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.170727 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d"} err="failed to get container status \"e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d\": rpc error: code = NotFound desc = could not find container \"e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d\": container with ID starting with e6189c58f1dbaf76339588081828edf0c9e0ef6afbc0fdfcfecbab14c1fbe55d not found: ID does not exist"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.170959 4786 scope.go:117] "RemoveContainer" containerID="2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e"
Mar 13 16:31:07 crc kubenswrapper[4786]: E0313 16:31:07.171505 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e\": container with ID starting with 2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e not found: ID does not exist" containerID="2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.171578 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e"} err="failed to get container status \"2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e\": rpc error: code = NotFound desc = could not find container \"2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e\": container with ID starting with 2db912fc19ad92c843be426092cb1cd132196cebc253163ef8074d31ed56ef5e not found: ID does not exist"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.199602 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 16:31:07 crc kubenswrapper[4786]: E0313 16:31:07.200071 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="setup-container"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.200095 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="setup-container"
Mar 13 16:31:07 crc kubenswrapper[4786]: E0313 16:31:07.200124 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerName="init"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.200134 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerName="init"
Mar 13 16:31:07 crc kubenswrapper[4786]: E0313 16:31:07.200143 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="rabbitmq"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.200151 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="rabbitmq"
Mar 13 16:31:07 crc kubenswrapper[4786]: E0313 16:31:07.200165 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerName="dnsmasq-dns"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.200174 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerName="dnsmasq-dns"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.200345 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" containerName="dnsmasq-dns"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.200371 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" containerName="rabbitmq"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.201378 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.204577 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.204814 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.205027 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-v9ps9"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.205177 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.205382 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.205457 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.207165 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.210292 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380097 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380174 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380227 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgzxm\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-kube-api-access-xgzxm\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380285 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380308 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380378 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380411 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-config-data\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380560 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.380647 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.482691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.482777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.482825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-config-data\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.482901 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.482968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.483015 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.483074 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.483132 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgzxm\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-kube-api-access-xgzxm\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.483163 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.483217 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.483249 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.484296 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.484694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-config-data\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.485013 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.485026 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.486030 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.487038 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.487074 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/19750967caecaff8646851bef3ed6448fecfa64d79db9ee79d940f03448699a2/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.487640 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.487662 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.487975 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.488513 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.500919 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgzxm\" (UniqueName: \"kubernetes.io/projected/e73a6e9a-15e8-47a1-9818-086aa1a8e60e-kube-api-access-xgzxm\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.515709 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cc2b6c1-5c70-4981-b91c-e0916660515d\") pod \"rabbitmq-server-0\" (UID: \"e73a6e9a-15e8-47a1-9818-086aa1a8e60e\") " pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.525282 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.868153 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.868400 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.936445 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998155 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0de3055a-6888-4e3a-9863-10f758c5e71a-pod-info\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998222 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-plugins\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998248 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-erlang-cookie\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-tls\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998322 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0de3055a-6888-4e3a-9863-10f758c5e71a-erlang-cookie-secret\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998353 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-config-data\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998543 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998593 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-server-conf\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998646 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls6d5\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-kube-api-access-ls6d5\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998688 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-confd\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.998728 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-plugins-conf\") pod \"0de3055a-6888-4e3a-9863-10f758c5e71a\" (UID: \"0de3055a-6888-4e3a-9863-10f758c5e71a\") "
Mar 13 16:31:07 crc kubenswrapper[4786]: I0313 16:31:07.999505 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:07.999909 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.005470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.012173 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.013999 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-kube-api-access-ls6d5" (OuterVolumeSpecName: "kube-api-access-ls6d5") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "kube-api-access-ls6d5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.014066 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0de3055a-6888-4e3a-9863-10f758c5e71a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.024028 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a" (OuterVolumeSpecName: "persistence") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.033331 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0de3055a-6888-4e3a-9863-10f758c5e71a-pod-info" (OuterVolumeSpecName: "pod-info") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.034666 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-config-data" (OuterVolumeSpecName: "config-data") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:31:08 crc kubenswrapper[4786]: W0313 16:31:08.077694 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode73a6e9a_15e8_47a1_9818_086aa1a8e60e.slice/crio-34da94f00f8d366381f47dac1db333ea5a8dc52d2b7af7d06eed65af954e25e3 WatchSource:0}: Error finding container 34da94f00f8d366381f47dac1db333ea5a8dc52d2b7af7d06eed65af954e25e3: Status 404 returned error can't find the container with id 34da94f00f8d366381f47dac1db333ea5a8dc52d2b7af7d06eed65af954e25e3 Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.077820 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.099544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-server-conf" (OuterVolumeSpecName: "server-conf") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100201 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100221 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0de3055a-6888-4e3a-9863-10f758c5e71a-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100234 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100246 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100259 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100270 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0de3055a-6888-4e3a-9863-10f758c5e71a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100281 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 
16:31:08.100331 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") on node \"crc\" " Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100345 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0de3055a-6888-4e3a-9863-10f758c5e71a-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.100357 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls6d5\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-kube-api-access-ls6d5\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.109017 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0de3055a-6888-4e3a-9863-10f758c5e71a" (UID: "0de3055a-6888-4e3a-9863-10f758c5e71a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.109534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e73a6e9a-15e8-47a1-9818-086aa1a8e60e","Type":"ContainerStarted","Data":"34da94f00f8d366381f47dac1db333ea5a8dc52d2b7af7d06eed65af954e25e3"} Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.113098 4786 generic.go:334] "Generic (PLEG): container finished" podID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerID="d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08" exitCode=0 Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.113148 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.113174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0de3055a-6888-4e3a-9863-10f758c5e71a","Type":"ContainerDied","Data":"d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08"} Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.113200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0de3055a-6888-4e3a-9863-10f758c5e71a","Type":"ContainerDied","Data":"1130096510e7844c756167c7b3c65d321813bd3bcd1d90f953312ffa3fef6d3b"} Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.113221 4786 scope.go:117] "RemoveContainer" containerID="d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.135373 4786 scope.go:117] "RemoveContainer" containerID="d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.135761 4786 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.136003 4786 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a") on node "crc" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.153487 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.161067 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.195322 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:31:08 crc kubenswrapper[4786]: E0313 16:31:08.195738 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="setup-container" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.195761 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="setup-container" Mar 13 16:31:08 crc kubenswrapper[4786]: E0313 16:31:08.195788 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="rabbitmq" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.195796 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="rabbitmq" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.195998 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" containerName="rabbitmq" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.197029 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.197244 4786 scope.go:117] "RemoveContainer" containerID="d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08" Mar 13 16:31:08 crc kubenswrapper[4786]: E0313 16:31:08.197916 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08\": container with ID starting with d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08 not found: ID does not exist" containerID="d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.197952 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08"} err="failed to get container status \"d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08\": rpc error: code = NotFound desc = could not find container \"d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08\": container with ID starting with d401be7026e9439221faf194fd6a97475c9349e6fb8c4fd9f530396189155c08 not found: ID does not exist" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.197974 4786 scope.go:117] "RemoveContainer" containerID="d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6" Mar 13 16:31:08 crc kubenswrapper[4786]: E0313 16:31:08.198495 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6\": container with ID starting with d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6 not found: ID does not exist" containerID="d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 
16:31:08.198528 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6"} err="failed to get container status \"d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6\": rpc error: code = NotFound desc = could not find container \"d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6\": container with ID starting with d2c62f3aebf7cdcc2ed43068f17cc7fb6df92fb0c448269a6938295ea43a7ea6 not found: ID does not exist" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.201136 4786 reconciler_common.go:293] "Volume detached for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.201164 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0de3055a-6888-4e3a-9863-10f758c5e71a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.201281 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.203153 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.203214 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.203222 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.203278 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:31:08 crc 
kubenswrapper[4786]: I0313 16:31:08.203153 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.203410 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d8khs" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.203420 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.302335 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.302793 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.302834 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wsxj\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-kube-api-access-2wsxj\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.302949 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868904a0-2393-4ad2-93c1-ca840552abd8-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.302990 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868904a0-2393-4ad2-93c1-ca840552abd8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.303012 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.303052 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.303075 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.303102 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.303164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.303195 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.403746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wsxj\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-kube-api-access-2wsxj\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.403819 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868904a0-2393-4ad2-93c1-ca840552abd8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.403955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868904a0-2393-4ad2-93c1-ca840552abd8-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.403991 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404080 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404120 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404181 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404221 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404283 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404341 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.404739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.405563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc 
kubenswrapper[4786]: I0313 16:31:08.405706 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.406073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.406190 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868904a0-2393-4ad2-93c1-ca840552abd8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.407659 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868904a0-2393-4ad2-93c1-ca840552abd8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.408198 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.408227 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b95abe7429486fe8f4c3539aad0cb48cc6c6a3ebabd5c537e177b1d54a9deebf/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.409528 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868904a0-2393-4ad2-93c1-ca840552abd8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.409550 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.411276 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.424561 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wsxj\" (UniqueName: \"kubernetes.io/projected/868904a0-2393-4ad2-93c1-ca840552abd8-kube-api-access-2wsxj\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.434328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9c61e36-148b-4f71-ba9b-4d5eaafdfa9a\") pod \"rabbitmq-cell1-server-0\" (UID: \"868904a0-2393-4ad2-93c1-ca840552abd8\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.525398 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.564692 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0de3055a-6888-4e3a-9863-10f758c5e71a" path="/var/lib/kubelet/pods/0de3055a-6888-4e3a-9863-10f758c5e71a/volumes" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.566286 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9906cf23-b6ae-4b47-a6f2-011885c9954f" path="/var/lib/kubelet/pods/9906cf23-b6ae-4b47-a6f2-011885c9954f/volumes" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.569675 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a594219d-8f84-4426-83e2-c73fd8bcace7" path="/var/lib/kubelet/pods/a594219d-8f84-4426-83e2-c73fd8bcace7/volumes" Mar 13 16:31:08 crc kubenswrapper[4786]: I0313 16:31:08.763461 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 16:31:09 crc kubenswrapper[4786]: I0313 16:31:09.125146 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868904a0-2393-4ad2-93c1-ca840552abd8","Type":"ContainerStarted","Data":"32ded5696fd6d8ece5d391b3b4584f3c28e9b41c76207fa2fe6691505676cbe7"} Mar 13 16:31:10 crc kubenswrapper[4786]: I0313 16:31:10.141053 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868904a0-2393-4ad2-93c1-ca840552abd8","Type":"ContainerStarted","Data":"4cad70883ec30d7117dc404cc993b6e260a32fe3543ccb17d976f5885614d2fc"} Mar 13 16:31:10 crc kubenswrapper[4786]: I0313 16:31:10.144185 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e73a6e9a-15e8-47a1-9818-086aa1a8e60e","Type":"ContainerStarted","Data":"32b4002c7e6ad47aaf1cfa11314de6b78cd4e0e399e5a5f815cafae4af32980d"} Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.117714 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wv48g"] Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.119745 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.124095 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv48g"] Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.285756 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkw5z\" (UniqueName: \"kubernetes.io/projected/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-kube-api-access-dkw5z\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.285918 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-catalog-content\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.285978 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-utilities\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.387347 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkw5z\" (UniqueName: \"kubernetes.io/projected/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-kube-api-access-dkw5z\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.387759 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-catalog-content\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.387930 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-utilities\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.388443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-catalog-content\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.388620 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-utilities\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.410109 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkw5z\" (UniqueName: \"kubernetes.io/projected/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-kube-api-access-dkw5z\") pod \"redhat-marketplace-wv48g\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.498273 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.735841 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv48g"] Mar 13 16:31:37 crc kubenswrapper[4786]: W0313 16:31:37.740807 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71be2c8a_4a88_4aca_b01b_4ccdf85c8ee0.slice/crio-79800f569eb93781d5d140ccf2649db6bdea3aaeacab095ea44049a68684d285 WatchSource:0}: Error finding container 79800f569eb93781d5d140ccf2649db6bdea3aaeacab095ea44049a68684d285: Status 404 returned error can't find the container with id 79800f569eb93781d5d140ccf2649db6bdea3aaeacab095ea44049a68684d285 Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.868974 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:31:37 crc kubenswrapper[4786]: I0313 16:31:37.869041 4786 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:31:38 crc kubenswrapper[4786]: I0313 16:31:38.386653 4786 generic.go:334] "Generic (PLEG): container finished" podID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerID="4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499" exitCode=0 Mar 13 16:31:38 crc kubenswrapper[4786]: I0313 16:31:38.386784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv48g" event={"ID":"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0","Type":"ContainerDied","Data":"4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499"} Mar 13 16:31:38 crc kubenswrapper[4786]: I0313 16:31:38.387119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv48g" event={"ID":"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0","Type":"ContainerStarted","Data":"79800f569eb93781d5d140ccf2649db6bdea3aaeacab095ea44049a68684d285"} Mar 13 16:31:40 crc kubenswrapper[4786]: I0313 16:31:40.409834 4786 generic.go:334] "Generic (PLEG): container finished" podID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerID="3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228" exitCode=0 Mar 13 16:31:40 crc kubenswrapper[4786]: I0313 16:31:40.409899 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv48g" event={"ID":"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0","Type":"ContainerDied","Data":"3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228"} Mar 13 16:31:40 crc kubenswrapper[4786]: I0313 16:31:40.892214 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7hff"] Mar 13 16:31:40 crc kubenswrapper[4786]: 
I0313 16:31:40.894911 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:40 crc kubenswrapper[4786]: I0313 16:31:40.909359 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7hff"] Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.124985 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jkq8\" (UniqueName: \"kubernetes.io/projected/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-kube-api-access-2jkq8\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.125397 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-utilities\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.125576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-catalog-content\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.226987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-catalog-content\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc 
kubenswrapper[4786]: I0313 16:31:41.227284 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jkq8\" (UniqueName: \"kubernetes.io/projected/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-kube-api-access-2jkq8\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.227375 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-utilities\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.227585 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-catalog-content\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.227763 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-utilities\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.256172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jkq8\" (UniqueName: \"kubernetes.io/projected/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-kube-api-access-2jkq8\") pod \"certified-operators-f7hff\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 
16:31:41.419920 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv48g" event={"ID":"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0","Type":"ContainerStarted","Data":"ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de"} Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.453282 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wv48g" podStartSLOduration=1.895255342 podStartE2EDuration="4.453259349s" podCreationTimestamp="2026-03-13 16:31:37 +0000 UTC" firstStartedPulling="2026-03-13 16:31:38.389394416 +0000 UTC m=+5328.552606267" lastFinishedPulling="2026-03-13 16:31:40.947398463 +0000 UTC m=+5331.110610274" observedRunningTime="2026-03-13 16:31:41.442048443 +0000 UTC m=+5331.605260274" watchObservedRunningTime="2026-03-13 16:31:41.453259349 +0000 UTC m=+5331.616471180" Mar 13 16:31:41 crc kubenswrapper[4786]: I0313 16:31:41.532370 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.063408 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7hff"] Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.427309 4786 generic.go:334] "Generic (PLEG): container finished" podID="e73a6e9a-15e8-47a1-9818-086aa1a8e60e" containerID="32b4002c7e6ad47aaf1cfa11314de6b78cd4e0e399e5a5f815cafae4af32980d" exitCode=0 Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.427376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e73a6e9a-15e8-47a1-9818-086aa1a8e60e","Type":"ContainerDied","Data":"32b4002c7e6ad47aaf1cfa11314de6b78cd4e0e399e5a5f815cafae4af32980d"} Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.434120 4786 generic.go:334] "Generic (PLEG): container finished" podID="868904a0-2393-4ad2-93c1-ca840552abd8" containerID="4cad70883ec30d7117dc404cc993b6e260a32fe3543ccb17d976f5885614d2fc" exitCode=0 Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.434173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868904a0-2393-4ad2-93c1-ca840552abd8","Type":"ContainerDied","Data":"4cad70883ec30d7117dc404cc993b6e260a32fe3543ccb17d976f5885614d2fc"} Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.439574 4786 generic.go:334] "Generic (PLEG): container finished" podID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerID="8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4" exitCode=0 Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 16:31:42.439994 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7hff" event={"ID":"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f","Type":"ContainerDied","Data":"8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4"} Mar 13 16:31:42 crc kubenswrapper[4786]: I0313 
16:31:42.440224 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7hff" event={"ID":"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f","Type":"ContainerStarted","Data":"3ccec4010598df7318e912ca0fc0c713fe35263452e8f82951f34b1e1d47d3a8"} Mar 13 16:31:42 crc kubenswrapper[4786]: E0313 16:31:42.558512 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868904a0_2393_4ad2_93c1_ca840552abd8.slice/crio-4cad70883ec30d7117dc404cc993b6e260a32fe3543ccb17d976f5885614d2fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868904a0_2393_4ad2_93c1_ca840552abd8.slice/crio-conmon-4cad70883ec30d7117dc404cc993b6e260a32fe3543ccb17d976f5885614d2fc.scope\": RecentStats: unable to find data in memory cache]" Mar 13 16:31:43 crc kubenswrapper[4786]: I0313 16:31:43.451225 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868904a0-2393-4ad2-93c1-ca840552abd8","Type":"ContainerStarted","Data":"d039dc568cae6e74c7cdf626d6d030acccf36213c361c86abbcd1ee56a59d48a"} Mar 13 16:31:43 crc kubenswrapper[4786]: I0313 16:31:43.451775 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:31:43 crc kubenswrapper[4786]: I0313 16:31:43.455139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e73a6e9a-15e8-47a1-9818-086aa1a8e60e","Type":"ContainerStarted","Data":"48e6baa4313499c016e38e5b264ffdeb3534308bceb6e76e960062d690a9880b"} Mar 13 16:31:43 crc kubenswrapper[4786]: I0313 16:31:43.455845 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 16:31:43 crc kubenswrapper[4786]: I0313 16:31:43.480982 4786 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.480965809 podStartE2EDuration="35.480965809s" podCreationTimestamp="2026-03-13 16:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:31:43.473266449 +0000 UTC m=+5333.636478250" watchObservedRunningTime="2026-03-13 16:31:43.480965809 +0000 UTC m=+5333.644177620" Mar 13 16:31:43 crc kubenswrapper[4786]: I0313 16:31:43.511641 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.511625644 podStartE2EDuration="36.511625644s" podCreationTimestamp="2026-03-13 16:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:31:43.5065934 +0000 UTC m=+5333.669805211" watchObservedRunningTime="2026-03-13 16:31:43.511625644 +0000 UTC m=+5333.674837455" Mar 13 16:31:44 crc kubenswrapper[4786]: I0313 16:31:44.464645 4786 generic.go:334] "Generic (PLEG): container finished" podID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerID="b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda" exitCode=0 Mar 13 16:31:44 crc kubenswrapper[4786]: I0313 16:31:44.464709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7hff" event={"ID":"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f","Type":"ContainerDied","Data":"b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda"} Mar 13 16:31:45 crc kubenswrapper[4786]: I0313 16:31:45.475650 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7hff" event={"ID":"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f","Type":"ContainerStarted","Data":"d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6"} Mar 13 16:31:47 crc kubenswrapper[4786]: I0313 16:31:47.499354 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:47 crc kubenswrapper[4786]: I0313 16:31:47.499764 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:47 crc kubenswrapper[4786]: I0313 16:31:47.557799 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:47 crc kubenswrapper[4786]: I0313 16:31:47.591930 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f7hff" podStartSLOduration=5.041171967 podStartE2EDuration="7.591862983s" podCreationTimestamp="2026-03-13 16:31:40 +0000 UTC" firstStartedPulling="2026-03-13 16:31:42.442466767 +0000 UTC m=+5332.605678578" lastFinishedPulling="2026-03-13 16:31:44.993157753 +0000 UTC m=+5335.156369594" observedRunningTime="2026-03-13 16:31:45.515560947 +0000 UTC m=+5335.678772758" watchObservedRunningTime="2026-03-13 16:31:47.591862983 +0000 UTC m=+5337.755074804" Mar 13 16:31:48 crc kubenswrapper[4786]: I0313 16:31:48.549508 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:48 crc kubenswrapper[4786]: I0313 16:31:48.670738 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv48g"] Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.511932 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wv48g" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="registry-server" containerID="cri-o://ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de" gracePeriod=2 Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.898124 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.971490 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-utilities\") pod \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.971639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkw5z\" (UniqueName: \"kubernetes.io/projected/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-kube-api-access-dkw5z\") pod \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.971666 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-catalog-content\") pod \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\" (UID: \"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0\") " Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.973596 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-utilities" (OuterVolumeSpecName: "utilities") pod "71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" (UID: "71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:31:50 crc kubenswrapper[4786]: I0313 16:31:50.978296 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-kube-api-access-dkw5z" (OuterVolumeSpecName: "kube-api-access-dkw5z") pod "71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" (UID: "71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0"). InnerVolumeSpecName "kube-api-access-dkw5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.001127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" (UID: "71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.073116 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.073155 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkw5z\" (UniqueName: \"kubernetes.io/projected/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-kube-api-access-dkw5z\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.073166 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.524438 4786 generic.go:334] "Generic (PLEG): container finished" podID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerID="ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de" exitCode=0 Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.524541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv48g" event={"ID":"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0","Type":"ContainerDied","Data":"ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de"} Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.524578 4786 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv48g" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.524703 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv48g" event={"ID":"71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0","Type":"ContainerDied","Data":"79800f569eb93781d5d140ccf2649db6bdea3aaeacab095ea44049a68684d285"} Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.524719 4786 scope.go:117] "RemoveContainer" containerID="ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.533545 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.533597 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.567621 4786 scope.go:117] "RemoveContainer" containerID="3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.572823 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv48g"] Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.580201 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv48g"] Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.580390 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.611802 4786 scope.go:117] "RemoveContainer" containerID="4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.633003 4786 scope.go:117] "RemoveContainer" 
containerID="ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de" Mar 13 16:31:51 crc kubenswrapper[4786]: E0313 16:31:51.633583 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de\": container with ID starting with ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de not found: ID does not exist" containerID="ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.633619 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de"} err="failed to get container status \"ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de\": rpc error: code = NotFound desc = could not find container \"ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de\": container with ID starting with ca9582ead8bcb8028a4520c35a6f92185d197a2cdde8a229606cd09505bd98de not found: ID does not exist" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.633643 4786 scope.go:117] "RemoveContainer" containerID="3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228" Mar 13 16:31:51 crc kubenswrapper[4786]: E0313 16:31:51.634367 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228\": container with ID starting with 3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228 not found: ID does not exist" containerID="3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.634418 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228"} err="failed to get container status \"3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228\": rpc error: code = NotFound desc = could not find container \"3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228\": container with ID starting with 3cafb7fea1b6c3bbfbcbf169e3196fc7c04713878cc4e6276a9e4e27bb35a228 not found: ID does not exist" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.634452 4786 scope.go:117] "RemoveContainer" containerID="4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499" Mar 13 16:31:51 crc kubenswrapper[4786]: E0313 16:31:51.634897 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499\": container with ID starting with 4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499 not found: ID does not exist" containerID="4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499" Mar 13 16:31:51 crc kubenswrapper[4786]: I0313 16:31:51.634954 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499"} err="failed to get container status \"4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499\": rpc error: code = NotFound desc = could not find container \"4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499\": container with ID starting with 4c251c383f58ce69c01db84ce1ac01d60dcf045de3ff62a58bceefe1a0723499 not found: ID does not exist" Mar 13 16:31:52 crc kubenswrapper[4786]: I0313 16:31:52.564181 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" path="/var/lib/kubelet/pods/71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0/volumes" Mar 13 16:31:52 crc kubenswrapper[4786]: I0313 
16:31:52.612775 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:53 crc kubenswrapper[4786]: I0313 16:31:53.877182 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7hff"] Mar 13 16:31:54 crc kubenswrapper[4786]: I0313 16:31:54.558986 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7hff" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="registry-server" containerID="cri-o://d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6" gracePeriod=2 Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.548306 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.574272 4786 generic.go:334] "Generic (PLEG): container finished" podID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerID="d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6" exitCode=0 Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.574318 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7hff" event={"ID":"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f","Type":"ContainerDied","Data":"d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6"} Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.574349 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7hff" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.574376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7hff" event={"ID":"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f","Type":"ContainerDied","Data":"3ccec4010598df7318e912ca0fc0c713fe35263452e8f82951f34b1e1d47d3a8"} Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.574395 4786 scope.go:117] "RemoveContainer" containerID="d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.594566 4786 scope.go:117] "RemoveContainer" containerID="b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.611295 4786 scope.go:117] "RemoveContainer" containerID="8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.638270 4786 scope.go:117] "RemoveContainer" containerID="d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6" Mar 13 16:31:55 crc kubenswrapper[4786]: E0313 16:31:55.638791 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6\": container with ID starting with d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6 not found: ID does not exist" containerID="d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.638916 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6"} err="failed to get container status \"d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6\": rpc error: code = NotFound desc = could not find container 
\"d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6\": container with ID starting with d9b0cc7e99040712544d06fc6bd359c5a1b55af6e4aa16744d9dd849044a36a6 not found: ID does not exist" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.638984 4786 scope.go:117] "RemoveContainer" containerID="b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda" Mar 13 16:31:55 crc kubenswrapper[4786]: E0313 16:31:55.639488 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda\": container with ID starting with b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda not found: ID does not exist" containerID="b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.639536 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda"} err="failed to get container status \"b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda\": rpc error: code = NotFound desc = could not find container \"b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda\": container with ID starting with b987956c8ae27fec8c1cfc3dc5eb616f3ee5149c419e9deedec9b137f3e45cda not found: ID does not exist" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.639570 4786 scope.go:117] "RemoveContainer" containerID="8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4" Mar 13 16:31:55 crc kubenswrapper[4786]: E0313 16:31:55.639912 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4\": container with ID starting with 8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4 not found: ID does not exist" 
containerID="8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.639954 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4"} err="failed to get container status \"8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4\": rpc error: code = NotFound desc = could not find container \"8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4\": container with ID starting with 8d85e50ec4daa8e8140f680bb64aa615e673d641702caf100262878deb01d1b4 not found: ID does not exist" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.644650 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-catalog-content\") pod \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.645040 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jkq8\" (UniqueName: \"kubernetes.io/projected/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-kube-api-access-2jkq8\") pod \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.645110 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-utilities\") pod \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\" (UID: \"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f\") " Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.646261 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-utilities" (OuterVolumeSpecName: "utilities") pod 
"a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" (UID: "a94061f3-5864-4c84-a1d9-bd1d1dd5d37f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.650580 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.652508 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-kube-api-access-2jkq8" (OuterVolumeSpecName: "kube-api-access-2jkq8") pod "a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" (UID: "a94061f3-5864-4c84-a1d9-bd1d1dd5d37f"). InnerVolumeSpecName "kube-api-access-2jkq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.702253 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" (UID: "a94061f3-5864-4c84-a1d9-bd1d1dd5d37f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.752412 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.752460 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jkq8\" (UniqueName: \"kubernetes.io/projected/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f-kube-api-access-2jkq8\") on node \"crc\" DevicePath \"\"" Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.910834 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7hff"] Mar 13 16:31:55 crc kubenswrapper[4786]: I0313 16:31:55.915616 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7hff"] Mar 13 16:31:56 crc kubenswrapper[4786]: I0313 16:31:56.569629 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" path="/var/lib/kubelet/pods/a94061f3-5864-4c84-a1d9-bd1d1dd5d37f/volumes" Mar 13 16:31:57 crc kubenswrapper[4786]: I0313 16:31:57.529139 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 16:31:58 crc kubenswrapper[4786]: I0313 16:31:58.530134 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.145042 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556992-mjb92"] Mar 13 16:32:00 crc kubenswrapper[4786]: E0313 16:32:00.145974 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="extract-content" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146002 4786 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="extract-content" Mar 13 16:32:00 crc kubenswrapper[4786]: E0313 16:32:00.146041 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="registry-server" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146054 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="registry-server" Mar 13 16:32:00 crc kubenswrapper[4786]: E0313 16:32:00.146069 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="extract-content" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146082 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="extract-content" Mar 13 16:32:00 crc kubenswrapper[4786]: E0313 16:32:00.146099 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="registry-server" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146110 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="registry-server" Mar 13 16:32:00 crc kubenswrapper[4786]: E0313 16:32:00.146137 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="extract-utilities" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146150 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="extract-utilities" Mar 13 16:32:00 crc kubenswrapper[4786]: E0313 16:32:00.146170 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="extract-utilities" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146184 4786 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="extract-utilities" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146433 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="71be2c8a-4a88-4aca-b01b-4ccdf85c8ee0" containerName="registry-server" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.146456 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94061f3-5864-4c84-a1d9-bd1d1dd5d37f" containerName="registry-server" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.147277 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.151063 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.151881 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.153188 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.160076 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556992-mjb92"] Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.232599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbd8q\" (UniqueName: \"kubernetes.io/projected/f65251d4-c3a2-4aab-8e06-790dfff1ef83-kube-api-access-cbd8q\") pod \"auto-csr-approver-29556992-mjb92\" (UID: \"f65251d4-c3a2-4aab-8e06-790dfff1ef83\") " pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.334766 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cbd8q\" (UniqueName: \"kubernetes.io/projected/f65251d4-c3a2-4aab-8e06-790dfff1ef83-kube-api-access-cbd8q\") pod \"auto-csr-approver-29556992-mjb92\" (UID: \"f65251d4-c3a2-4aab-8e06-790dfff1ef83\") " pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.373563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbd8q\" (UniqueName: \"kubernetes.io/projected/f65251d4-c3a2-4aab-8e06-790dfff1ef83-kube-api-access-cbd8q\") pod \"auto-csr-approver-29556992-mjb92\" (UID: \"f65251d4-c3a2-4aab-8e06-790dfff1ef83\") " pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.465982 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:00 crc kubenswrapper[4786]: I0313 16:32:00.968763 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556992-mjb92"] Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.628222 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556992-mjb92" event={"ID":"f65251d4-c3a2-4aab-8e06-790dfff1ef83","Type":"ContainerStarted","Data":"22964ec9ab01500ce11029bbafd4c49456287035fe46a7b25e80f32a2142d2b2"} Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.675813 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.677154 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.682808 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z8lt4" Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.691985 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.757952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8f6\" (UniqueName: \"kubernetes.io/projected/f04fa1e5-0029-4d35-a92a-a388f0cc35b6-kube-api-access-7p8f6\") pod \"mariadb-client\" (UID: \"f04fa1e5-0029-4d35-a92a-a388f0cc35b6\") " pod="openstack/mariadb-client" Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.859454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8f6\" (UniqueName: \"kubernetes.io/projected/f04fa1e5-0029-4d35-a92a-a388f0cc35b6-kube-api-access-7p8f6\") pod \"mariadb-client\" (UID: \"f04fa1e5-0029-4d35-a92a-a388f0cc35b6\") " pod="openstack/mariadb-client" Mar 13 16:32:01 crc kubenswrapper[4786]: I0313 16:32:01.881307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8f6\" (UniqueName: \"kubernetes.io/projected/f04fa1e5-0029-4d35-a92a-a388f0cc35b6-kube-api-access-7p8f6\") pod \"mariadb-client\" (UID: \"f04fa1e5-0029-4d35-a92a-a388f0cc35b6\") " pod="openstack/mariadb-client" Mar 13 16:32:02 crc kubenswrapper[4786]: I0313 16:32:02.005359 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:32:02 crc kubenswrapper[4786]: I0313 16:32:02.354544 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:32:02 crc kubenswrapper[4786]: W0313 16:32:02.359278 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04fa1e5_0029_4d35_a92a_a388f0cc35b6.slice/crio-d9a98d285dd5be57709ee0488d6463a3d9c69dcb522f9fed9fcfda3ceaf33149 WatchSource:0}: Error finding container d9a98d285dd5be57709ee0488d6463a3d9c69dcb522f9fed9fcfda3ceaf33149: Status 404 returned error can't find the container with id d9a98d285dd5be57709ee0488d6463a3d9c69dcb522f9fed9fcfda3ceaf33149 Mar 13 16:32:02 crc kubenswrapper[4786]: I0313 16:32:02.639583 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f04fa1e5-0029-4d35-a92a-a388f0cc35b6","Type":"ContainerStarted","Data":"d9a98d285dd5be57709ee0488d6463a3d9c69dcb522f9fed9fcfda3ceaf33149"} Mar 13 16:32:02 crc kubenswrapper[4786]: I0313 16:32:02.641502 4786 generic.go:334] "Generic (PLEG): container finished" podID="f65251d4-c3a2-4aab-8e06-790dfff1ef83" containerID="7e484f1a72de0c7f13c8b1e84c50a6bd2f41ce9d30245628de9f99a2d2ee8a37" exitCode=0 Mar 13 16:32:02 crc kubenswrapper[4786]: I0313 16:32:02.641530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556992-mjb92" event={"ID":"f65251d4-c3a2-4aab-8e06-790dfff1ef83","Type":"ContainerDied","Data":"7e484f1a72de0c7f13c8b1e84c50a6bd2f41ce9d30245628de9f99a2d2ee8a37"} Mar 13 16:32:03 crc kubenswrapper[4786]: I0313 16:32:03.653797 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f04fa1e5-0029-4d35-a92a-a388f0cc35b6","Type":"ContainerStarted","Data":"4be8cdd2c87eb87119aa7f99ce5a4563fd4340721efdf0480d71e2d085967b4a"} Mar 13 16:32:03 crc kubenswrapper[4786]: I0313 16:32:03.687095 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.198659271 podStartE2EDuration="2.687068328s" podCreationTimestamp="2026-03-13 16:32:01 +0000 UTC" firstStartedPulling="2026-03-13 16:32:02.361049166 +0000 UTC m=+5352.524260977" lastFinishedPulling="2026-03-13 16:32:02.849458223 +0000 UTC m=+5353.012670034" observedRunningTime="2026-03-13 16:32:03.675220686 +0000 UTC m=+5353.838432497" watchObservedRunningTime="2026-03-13 16:32:03.687068328 +0000 UTC m=+5353.850280139" Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.036509 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.111447 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbd8q\" (UniqueName: \"kubernetes.io/projected/f65251d4-c3a2-4aab-8e06-790dfff1ef83-kube-api-access-cbd8q\") pod \"f65251d4-c3a2-4aab-8e06-790dfff1ef83\" (UID: \"f65251d4-c3a2-4aab-8e06-790dfff1ef83\") " Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.125611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65251d4-c3a2-4aab-8e06-790dfff1ef83-kube-api-access-cbd8q" (OuterVolumeSpecName: "kube-api-access-cbd8q") pod "f65251d4-c3a2-4aab-8e06-790dfff1ef83" (UID: "f65251d4-c3a2-4aab-8e06-790dfff1ef83"). InnerVolumeSpecName "kube-api-access-cbd8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.213469 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbd8q\" (UniqueName: \"kubernetes.io/projected/f65251d4-c3a2-4aab-8e06-790dfff1ef83-kube-api-access-cbd8q\") on node \"crc\" DevicePath \"\"" Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.664359 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556992-mjb92" event={"ID":"f65251d4-c3a2-4aab-8e06-790dfff1ef83","Type":"ContainerDied","Data":"22964ec9ab01500ce11029bbafd4c49456287035fe46a7b25e80f32a2142d2b2"} Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.664937 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22964ec9ab01500ce11029bbafd4c49456287035fe46a7b25e80f32a2142d2b2" Mar 13 16:32:04 crc kubenswrapper[4786]: I0313 16:32:04.664392 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556992-mjb92" Mar 13 16:32:05 crc kubenswrapper[4786]: I0313 16:32:05.111673 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556986-m4gx6"] Mar 13 16:32:05 crc kubenswrapper[4786]: I0313 16:32:05.124197 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556986-m4gx6"] Mar 13 16:32:06 crc kubenswrapper[4786]: I0313 16:32:06.567976 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b49e165-c3c0-4ddc-ae7b-5f731910e1d4" path="/var/lib/kubelet/pods/8b49e165-c3c0-4ddc-ae7b-5f731910e1d4/volumes" Mar 13 16:32:07 crc kubenswrapper[4786]: I0313 16:32:07.868730 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 16:32:07 crc kubenswrapper[4786]: I0313 16:32:07.869122 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:32:07 crc kubenswrapper[4786]: I0313 16:32:07.869174 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:32:07 crc kubenswrapper[4786]: I0313 16:32:07.869942 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:32:07 crc kubenswrapper[4786]: I0313 16:32:07.870021 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" gracePeriod=600 Mar 13 16:32:08 crc kubenswrapper[4786]: E0313 16:32:08.014449 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:32:08 crc kubenswrapper[4786]: 
I0313 16:32:08.707191 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" exitCode=0 Mar 13 16:32:08 crc kubenswrapper[4786]: I0313 16:32:08.707254 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"} Mar 13 16:32:08 crc kubenswrapper[4786]: I0313 16:32:08.707526 4786 scope.go:117] "RemoveContainer" containerID="dc7c313ae26eaf0288c8be9988aeae6dd1a6013e849e3a00ef45160c50254dc2" Mar 13 16:32:08 crc kubenswrapper[4786]: I0313 16:32:08.707886 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:32:08 crc kubenswrapper[4786]: E0313 16:32:08.708141 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:32:13 crc kubenswrapper[4786]: I0313 16:32:13.983333 4786 scope.go:117] "RemoveContainer" containerID="8d0009dc6988b2cd872114ab818d61293fa2515b0ff464035a7a38837f3f5cda" Mar 13 16:32:16 crc kubenswrapper[4786]: I0313 16:32:16.504364 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:32:16 crc kubenswrapper[4786]: I0313 16:32:16.504801 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="f04fa1e5-0029-4d35-a92a-a388f0cc35b6" containerName="mariadb-client" 
containerID="cri-o://4be8cdd2c87eb87119aa7f99ce5a4563fd4340721efdf0480d71e2d085967b4a" gracePeriod=30 Mar 13 16:32:16 crc kubenswrapper[4786]: I0313 16:32:16.786759 4786 generic.go:334] "Generic (PLEG): container finished" podID="f04fa1e5-0029-4d35-a92a-a388f0cc35b6" containerID="4be8cdd2c87eb87119aa7f99ce5a4563fd4340721efdf0480d71e2d085967b4a" exitCode=143 Mar 13 16:32:16 crc kubenswrapper[4786]: I0313 16:32:16.786808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f04fa1e5-0029-4d35-a92a-a388f0cc35b6","Type":"ContainerDied","Data":"4be8cdd2c87eb87119aa7f99ce5a4563fd4340721efdf0480d71e2d085967b4a"} Mar 13 16:32:16 crc kubenswrapper[4786]: I0313 16:32:16.989696 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.034701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p8f6\" (UniqueName: \"kubernetes.io/projected/f04fa1e5-0029-4d35-a92a-a388f0cc35b6-kube-api-access-7p8f6\") pod \"f04fa1e5-0029-4d35-a92a-a388f0cc35b6\" (UID: \"f04fa1e5-0029-4d35-a92a-a388f0cc35b6\") " Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.041314 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04fa1e5-0029-4d35-a92a-a388f0cc35b6-kube-api-access-7p8f6" (OuterVolumeSpecName: "kube-api-access-7p8f6") pod "f04fa1e5-0029-4d35-a92a-a388f0cc35b6" (UID: "f04fa1e5-0029-4d35-a92a-a388f0cc35b6"). InnerVolumeSpecName "kube-api-access-7p8f6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.137490 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p8f6\" (UniqueName: \"kubernetes.io/projected/f04fa1e5-0029-4d35-a92a-a388f0cc35b6-kube-api-access-7p8f6\") on node \"crc\" DevicePath \"\"" Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.798120 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f04fa1e5-0029-4d35-a92a-a388f0cc35b6","Type":"ContainerDied","Data":"d9a98d285dd5be57709ee0488d6463a3d9c69dcb522f9fed9fcfda3ceaf33149"} Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.798172 4786 scope.go:117] "RemoveContainer" containerID="4be8cdd2c87eb87119aa7f99ce5a4563fd4340721efdf0480d71e2d085967b4a" Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.798209 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.841076 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:32:17 crc kubenswrapper[4786]: I0313 16:32:17.848609 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:32:18 crc kubenswrapper[4786]: I0313 16:32:18.562229 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04fa1e5-0029-4d35-a92a-a388f0cc35b6" path="/var/lib/kubelet/pods/f04fa1e5-0029-4d35-a92a-a388f0cc35b6/volumes" Mar 13 16:32:23 crc kubenswrapper[4786]: I0313 16:32:23.552010 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:32:23 crc kubenswrapper[4786]: E0313 16:32:23.552711 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:32:38 crc kubenswrapper[4786]: I0313 16:32:38.552250 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:32:38 crc kubenswrapper[4786]: E0313 16:32:38.553113 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:32:51 crc kubenswrapper[4786]: I0313 16:32:51.552014 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:32:51 crc kubenswrapper[4786]: E0313 16:32:51.552663 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:33:03 crc kubenswrapper[4786]: I0313 16:33:03.551943 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:33:03 crc kubenswrapper[4786]: E0313 16:33:03.552623 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:33:15 crc kubenswrapper[4786]: I0313 16:33:15.552739 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:33:15 crc kubenswrapper[4786]: E0313 16:33:15.553962 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:33:28 crc kubenswrapper[4786]: I0313 16:33:28.552824 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:33:28 crc kubenswrapper[4786]: E0313 16:33:28.553804 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:33:40 crc kubenswrapper[4786]: I0313 16:33:40.556382 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:33:40 crc kubenswrapper[4786]: E0313 16:33:40.557725 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:33:52 crc kubenswrapper[4786]: I0313 16:33:52.552281 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:33:52 crc kubenswrapper[4786]: E0313 16:33:52.553250 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.166696 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556994-nwtzb"]
Mar 13 16:34:00 crc kubenswrapper[4786]: E0313 16:34:00.167816 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04fa1e5-0029-4d35-a92a-a388f0cc35b6" containerName="mariadb-client"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.167839 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04fa1e5-0029-4d35-a92a-a388f0cc35b6" containerName="mariadb-client"
Mar 13 16:34:00 crc kubenswrapper[4786]: E0313 16:34:00.167913 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65251d4-c3a2-4aab-8e06-790dfff1ef83" containerName="oc"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.167929 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65251d4-c3a2-4aab-8e06-790dfff1ef83" containerName="oc"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.168164 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65251d4-c3a2-4aab-8e06-790dfff1ef83" containerName="oc"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.168187 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04fa1e5-0029-4d35-a92a-a388f0cc35b6" containerName="mariadb-client"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.168944 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.172045 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.172342 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.172612 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.174177 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556994-nwtzb"]
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.241502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8zb6\" (UniqueName: \"kubernetes.io/projected/9ad766d3-3c99-4eaa-a277-640201cf3c94-kube-api-access-t8zb6\") pod \"auto-csr-approver-29556994-nwtzb\" (UID: \"9ad766d3-3c99-4eaa-a277-640201cf3c94\") " pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.342727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8zb6\" (UniqueName: \"kubernetes.io/projected/9ad766d3-3c99-4eaa-a277-640201cf3c94-kube-api-access-t8zb6\") pod \"auto-csr-approver-29556994-nwtzb\" (UID: \"9ad766d3-3c99-4eaa-a277-640201cf3c94\") " pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.360838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8zb6\" (UniqueName: \"kubernetes.io/projected/9ad766d3-3c99-4eaa-a277-640201cf3c94-kube-api-access-t8zb6\") pod \"auto-csr-approver-29556994-nwtzb\" (UID: \"9ad766d3-3c99-4eaa-a277-640201cf3c94\") " pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.502549 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.815451 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556994-nwtzb"]
Mar 13 16:34:00 crc kubenswrapper[4786]: I0313 16:34:00.826380 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 16:34:01 crc kubenswrapper[4786]: I0313 16:34:01.782125 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556994-nwtzb" event={"ID":"9ad766d3-3c99-4eaa-a277-640201cf3c94","Type":"ContainerStarted","Data":"721f8120d966d8876e6829887e91dea3a3e78a8f7941f13cdceacdf936217758"}
Mar 13 16:34:02 crc kubenswrapper[4786]: I0313 16:34:02.794164 4786 generic.go:334] "Generic (PLEG): container finished" podID="9ad766d3-3c99-4eaa-a277-640201cf3c94" containerID="869bbac45baba2516d13326299b71472b7e835267e0871603b9baf630008ea2a" exitCode=0
Mar 13 16:34:02 crc kubenswrapper[4786]: I0313 16:34:02.794321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556994-nwtzb" event={"ID":"9ad766d3-3c99-4eaa-a277-640201cf3c94","Type":"ContainerDied","Data":"869bbac45baba2516d13326299b71472b7e835267e0871603b9baf630008ea2a"}
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.098537 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.107471 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8zb6\" (UniqueName: \"kubernetes.io/projected/9ad766d3-3c99-4eaa-a277-640201cf3c94-kube-api-access-t8zb6\") pod \"9ad766d3-3c99-4eaa-a277-640201cf3c94\" (UID: \"9ad766d3-3c99-4eaa-a277-640201cf3c94\") "
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.125388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad766d3-3c99-4eaa-a277-640201cf3c94-kube-api-access-t8zb6" (OuterVolumeSpecName: "kube-api-access-t8zb6") pod "9ad766d3-3c99-4eaa-a277-640201cf3c94" (UID: "9ad766d3-3c99-4eaa-a277-640201cf3c94"). InnerVolumeSpecName "kube-api-access-t8zb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.209278 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8zb6\" (UniqueName: \"kubernetes.io/projected/9ad766d3-3c99-4eaa-a277-640201cf3c94-kube-api-access-t8zb6\") on node \"crc\" DevicePath \"\""
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.552773 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:34:04 crc kubenswrapper[4786]: E0313 16:34:04.553314 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.817841 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556994-nwtzb" event={"ID":"9ad766d3-3c99-4eaa-a277-640201cf3c94","Type":"ContainerDied","Data":"721f8120d966d8876e6829887e91dea3a3e78a8f7941f13cdceacdf936217758"}
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.817909 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="721f8120d966d8876e6829887e91dea3a3e78a8f7941f13cdceacdf936217758"
Mar 13 16:34:04 crc kubenswrapper[4786]: I0313 16:34:04.817965 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556994-nwtzb"
Mar 13 16:34:05 crc kubenswrapper[4786]: I0313 16:34:05.171724 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556988-6msng"]
Mar 13 16:34:05 crc kubenswrapper[4786]: I0313 16:34:05.177017 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556988-6msng"]
Mar 13 16:34:06 crc kubenswrapper[4786]: I0313 16:34:06.568334 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309022bf-f87f-42c2-8cdb-f03ddf74bbd9" path="/var/lib/kubelet/pods/309022bf-f87f-42c2-8cdb-f03ddf74bbd9/volumes"
Mar 13 16:34:14 crc kubenswrapper[4786]: I0313 16:34:14.137041 4786 scope.go:117] "RemoveContainer" containerID="ea01c12cf711381e618eeae990725a65856bae6645910485b11e86d9d87cacde"
Mar 13 16:34:14 crc kubenswrapper[4786]: I0313 16:34:14.180150 4786 scope.go:117] "RemoveContainer" containerID="90cf092f6a267c5756f3b5141d6b6fbbcb7dee79f877e660242006e224fc0109"
Mar 13 16:34:16 crc kubenswrapper[4786]: I0313 16:34:16.553160 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:34:16 crc kubenswrapper[4786]: E0313 16:34:16.554102 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:34:30 crc kubenswrapper[4786]: I0313 16:34:30.561015 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:34:30 crc kubenswrapper[4786]: E0313 16:34:30.562271 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:34:30 crc kubenswrapper[4786]: I0313 16:34:30.895495 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrkgg"]
Mar 13 16:34:30 crc kubenswrapper[4786]: E0313 16:34:30.895818 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad766d3-3c99-4eaa-a277-640201cf3c94" containerName="oc"
Mar 13 16:34:30 crc kubenswrapper[4786]: I0313 16:34:30.895836 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad766d3-3c99-4eaa-a277-640201cf3c94" containerName="oc"
Mar 13 16:34:30 crc kubenswrapper[4786]: I0313 16:34:30.896031 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad766d3-3c99-4eaa-a277-640201cf3c94" containerName="oc"
Mar 13 16:34:30 crc kubenswrapper[4786]: I0313 16:34:30.897167 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:30 crc kubenswrapper[4786]: I0313 16:34:30.932409 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrkgg"]
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.055917 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm4rp\" (UniqueName: \"kubernetes.io/projected/e305f58e-9744-459f-84a4-20087f4f3cac-kube-api-access-lm4rp\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.055970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-utilities\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.056034 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-catalog-content\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.157263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm4rp\" (UniqueName: \"kubernetes.io/projected/e305f58e-9744-459f-84a4-20087f4f3cac-kube-api-access-lm4rp\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.157317 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-utilities\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.157382 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-catalog-content\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.157940 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-catalog-content\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.158174 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-utilities\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.180656 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm4rp\" (UniqueName: \"kubernetes.io/projected/e305f58e-9744-459f-84a4-20087f4f3cac-kube-api-access-lm4rp\") pod \"community-operators-lrkgg\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") " pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.228309 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:31 crc kubenswrapper[4786]: I0313 16:34:31.537707 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrkgg"]
Mar 13 16:34:32 crc kubenswrapper[4786]: I0313 16:34:32.062268 4786 generic.go:334] "Generic (PLEG): container finished" podID="e305f58e-9744-459f-84a4-20087f4f3cac" containerID="3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd" exitCode=0
Mar 13 16:34:32 crc kubenswrapper[4786]: I0313 16:34:32.062348 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerDied","Data":"3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd"}
Mar 13 16:34:32 crc kubenswrapper[4786]: I0313 16:34:32.062648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerStarted","Data":"36cd8c17b4bc96d663c0f3abe650db4382edea24b3a150d38b7a90f350861762"}
Mar 13 16:34:33 crc kubenswrapper[4786]: I0313 16:34:33.077912 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerStarted","Data":"d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a"}
Mar 13 16:34:34 crc kubenswrapper[4786]: I0313 16:34:34.094142 4786 generic.go:334] "Generic (PLEG): container finished" podID="e305f58e-9744-459f-84a4-20087f4f3cac" containerID="d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a" exitCode=0
Mar 13 16:34:34 crc kubenswrapper[4786]: I0313 16:34:34.094262 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerDied","Data":"d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a"}
Mar 13 16:34:35 crc kubenswrapper[4786]: I0313 16:34:35.106831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerStarted","Data":"456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0"}
Mar 13 16:34:35 crc kubenswrapper[4786]: I0313 16:34:35.130290 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrkgg" podStartSLOduration=2.685489884 podStartE2EDuration="5.130260712s" podCreationTimestamp="2026-03-13 16:34:30 +0000 UTC" firstStartedPulling="2026-03-13 16:34:32.06516837 +0000 UTC m=+5502.228380221" lastFinishedPulling="2026-03-13 16:34:34.509939198 +0000 UTC m=+5504.673151049" observedRunningTime="2026-03-13 16:34:35.126393837 +0000 UTC m=+5505.289605648" watchObservedRunningTime="2026-03-13 16:34:35.130260712 +0000 UTC m=+5505.293472563"
Mar 13 16:34:41 crc kubenswrapper[4786]: I0313 16:34:41.229243 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:41 crc kubenswrapper[4786]: I0313 16:34:41.230393 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:41 crc kubenswrapper[4786]: I0313 16:34:41.288300 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:41 crc kubenswrapper[4786]: I0313 16:34:41.552473 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:34:41 crc kubenswrapper[4786]: E0313 16:34:41.552699 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:34:42 crc kubenswrapper[4786]: I0313 16:34:42.229327 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:42 crc kubenswrapper[4786]: I0313 16:34:42.300085 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrkgg"]
Mar 13 16:34:44 crc kubenswrapper[4786]: I0313 16:34:44.187279 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lrkgg" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="registry-server" containerID="cri-o://456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0" gracePeriod=2
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.203585 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.203710 4786 generic.go:334] "Generic (PLEG): container finished" podID="e305f58e-9744-459f-84a4-20087f4f3cac" containerID="456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0" exitCode=0
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.203735 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerDied","Data":"456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0"}
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.204460 4786 scope.go:117] "RemoveContainer" containerID="456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.204418 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrkgg" event={"ID":"e305f58e-9744-459f-84a4-20087f4f3cac","Type":"ContainerDied","Data":"36cd8c17b4bc96d663c0f3abe650db4382edea24b3a150d38b7a90f350861762"}
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.243589 4786 scope.go:117] "RemoveContainer" containerID="d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.247005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-utilities\") pod \"e305f58e-9744-459f-84a4-20087f4f3cac\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") "
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.247155 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm4rp\" (UniqueName: \"kubernetes.io/projected/e305f58e-9744-459f-84a4-20087f4f3cac-kube-api-access-lm4rp\") pod \"e305f58e-9744-459f-84a4-20087f4f3cac\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") "
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.247260 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-catalog-content\") pod \"e305f58e-9744-459f-84a4-20087f4f3cac\" (UID: \"e305f58e-9744-459f-84a4-20087f4f3cac\") "
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.248267 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-utilities" (OuterVolumeSpecName: "utilities") pod "e305f58e-9744-459f-84a4-20087f4f3cac" (UID: "e305f58e-9744-459f-84a4-20087f4f3cac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.260265 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e305f58e-9744-459f-84a4-20087f4f3cac-kube-api-access-lm4rp" (OuterVolumeSpecName: "kube-api-access-lm4rp") pod "e305f58e-9744-459f-84a4-20087f4f3cac" (UID: "e305f58e-9744-459f-84a4-20087f4f3cac"). InnerVolumeSpecName "kube-api-access-lm4rp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.273581 4786 scope.go:117] "RemoveContainer" containerID="3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.324461 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e305f58e-9744-459f-84a4-20087f4f3cac" (UID: "e305f58e-9744-459f-84a4-20087f4f3cac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.337820 4786 scope.go:117] "RemoveContainer" containerID="456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0"
Mar 13 16:34:45 crc kubenswrapper[4786]: E0313 16:34:45.338516 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0\": container with ID starting with 456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0 not found: ID does not exist" containerID="456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.338591 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0"} err="failed to get container status \"456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0\": rpc error: code = NotFound desc = could not find container \"456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0\": container with ID starting with 456bbf392d51f3835fb409d5779071c7618f2bcbe395ae043152eb6791be0ed0 not found: ID does not exist"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.338637 4786 scope.go:117] "RemoveContainer" containerID="d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a"
Mar 13 16:34:45 crc kubenswrapper[4786]: E0313 16:34:45.339447 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a\": container with ID starting with d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a not found: ID does not exist" containerID="d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.339494 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a"} err="failed to get container status \"d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a\": rpc error: code = NotFound desc = could not find container \"d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a\": container with ID starting with d08dd0fa3304780b349b8e409c0c3c8c4956884943c034af2b32ba827102fe2a not found: ID does not exist"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.339523 4786 scope.go:117] "RemoveContainer" containerID="3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd"
Mar 13 16:34:45 crc kubenswrapper[4786]: E0313 16:34:45.340360 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd\": container with ID starting with 3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd not found: ID does not exist" containerID="3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.340517 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd"} err="failed to get container status \"3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd\": rpc error: code = NotFound desc = could not find container \"3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd\": container with ID starting with 3ffb7241f5e9028628ab9ea2e5280efc0c1ce78b2aaef783f8ab620a8b9c62cd not found: ID does not exist"
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.349822 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.350088 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305f58e-9744-459f-84a4-20087f4f3cac-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 16:34:45 crc kubenswrapper[4786]: I0313 16:34:45.350188 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm4rp\" (UniqueName: \"kubernetes.io/projected/e305f58e-9744-459f-84a4-20087f4f3cac-kube-api-access-lm4rp\") on node \"crc\" DevicePath \"\""
Mar 13 16:34:46 crc kubenswrapper[4786]: I0313 16:34:46.222104 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrkgg"
Mar 13 16:34:46 crc kubenswrapper[4786]: I0313 16:34:46.273491 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrkgg"]
Mar 13 16:34:46 crc kubenswrapper[4786]: I0313 16:34:46.280628 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrkgg"]
Mar 13 16:34:46 crc kubenswrapper[4786]: I0313 16:34:46.569147 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" path="/var/lib/kubelet/pods/e305f58e-9744-459f-84a4-20087f4f3cac/volumes"
Mar 13 16:34:55 crc kubenswrapper[4786]: I0313 16:34:55.552497 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:34:55 crc kubenswrapper[4786]: E0313 16:34:55.554508 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:35:09 crc kubenswrapper[4786]: I0313 16:35:09.553299 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:35:09 crc kubenswrapper[4786]: E0313 16:35:09.554445 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:35:22 crc kubenswrapper[4786]: I0313 16:35:22.552413 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:35:22 crc kubenswrapper[4786]: E0313 16:35:22.553793 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:35:34 crc kubenswrapper[4786]: I0313 16:35:34.553220 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:35:34 crc kubenswrapper[4786]: E0313 16:35:34.554745 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:35:46 crc kubenswrapper[4786]: I0313 16:35:46.553004 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade"
Mar 13 16:35:46 crc kubenswrapper[4786]: E0313 16:35:46.554300 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.140734 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Mar 13 16:35:56 crc kubenswrapper[4786]: E0313 16:35:56.141420 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="extract-content"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.141437 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="extract-content"
Mar 13 16:35:56 crc kubenswrapper[4786]: E0313 16:35:56.141452 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="registry-server"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.141458 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="registry-server"
Mar 13 16:35:56 crc kubenswrapper[4786]: E0313 16:35:56.141482 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="extract-utilities"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.141489 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="extract-utilities"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.141623 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e305f58e-9744-459f-84a4-20087f4f3cac" containerName="registry-server"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.142139 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.144675 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z8lt4"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.163463 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.244159 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkg89\" (UniqueName: \"kubernetes.io/projected/d723d777-35e7-4884-b57d-bffbb87a4228-kube-api-access-tkg89\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") " pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.244420 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") " pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.346264 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkg89\" (UniqueName: \"kubernetes.io/projected/d723d777-35e7-4884-b57d-bffbb87a4228-kube-api-access-tkg89\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") " pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.346345 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") " pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.349893 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.350022 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/797fffa864fe01fed085b7b23d6a686c1a7bcbc238a794b14f5fb91e8a09b566/globalmount\"" pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.376689 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkg89\" (UniqueName: \"kubernetes.io/projected/d723d777-35e7-4884-b57d-bffbb87a4228-kube-api-access-tkg89\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") " pod="openstack/mariadb-copy-data"
Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.398348 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2c11fa1-c9b0-4354-96a9-fa8412bdf8c7\") pod \"mariadb-copy-data\" (UID: \"d723d777-35e7-4884-b57d-bffbb87a4228\") " pod="openstack/mariadb-copy-data" Mar 13 16:35:56 crc kubenswrapper[4786]: I0313 16:35:56.462322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 13 16:35:57 crc kubenswrapper[4786]: I0313 16:35:57.021520 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 13 16:35:57 crc kubenswrapper[4786]: I0313 16:35:57.870164 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d723d777-35e7-4884-b57d-bffbb87a4228","Type":"ContainerStarted","Data":"fe9cade8a3188ad6fe9b67342bc913e937fb9f8f3232696fd49e9749429c6719"} Mar 13 16:35:57 crc kubenswrapper[4786]: I0313 16:35:57.870501 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d723d777-35e7-4884-b57d-bffbb87a4228","Type":"ContainerStarted","Data":"6f988b4f68678603c7f5c444b7b98f67c58a2acbca408aa762f58f2420c1c743"} Mar 13 16:35:57 crc kubenswrapper[4786]: I0313 16:35:57.889020 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.889005089 podStartE2EDuration="2.889005089s" podCreationTimestamp="2026-03-13 16:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:35:57.888367594 +0000 UTC m=+5588.051579465" watchObservedRunningTime="2026-03-13 16:35:57.889005089 +0000 UTC m=+5588.052216900" Mar 13 16:35:58 crc kubenswrapper[4786]: I0313 16:35:58.552126 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:35:58 crc kubenswrapper[4786]: E0313 16:35:58.552731 4786 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.139736 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556996-78wh6"] Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.140980 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.142962 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.143021 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.146020 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.147176 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556996-78wh6"] Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.318739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhzc9\" (UniqueName: \"kubernetes.io/projected/c6ecb255-40fb-45aa-8bdc-3080d956dec4-kube-api-access-hhzc9\") pod \"auto-csr-approver-29556996-78wh6\" (UID: \"c6ecb255-40fb-45aa-8bdc-3080d956dec4\") " pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.420225 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hhzc9\" (UniqueName: \"kubernetes.io/projected/c6ecb255-40fb-45aa-8bdc-3080d956dec4-kube-api-access-hhzc9\") pod \"auto-csr-approver-29556996-78wh6\" (UID: \"c6ecb255-40fb-45aa-8bdc-3080d956dec4\") " pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.437521 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhzc9\" (UniqueName: \"kubernetes.io/projected/c6ecb255-40fb-45aa-8bdc-3080d956dec4-kube-api-access-hhzc9\") pod \"auto-csr-approver-29556996-78wh6\" (UID: \"c6ecb255-40fb-45aa-8bdc-3080d956dec4\") " pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.466001 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.520846 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.521962 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.529940 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.624637 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thn8g\" (UniqueName: \"kubernetes.io/projected/eef51f20-2a32-46f3-9ba4-10bbda4de9ac-kube-api-access-thn8g\") pod \"mariadb-client\" (UID: \"eef51f20-2a32-46f3-9ba4-10bbda4de9ac\") " pod="openstack/mariadb-client" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.726424 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thn8g\" (UniqueName: \"kubernetes.io/projected/eef51f20-2a32-46f3-9ba4-10bbda4de9ac-kube-api-access-thn8g\") pod \"mariadb-client\" (UID: \"eef51f20-2a32-46f3-9ba4-10bbda4de9ac\") " pod="openstack/mariadb-client" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.758910 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thn8g\" (UniqueName: \"kubernetes.io/projected/eef51f20-2a32-46f3-9ba4-10bbda4de9ac-kube-api-access-thn8g\") pod \"mariadb-client\" (UID: \"eef51f20-2a32-46f3-9ba4-10bbda4de9ac\") " pod="openstack/mariadb-client" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.879393 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:00 crc kubenswrapper[4786]: I0313 16:36:00.884483 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556996-78wh6"] Mar 13 16:36:01 crc kubenswrapper[4786]: I0313 16:36:01.305651 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:01 crc kubenswrapper[4786]: W0313 16:36:01.310511 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef51f20_2a32_46f3_9ba4_10bbda4de9ac.slice/crio-b116b0c8e9d4f47e9403eb072df0df900399a5b3e6415d1b68b44b293672b64f WatchSource:0}: Error finding container b116b0c8e9d4f47e9403eb072df0df900399a5b3e6415d1b68b44b293672b64f: Status 404 returned error can't find the container with id b116b0c8e9d4f47e9403eb072df0df900399a5b3e6415d1b68b44b293672b64f Mar 13 16:36:01 crc kubenswrapper[4786]: I0313 16:36:01.910651 4786 generic.go:334] "Generic (PLEG): container finished" podID="eef51f20-2a32-46f3-9ba4-10bbda4de9ac" containerID="d54087474411a1bfce3cf9f89513f5a54ef5502e83a32753d3ca6fb09e86914a" exitCode=0 Mar 13 16:36:01 crc kubenswrapper[4786]: I0313 16:36:01.910996 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"eef51f20-2a32-46f3-9ba4-10bbda4de9ac","Type":"ContainerDied","Data":"d54087474411a1bfce3cf9f89513f5a54ef5502e83a32753d3ca6fb09e86914a"} Mar 13 16:36:01 crc kubenswrapper[4786]: I0313 16:36:01.911028 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"eef51f20-2a32-46f3-9ba4-10bbda4de9ac","Type":"ContainerStarted","Data":"b116b0c8e9d4f47e9403eb072df0df900399a5b3e6415d1b68b44b293672b64f"} Mar 13 16:36:01 crc kubenswrapper[4786]: I0313 16:36:01.912370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556996-78wh6" 
event={"ID":"c6ecb255-40fb-45aa-8bdc-3080d956dec4","Type":"ContainerStarted","Data":"e6a64b4502f445f9c4c9bc6dfb6033d8f7bf812838ccceb0fb77a50351a43da0"} Mar 13 16:36:02 crc kubenswrapper[4786]: I0313 16:36:02.922233 4786 generic.go:334] "Generic (PLEG): container finished" podID="c6ecb255-40fb-45aa-8bdc-3080d956dec4" containerID="29b1da2d72be99bd636d74bf45826a04a282ffcf5457c18cc297564bc69fe9a2" exitCode=0 Mar 13 16:36:02 crc kubenswrapper[4786]: I0313 16:36:02.922322 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556996-78wh6" event={"ID":"c6ecb255-40fb-45aa-8bdc-3080d956dec4","Type":"ContainerDied","Data":"29b1da2d72be99bd636d74bf45826a04a282ffcf5457c18cc297564bc69fe9a2"} Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.217062 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.237211 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_eef51f20-2a32-46f3-9ba4-10bbda4de9ac/mariadb-client/0.log" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.262687 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.269928 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.373403 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thn8g\" (UniqueName: \"kubernetes.io/projected/eef51f20-2a32-46f3-9ba4-10bbda4de9ac-kube-api-access-thn8g\") pod \"eef51f20-2a32-46f3-9ba4-10bbda4de9ac\" (UID: \"eef51f20-2a32-46f3-9ba4-10bbda4de9ac\") " Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.384966 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:03 crc kubenswrapper[4786]: E0313 16:36:03.385322 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef51f20-2a32-46f3-9ba4-10bbda4de9ac" containerName="mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.385343 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef51f20-2a32-46f3-9ba4-10bbda4de9ac" containerName="mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.385487 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef51f20-2a32-46f3-9ba4-10bbda4de9ac" containerName="mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.386004 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.390584 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.428961 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef51f20-2a32-46f3-9ba4-10bbda4de9ac-kube-api-access-thn8g" (OuterVolumeSpecName: "kube-api-access-thn8g") pod "eef51f20-2a32-46f3-9ba4-10bbda4de9ac" (UID: "eef51f20-2a32-46f3-9ba4-10bbda4de9ac"). InnerVolumeSpecName "kube-api-access-thn8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.481340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnkr8\" (UniqueName: \"kubernetes.io/projected/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f-kube-api-access-bnkr8\") pod \"mariadb-client\" (UID: \"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f\") " pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.481436 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thn8g\" (UniqueName: \"kubernetes.io/projected/eef51f20-2a32-46f3-9ba4-10bbda4de9ac-kube-api-access-thn8g\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.582927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnkr8\" (UniqueName: \"kubernetes.io/projected/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f-kube-api-access-bnkr8\") pod \"mariadb-client\" (UID: \"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f\") " pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.607034 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnkr8\" (UniqueName: \"kubernetes.io/projected/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f-kube-api-access-bnkr8\") pod \"mariadb-client\" (UID: \"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f\") " pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.745896 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.933544 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.933979 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b116b0c8e9d4f47e9403eb072df0df900399a5b3e6415d1b68b44b293672b64f" Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.935223 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:03 crc kubenswrapper[4786]: I0313 16:36:03.949199 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="eef51f20-2a32-46f3-9ba4-10bbda4de9ac" podUID="7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.189379 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.296973 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhzc9\" (UniqueName: \"kubernetes.io/projected/c6ecb255-40fb-45aa-8bdc-3080d956dec4-kube-api-access-hhzc9\") pod \"c6ecb255-40fb-45aa-8bdc-3080d956dec4\" (UID: \"c6ecb255-40fb-45aa-8bdc-3080d956dec4\") " Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.306131 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ecb255-40fb-45aa-8bdc-3080d956dec4-kube-api-access-hhzc9" (OuterVolumeSpecName: "kube-api-access-hhzc9") pod "c6ecb255-40fb-45aa-8bdc-3080d956dec4" (UID: "c6ecb255-40fb-45aa-8bdc-3080d956dec4"). InnerVolumeSpecName "kube-api-access-hhzc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.398741 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhzc9\" (UniqueName: \"kubernetes.io/projected/c6ecb255-40fb-45aa-8bdc-3080d956dec4-kube-api-access-hhzc9\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.567471 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef51f20-2a32-46f3-9ba4-10bbda4de9ac" path="/var/lib/kubelet/pods/eef51f20-2a32-46f3-9ba4-10bbda4de9ac/volumes" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.947138 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556996-78wh6" event={"ID":"c6ecb255-40fb-45aa-8bdc-3080d956dec4","Type":"ContainerDied","Data":"e6a64b4502f445f9c4c9bc6dfb6033d8f7bf812838ccceb0fb77a50351a43da0"} Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.947195 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6a64b4502f445f9c4c9bc6dfb6033d8f7bf812838ccceb0fb77a50351a43da0" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.947276 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556996-78wh6" Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.951820 4786 generic.go:334] "Generic (PLEG): container finished" podID="7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" containerID="072edeb945261cd8b656a9ec944c76a7ad77ceb6578f6aaab311f86c3ad7251e" exitCode=0 Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.951915 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f","Type":"ContainerDied","Data":"072edeb945261cd8b656a9ec944c76a7ad77ceb6578f6aaab311f86c3ad7251e"} Mar 13 16:36:04 crc kubenswrapper[4786]: I0313 16:36:04.951966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f","Type":"ContainerStarted","Data":"6360d391fe763d0cec7608c7bd6546f1c831289df2d132d059f577b79e1c0984"} Mar 13 16:36:05 crc kubenswrapper[4786]: I0313 16:36:05.304898 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556990-w5rpx"] Mar 13 16:36:05 crc kubenswrapper[4786]: I0313 16:36:05.313783 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556990-w5rpx"] Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.270298 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.289150 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_7a8c28d0-ba46-4ebe-90e2-eb2572557c4f/mariadb-client/0.log" Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.319543 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.326003 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.434660 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnkr8\" (UniqueName: \"kubernetes.io/projected/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f-kube-api-access-bnkr8\") pod \"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f\" (UID: \"7a8c28d0-ba46-4ebe-90e2-eb2572557c4f\") " Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.567155 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6fa2cb-69af-420e-a158-9ebfded4c3e0" path="/var/lib/kubelet/pods/6e6fa2cb-69af-420e-a158-9ebfded4c3e0/volumes" Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.599539 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f-kube-api-access-bnkr8" (OuterVolumeSpecName: "kube-api-access-bnkr8") pod "7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" (UID: "7a8c28d0-ba46-4ebe-90e2-eb2572557c4f"). InnerVolumeSpecName "kube-api-access-bnkr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.639482 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnkr8\" (UniqueName: \"kubernetes.io/projected/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f-kube-api-access-bnkr8\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.973640 4786 scope.go:117] "RemoveContainer" containerID="072edeb945261cd8b656a9ec944c76a7ad77ceb6578f6aaab311f86c3ad7251e" Mar 13 16:36:06 crc kubenswrapper[4786]: I0313 16:36:06.973950 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 13 16:36:08 crc kubenswrapper[4786]: I0313 16:36:08.566100 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" path="/var/lib/kubelet/pods/7a8c28d0-ba46-4ebe-90e2-eb2572557c4f/volumes" Mar 13 16:36:09 crc kubenswrapper[4786]: I0313 16:36:09.554473 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:36:09 crc kubenswrapper[4786]: E0313 16:36:09.554842 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:36:14 crc kubenswrapper[4786]: I0313 16:36:14.278471 4786 scope.go:117] "RemoveContainer" containerID="624ab215ed4598c9fc43a1198be2f3293fffa58a2510f13ae0715d6741e2e414" Mar 13 16:36:14 crc kubenswrapper[4786]: I0313 16:36:14.300640 4786 scope.go:117] "RemoveContainer" containerID="6c0eaa75172523cd9c647552ed48986377ed64e1e9cee433e3966677d4387752" Mar 13 16:36:14 crc 
kubenswrapper[4786]: I0313 16:36:14.319635 4786 scope.go:117] "RemoveContainer" containerID="b676824333dc9f6a07480748522760ddccca6263a0189ec1784b3afd9078c0d2" Mar 13 16:36:23 crc kubenswrapper[4786]: I0313 16:36:23.552219 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:36:23 crc kubenswrapper[4786]: E0313 16:36:23.553171 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:36:37 crc kubenswrapper[4786]: I0313 16:36:37.552109 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:36:37 crc kubenswrapper[4786]: E0313 16:36:37.552882 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.654418 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 16:36:40 crc kubenswrapper[4786]: E0313 16:36:40.655275 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" containerName="mariadb-client" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.655297 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" containerName="mariadb-client" Mar 13 16:36:40 crc kubenswrapper[4786]: E0313 16:36:40.655349 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ecb255-40fb-45aa-8bdc-3080d956dec4" containerName="oc" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.655365 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ecb255-40fb-45aa-8bdc-3080d956dec4" containerName="oc" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.655622 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ecb255-40fb-45aa-8bdc-3080d956dec4" containerName="oc" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.655651 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8c28d0-ba46-4ebe-90e2-eb2572557c4f" containerName="mariadb-client" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.657054 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.658760 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.658971 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.658971 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bqfvd" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.659828 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.661213 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.669663 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovsdbserver-nb-1"] Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.671685 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.700853 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.719447 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.720892 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.725286 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.737009 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.775950 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.775994 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm9nk\" (UniqueName: \"kubernetes.io/projected/7a919b26-f2f2-411c-b7ec-afe290a48417-kube-api-access-fm9nk\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776105 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776184 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776232 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49sl\" (UniqueName: \"kubernetes.io/projected/9830d010-b8d6-43a4-b49e-5300740afa03-kube-api-access-l49sl\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776305 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ea7c72-49d9-4e99-a890-cb51a6ea441e-config\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776328 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776369 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bn4\" (UniqueName: \"kubernetes.io/projected/13ea7c72-49d9-4e99-a890-cb51a6ea441e-kube-api-access-c2bn4\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9830d010-b8d6-43a4-b49e-5300740afa03-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776541 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776619 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13ea7c72-49d9-4e99-a890-cb51a6ea441e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776635 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9830d010-b8d6-43a4-b49e-5300740afa03-config\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776663 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a919b26-f2f2-411c-b7ec-afe290a48417-config\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776681 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776744 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776831 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13ea7c72-49d9-4e99-a890-cb51a6ea441e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776908 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776956 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a919b26-f2f2-411c-b7ec-afe290a48417-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.776990 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a919b26-f2f2-411c-b7ec-afe290a48417-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.777027 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.777092 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9830d010-b8d6-43a4-b49e-5300740afa03-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878553 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878620 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm9nk\" (UniqueName: \"kubernetes.io/projected/7a919b26-f2f2-411c-b7ec-afe290a48417-kube-api-access-fm9nk\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878726 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\") 
pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l49sl\" (UniqueName: \"kubernetes.io/projected/9830d010-b8d6-43a4-b49e-5300740afa03-kube-api-access-l49sl\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878831 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ea7c72-49d9-4e99-a890-cb51a6ea441e-config\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878890 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878926 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.878961 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc 
kubenswrapper[4786]: I0313 16:36:40.878995 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bn4\" (UniqueName: \"kubernetes.io/projected/13ea7c72-49d9-4e99-a890-cb51a6ea441e-kube-api-access-c2bn4\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880003 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9830d010-b8d6-43a4-b49e-5300740afa03-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13ea7c72-49d9-4e99-a890-cb51a6ea441e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880146 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9830d010-b8d6-43a4-b49e-5300740afa03-config\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880182 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a919b26-f2f2-411c-b7ec-afe290a48417-config\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880213 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880323 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13ea7c72-49d9-4e99-a890-cb51a6ea441e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880373 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\") pod \"ovsdbserver-nb-1\" (UID: 
\"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880425 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a919b26-f2f2-411c-b7ec-afe290a48417-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880470 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a919b26-f2f2-411c-b7ec-afe290a48417-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880520 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880577 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9830d010-b8d6-43a4-b49e-5300740afa03-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.880935 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ea7c72-49d9-4e99-a890-cb51a6ea441e-config\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.881660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13ea7c72-49d9-4e99-a890-cb51a6ea441e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.881974 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13ea7c72-49d9-4e99-a890-cb51a6ea441e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.882636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9830d010-b8d6-43a4-b49e-5300740afa03-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.883236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9830d010-b8d6-43a4-b49e-5300740afa03-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.883313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a919b26-f2f2-411c-b7ec-afe290a48417-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.883614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9830d010-b8d6-43a4-b49e-5300740afa03-config\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 
16:36:40.885181 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a919b26-f2f2-411c-b7ec-afe290a48417-config\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.886053 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.886123 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/caf45b15360f0989a11c0d653efd5a4894c5011654a0e1024b998493f050207a/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.889486 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.889970 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.889970 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.889991 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c8ec14f28346aae58b71b169dd1a885cb05e1d3bc0e2fec5eef574e815f998c0/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.890034 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.890039 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9dbae5d189ac1fd98752dc0a5741293f8741eb34d963d94f7b3cdbab021f16f7/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.890682 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.895258 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a919b26-f2f2-411c-b7ec-afe290a48417-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.895802 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13ea7c72-49d9-4e99-a890-cb51a6ea441e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.896908 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.897004 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.897830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.902527 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9830d010-b8d6-43a4-b49e-5300740afa03-combined-ca-bundle\") pod 
\"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.902535 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a919b26-f2f2-411c-b7ec-afe290a48417-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.906638 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l49sl\" (UniqueName: \"kubernetes.io/projected/9830d010-b8d6-43a4-b49e-5300740afa03-kube-api-access-l49sl\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.912357 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm9nk\" (UniqueName: \"kubernetes.io/projected/7a919b26-f2f2-411c-b7ec-afe290a48417-kube-api-access-fm9nk\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.921115 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bn4\" (UniqueName: \"kubernetes.io/projected/13ea7c72-49d9-4e99-a890-cb51a6ea441e-kube-api-access-c2bn4\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.948977 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0a69246-87c7-4398-ae6a-21d707e40f23\") pod \"ovsdbserver-nb-0\" (UID: \"13ea7c72-49d9-4e99-a890-cb51a6ea441e\") " pod="openstack/ovsdbserver-nb-0" Mar 13 
16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.953188 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6c159d77-50b8-42a7-be7d-d7f7a99777d8\") pod \"ovsdbserver-nb-2\" (UID: \"9830d010-b8d6-43a4-b49e-5300740afa03\") " pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.967109 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50f76092-e655-4a33-b6be-cbf8e209b8f6\") pod \"ovsdbserver-nb-1\" (UID: \"7a919b26-f2f2-411c-b7ec-afe290a48417\") " pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:40 crc kubenswrapper[4786]: I0313 16:36:40.996284 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:41 crc kubenswrapper[4786]: I0313 16:36:41.004629 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:41 crc kubenswrapper[4786]: I0313 16:36:41.045643 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:41 crc kubenswrapper[4786]: I0313 16:36:41.577984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:41.651539 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 13 16:36:42 crc kubenswrapper[4786]: W0313 16:36:41.653783 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a919b26_f2f2_411c_b7ec_afe290a48417.slice/crio-6b0e55dc9cb1b0cb06ccc1952f5fe8512cd4388aecaac10c75da9b86c66a58ec WatchSource:0}: Error finding container 6b0e55dc9cb1b0cb06ccc1952f5fe8512cd4388aecaac10c75da9b86c66a58ec: Status 404 returned error can't find the container with id 6b0e55dc9cb1b0cb06ccc1952f5fe8512cd4388aecaac10c75da9b86c66a58ec Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.278916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13ea7c72-49d9-4e99-a890-cb51a6ea441e","Type":"ContainerStarted","Data":"722259ae423ef0d9c4011951827f0562dfcd0357e6dae01ec7989d7e67eacfcc"} Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.279283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13ea7c72-49d9-4e99-a890-cb51a6ea441e","Type":"ContainerStarted","Data":"bab0232e3ba2182fb9922345c473fef3199fa9ebc0784032249bea13c1a340a1"} Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.279318 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13ea7c72-49d9-4e99-a890-cb51a6ea441e","Type":"ContainerStarted","Data":"1fb170a3bf3e97c2ffd17ba8243d4f5a4d8134adab65baf512c06762b0ade12b"} Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.281079 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"7a919b26-f2f2-411c-b7ec-afe290a48417","Type":"ContainerStarted","Data":"2173c62be625c4b1ff102b4e6aeb852d0ccfcd4af527818ed76535df91dfc9ac"}
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.281122 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7a919b26-f2f2-411c-b7ec-afe290a48417","Type":"ContainerStarted","Data":"6e2fdd965a981e8b582a2c722efa09f95cc72861fd7403f9815c86ed5ef05f3b"}
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.281132 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7a919b26-f2f2-411c-b7ec-afe290a48417","Type":"ContainerStarted","Data":"6b0e55dc9cb1b0cb06ccc1952f5fe8512cd4388aecaac10c75da9b86c66a58ec"}
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.305178 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.305160154 podStartE2EDuration="3.305160154s" podCreationTimestamp="2026-03-13 16:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:42.297819564 +0000 UTC m=+5632.461031415" watchObservedRunningTime="2026-03-13 16:36:42.305160154 +0000 UTC m=+5632.468371965"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.332099 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.332084197 podStartE2EDuration="3.332084197s" podCreationTimestamp="2026-03-13 16:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:42.327930875 +0000 UTC m=+5632.491142686" watchObservedRunningTime="2026-03-13 16:36:42.332084197 +0000 UTC m=+5632.495296008"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.378159 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.379812 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.383704 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.383777 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nkps8"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.383955 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.385563 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.395555 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.407708 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.409724 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.420617 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.423602 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.430466 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.446449 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.488544 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506256 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506294 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c778abf8-64be-4c6e-993a-df2bf80128c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506342 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506383 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/986264a5-c61a-430c-818f-b0c66693eb87-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506407 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c778abf8-64be-4c6e-993a-df2bf80128c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506691 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/986264a5-c61a-430c-818f-b0c66693eb87-config\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506879 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ab28cc-d71c-43f8-8448-a7175567fd08-config\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506921 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.506999 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44ab28cc-d71c-43f8-8448-a7175567fd08-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507372 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507433 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507451 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44ab28cc-d71c-43f8-8448-a7175567fd08-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507511 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507545 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfkc\" (UniqueName: \"kubernetes.io/projected/986264a5-c61a-430c-818f-b0c66693eb87-kube-api-access-fgfkc\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507691 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/986264a5-c61a-430c-818f-b0c66693eb87-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507824 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk25w\" (UniqueName: \"kubernetes.io/projected/44ab28cc-d71c-43f8-8448-a7175567fd08-kube-api-access-jk25w\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507901 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsg5\" (UniqueName: \"kubernetes.io/projected/c778abf8-64be-4c6e-993a-df2bf80128c0-kube-api-access-fqsg5\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507935 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.507997 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c778abf8-64be-4c6e-993a-df2bf80128c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.609755 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/986264a5-c61a-430c-818f-b0c66693eb87-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.609824 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk25w\" (UniqueName: \"kubernetes.io/projected/44ab28cc-d71c-43f8-8448-a7175567fd08-kube-api-access-jk25w\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.609880 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsg5\" (UniqueName: \"kubernetes.io/projected/c778abf8-64be-4c6e-993a-df2bf80128c0-kube-api-access-fqsg5\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.609907 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.609928 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c778abf8-64be-4c6e-993a-df2bf80128c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610522 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610581 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c778abf8-64be-4c6e-993a-df2bf80128c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610696 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610707 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c778abf8-64be-4c6e-993a-df2bf80128c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610779 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/986264a5-c61a-430c-818f-b0c66693eb87-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610941 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/986264a5-c61a-430c-818f-b0c66693eb87-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.610946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c778abf8-64be-4c6e-993a-df2bf80128c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611026 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611078 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/986264a5-c61a-430c-818f-b0c66693eb87-config\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611108 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ab28cc-d71c-43f8-8448-a7175567fd08-config\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611141 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611179 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44ab28cc-d71c-43f8-8448-a7175567fd08-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611236 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611285 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44ab28cc-d71c-43f8-8448-a7175567fd08-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611337 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611385 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611413 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfkc\" (UniqueName: \"kubernetes.io/projected/986264a5-c61a-430c-818f-b0c66693eb87-kube-api-access-fgfkc\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.611455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.612099 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c778abf8-64be-4c6e-993a-df2bf80128c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.613675 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/986264a5-c61a-430c-818f-b0c66693eb87-config\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.616457 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/986264a5-c61a-430c-818f-b0c66693eb87-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.623044 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.624271 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.624340 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ddbe8ce076d3df8d4b07a10ac7e727cabb5bbd3b15815ef4326c5fa3b4852ebd/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.625160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44ab28cc-d71c-43f8-8448-a7175567fd08-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.625340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.625967 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44ab28cc-d71c-43f8-8448-a7175567fd08-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.626016 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.626046 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/67b00314e14174a50978a283ce2689b489693ca21f1b98d3cb00860048f436b9/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.625823 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.626366 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c778abf8-64be-4c6e-993a-df2bf80128c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.627169 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.627393 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.628637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ab28cc-d71c-43f8-8448-a7175567fd08-config\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.630313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986264a5-c61a-430c-818f-b0c66693eb87-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.630443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c778abf8-64be-4c6e-993a-df2bf80128c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.630812 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.630850 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5043388822f00cc85914125088176653263ce6fc6698e240d7bb38bb541823d/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.632157 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk25w\" (UniqueName: \"kubernetes.io/projected/44ab28cc-d71c-43f8-8448-a7175567fd08-kube-api-access-jk25w\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.632694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfkc\" (UniqueName: \"kubernetes.io/projected/986264a5-c61a-430c-818f-b0c66693eb87-kube-api-access-fgfkc\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.633659 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.635090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsg5\" (UniqueName: \"kubernetes.io/projected/c778abf8-64be-4c6e-993a-df2bf80128c0-kube-api-access-fqsg5\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.637121 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44ab28cc-d71c-43f8-8448-a7175567fd08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.669047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2acd2686-d9aa-4c76-ad74-bf3fa348b720\") pod \"ovsdbserver-sb-1\" (UID: \"986264a5-c61a-430c-818f-b0c66693eb87\") " pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.669783 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8be72eeb-434b-46c7-b14d-fa28b45b7264\") pod \"ovsdbserver-sb-2\" (UID: \"44ab28cc-d71c-43f8-8448-a7175567fd08\") " pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.679578 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fd2bec7-23ff-44b5-9e58-298fa5795851\") pod \"ovsdbserver-sb-0\" (UID: \"c778abf8-64be-4c6e-993a-df2bf80128c0\") " pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.701144 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.768562 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 13 16:36:42 crc kubenswrapper[4786]: I0313 16:36:42.775388 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.293896 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"9830d010-b8d6-43a4-b49e-5300740afa03","Type":"ContainerStarted","Data":"84aea696da051f765526793c29cde1deab1b2ee78c888a4812ca6c7a44f7d6ab"}
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.294176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"9830d010-b8d6-43a4-b49e-5300740afa03","Type":"ContainerStarted","Data":"4b2c9297f2ff48b8876b6fa06df0a1452dfa280df500cd99ff55bd8e98fcca2f"}
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.294304 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"9830d010-b8d6-43a4-b49e-5300740afa03","Type":"ContainerStarted","Data":"23a5e78e1e91035712b2511e626a524946cf7d73923a7c38c81ca4386a1689ff"}
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.332629 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.332594944 podStartE2EDuration="4.332594944s" podCreationTimestamp="2026-03-13 16:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:43.316286892 +0000 UTC m=+5633.479498733" watchObservedRunningTime="2026-03-13 16:36:43.332594944 +0000 UTC m=+5633.495806795"
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.370368 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 16:36:43 crc kubenswrapper[4786]: W0313 16:36:43.374540 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc778abf8_64be_4c6e_993a_df2bf80128c0.slice/crio-ea3d0698f29cad34b5d97e1b7b7d82c788ad1664d9a844e3df5eb6f00be37827 WatchSource:0}: Error finding container ea3d0698f29cad34b5d97e1b7b7d82c788ad1664d9a844e3df5eb6f00be37827: Status 404 returned error can't find the container with id ea3d0698f29cad34b5d97e1b7b7d82c788ad1664d9a844e3df5eb6f00be37827
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.459177 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 13 16:36:43 crc kubenswrapper[4786]: I0313 16:36:43.997687 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.005669 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.046089 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.046157 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.158970 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 13 16:36:44 crc kubenswrapper[4786]: W0313 16:36:44.160031 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ab28cc_d71c_43f8_8448_a7175567fd08.slice/crio-bc8ef33b692dcfa609069af38f5b38ae2f7f100705dda54346688ca8f60aece3 WatchSource:0}: Error finding container bc8ef33b692dcfa609069af38f5b38ae2f7f100705dda54346688ca8f60aece3: Status 404 returned error can't find the container with id bc8ef33b692dcfa609069af38f5b38ae2f7f100705dda54346688ca8f60aece3
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.310778 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"986264a5-c61a-430c-818f-b0c66693eb87","Type":"ContainerStarted","Data":"6da4fe9e1fa9c4e57da9c9ce31a0f05b888a4deb81b8ef996fcfdddda5e20b9c"}
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.310816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"986264a5-c61a-430c-818f-b0c66693eb87","Type":"ContainerStarted","Data":"a79efadface55f4c9bf8fc1d32f0c9d510d9dd44aa42ad3cedcfa40312650e19"}
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.310826 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"986264a5-c61a-430c-818f-b0c66693eb87","Type":"ContainerStarted","Data":"38b8dfd5f065ad54be670050130a22cdc3bbfde1ed2b9e89fc2c982e4c3b7662"}
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.314916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c778abf8-64be-4c6e-993a-df2bf80128c0","Type":"ContainerStarted","Data":"a83e28057edc754faf41470b9bf1e3aa0b264041f90f1a8190a6f448a0c67aa5"}
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.314938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c778abf8-64be-4c6e-993a-df2bf80128c0","Type":"ContainerStarted","Data":"45e3173e9c8400f77da7e1873538c163584037f4ba814f339380e6348f77bb59"}
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.314975 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c778abf8-64be-4c6e-993a-df2bf80128c0","Type":"ContainerStarted","Data":"ea3d0698f29cad34b5d97e1b7b7d82c788ad1664d9a844e3df5eb6f00be37827"}
Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.317740 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2"
event={"ID":"44ab28cc-d71c-43f8-8448-a7175567fd08","Type":"ContainerStarted","Data":"bc8ef33b692dcfa609069af38f5b38ae2f7f100705dda54346688ca8f60aece3"} Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.318154 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.373902 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.373845933 podStartE2EDuration="3.373845933s" podCreationTimestamp="2026-03-13 16:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:44.339874087 +0000 UTC m=+5634.503085908" watchObservedRunningTime="2026-03-13 16:36:44.373845933 +0000 UTC m=+5634.537057774" Mar 13 16:36:44 crc kubenswrapper[4786]: I0313 16:36:44.376816 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.376798916 podStartE2EDuration="3.376798916s" podCreationTimestamp="2026-03-13 16:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:44.364500553 +0000 UTC m=+5634.527712404" watchObservedRunningTime="2026-03-13 16:36:44.376798916 +0000 UTC m=+5634.540010767" Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.333612 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"44ab28cc-d71c-43f8-8448-a7175567fd08","Type":"ContainerStarted","Data":"c4ce51b7fc8755c81ae05c43d9d3241314fe451e94d247a8ff44e4dd23c551fd"} Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.334052 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"44ab28cc-d71c-43f8-8448-a7175567fd08","Type":"ContainerStarted","Data":"4fd0b6e41ad02ae27f4acb181607134404a952b1b58e31aa42aabfce96ebd300"} Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.377670 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.37764217 podStartE2EDuration="4.37764217s" podCreationTimestamp="2026-03-13 16:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:45.362026065 +0000 UTC m=+5635.525237906" watchObservedRunningTime="2026-03-13 16:36:45.37764217 +0000 UTC m=+5635.540854021" Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.701488 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.769293 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.776056 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 13 16:36:45 crc kubenswrapper[4786]: I0313 16:36:45.997101 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.039336 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.046758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.326640 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65bd9d88cc-xwzpg"] Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.328363 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.330548 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.335101 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bd9d88cc-xwzpg"] Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.477693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-ovsdbserver-nb\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.478510 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-dns-svc\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.479033 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-config\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.479611 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vsld\" (UniqueName: \"kubernetes.io/projected/4bd81442-912c-4239-ad09-7b26b3f423ed-kube-api-access-8vsld\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " 
pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.580589 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-config\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.580639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vsld\" (UniqueName: \"kubernetes.io/projected/4bd81442-912c-4239-ad09-7b26b3f423ed-kube-api-access-8vsld\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.580705 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-ovsdbserver-nb\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.580735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-dns-svc\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.581563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-dns-svc\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 
16:36:46.581593 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-config\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.581699 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-ovsdbserver-nb\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.612996 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vsld\" (UniqueName: \"kubernetes.io/projected/4bd81442-912c-4239-ad09-7b26b3f423ed-kube-api-access-8vsld\") pod \"dnsmasq-dns-65bd9d88cc-xwzpg\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:46 crc kubenswrapper[4786]: I0313 16:36:46.655909 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.063800 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.109704 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.120910 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bd9d88cc-xwzpg"] Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.133508 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 16:36:47 crc kubenswrapper[4786]: W0313 16:36:47.137118 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd81442_912c_4239_ad09_7b26b3f423ed.slice/crio-32d82c40e3b5c81a437540f11efcacfa7596c693cd51973e9ae344b2501fdc39 WatchSource:0}: Error finding container 32d82c40e3b5c81a437540f11efcacfa7596c693cd51973e9ae344b2501fdc39: Status 404 returned error can't find the container with id 32d82c40e3b5c81a437540f11efcacfa7596c693cd51973e9ae344b2501fdc39 Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.351158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" event={"ID":"4bd81442-912c-4239-ad09-7b26b3f423ed","Type":"ContainerStarted","Data":"4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729"} Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.351193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" event={"ID":"4bd81442-912c-4239-ad09-7b26b3f423ed","Type":"ContainerStarted","Data":"32d82c40e3b5c81a437540f11efcacfa7596c693cd51973e9ae344b2501fdc39"} Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.411007 4786 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.702070 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.769039 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 13 16:36:47 crc kubenswrapper[4786]: I0313 16:36:47.776321 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.364474 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerID="4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729" exitCode=0 Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.364535 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" event={"ID":"4bd81442-912c-4239-ad09-7b26b3f423ed","Type":"ContainerDied","Data":"4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729"} Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.780692 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.845125 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.846947 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.862137 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 13 16:36:48 crc kubenswrapper[4786]: I0313 16:36:48.918804 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" 
Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.083107 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bd9d88cc-xwzpg"] Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.121141 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56ff768df5-zt69v"] Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.122832 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.124459 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.146066 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56ff768df5-zt69v"] Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.259239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-config\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.259322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-nb\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.259353 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-dns-svc\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " 
pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.259405 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-sb\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.259463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rplk\" (UniqueName: \"kubernetes.io/projected/b6119049-0a0d-4811-8d7f-bfbfe13882c8-kube-api-access-2rplk\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.360722 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rplk\" (UniqueName: \"kubernetes.io/projected/b6119049-0a0d-4811-8d7f-bfbfe13882c8-kube-api-access-2rplk\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.360815 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-config\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.360882 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-nb\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " 
pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.360912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-dns-svc\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.360977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-sb\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.361623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-config\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.361678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-nb\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.362036 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-sb\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.362037 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-dns-svc\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.375223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" event={"ID":"4bd81442-912c-4239-ad09-7b26b3f423ed","Type":"ContainerStarted","Data":"d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641"} Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.376227 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.399301 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" podStartSLOduration=3.399279096 podStartE2EDuration="3.399279096s" podCreationTimestamp="2026-03-13 16:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:49.395663197 +0000 UTC m=+5639.558875008" watchObservedRunningTime="2026-03-13 16:36:49.399279096 +0000 UTC m=+5639.562490927" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.400036 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rplk\" (UniqueName: \"kubernetes.io/projected/b6119049-0a0d-4811-8d7f-bfbfe13882c8-kube-api-access-2rplk\") pod \"dnsmasq-dns-56ff768df5-zt69v\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") " pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.416182 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.443061 4786 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:49 crc kubenswrapper[4786]: I0313 16:36:49.887585 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56ff768df5-zt69v"] Mar 13 16:36:49 crc kubenswrapper[4786]: W0313 16:36:49.903313 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6119049_0a0d_4811_8d7f_bfbfe13882c8.slice/crio-f0f29df52c4f00ca46a2c115c25992ecd22a5b58c7a70bac6ec99735b59d843f WatchSource:0}: Error finding container f0f29df52c4f00ca46a2c115c25992ecd22a5b58c7a70bac6ec99735b59d843f: Status 404 returned error can't find the container with id f0f29df52c4f00ca46a2c115c25992ecd22a5b58c7a70bac6ec99735b59d843f Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.386343 4786 generic.go:334] "Generic (PLEG): container finished" podID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerID="bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792" exitCode=0 Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.386457 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" event={"ID":"b6119049-0a0d-4811-8d7f-bfbfe13882c8","Type":"ContainerDied","Data":"bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792"} Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.386515 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" event={"ID":"b6119049-0a0d-4811-8d7f-bfbfe13882c8","Type":"ContainerStarted","Data":"f0f29df52c4f00ca46a2c115c25992ecd22a5b58c7a70bac6ec99735b59d843f"} Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.386825 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerName="dnsmasq-dns" 
containerID="cri-o://d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641" gracePeriod=10 Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.853040 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.889257 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vsld\" (UniqueName: \"kubernetes.io/projected/4bd81442-912c-4239-ad09-7b26b3f423ed-kube-api-access-8vsld\") pod \"4bd81442-912c-4239-ad09-7b26b3f423ed\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.889369 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-config\") pod \"4bd81442-912c-4239-ad09-7b26b3f423ed\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.889411 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-dns-svc\") pod \"4bd81442-912c-4239-ad09-7b26b3f423ed\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.889452 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-ovsdbserver-nb\") pod \"4bd81442-912c-4239-ad09-7b26b3f423ed\" (UID: \"4bd81442-912c-4239-ad09-7b26b3f423ed\") " Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.908880 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd81442-912c-4239-ad09-7b26b3f423ed-kube-api-access-8vsld" (OuterVolumeSpecName: "kube-api-access-8vsld") pod 
"4bd81442-912c-4239-ad09-7b26b3f423ed" (UID: "4bd81442-912c-4239-ad09-7b26b3f423ed"). InnerVolumeSpecName "kube-api-access-8vsld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.947258 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4bd81442-912c-4239-ad09-7b26b3f423ed" (UID: "4bd81442-912c-4239-ad09-7b26b3f423ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.966604 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4bd81442-912c-4239-ad09-7b26b3f423ed" (UID: "4bd81442-912c-4239-ad09-7b26b3f423ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.968130 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-config" (OuterVolumeSpecName: "config") pod "4bd81442-912c-4239-ad09-7b26b3f423ed" (UID: "4bd81442-912c-4239-ad09-7b26b3f423ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.991506 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.991539 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vsld\" (UniqueName: \"kubernetes.io/projected/4bd81442-912c-4239-ad09-7b26b3f423ed-kube-api-access-8vsld\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.991554 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:50 crc kubenswrapper[4786]: I0313 16:36:50.991566 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bd81442-912c-4239-ad09-7b26b3f423ed-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.398514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" event={"ID":"b6119049-0a0d-4811-8d7f-bfbfe13882c8","Type":"ContainerStarted","Data":"3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe"} Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.399478 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.402197 4786 generic.go:334] "Generic (PLEG): container finished" podID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerID="d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641" exitCode=0 Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.402228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" event={"ID":"4bd81442-912c-4239-ad09-7b26b3f423ed","Type":"ContainerDied","Data":"d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641"} Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.402258 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" event={"ID":"4bd81442-912c-4239-ad09-7b26b3f423ed","Type":"ContainerDied","Data":"32d82c40e3b5c81a437540f11efcacfa7596c693cd51973e9ae344b2501fdc39"} Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.402278 4786 scope.go:117] "RemoveContainer" containerID="d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.402344 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bd9d88cc-xwzpg" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.424340 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" podStartSLOduration=2.42432538 podStartE2EDuration="2.42432538s" podCreationTimestamp="2026-03-13 16:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:36:51.420008984 +0000 UTC m=+5641.583220795" watchObservedRunningTime="2026-03-13 16:36:51.42432538 +0000 UTC m=+5641.587537181" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.453448 4786 scope.go:117] "RemoveContainer" containerID="4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.462491 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bd9d88cc-xwzpg"] Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.469323 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65bd9d88cc-xwzpg"] Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 
16:36:51.487003 4786 scope.go:117] "RemoveContainer" containerID="d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641" Mar 13 16:36:51 crc kubenswrapper[4786]: E0313 16:36:51.487423 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641\": container with ID starting with d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641 not found: ID does not exist" containerID="d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.487454 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641"} err="failed to get container status \"d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641\": rpc error: code = NotFound desc = could not find container \"d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641\": container with ID starting with d6e26250ed8f00e5daa2300d14842335478daee6fd3ce9ae08f7a1c3eb0c8641 not found: ID does not exist" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.487475 4786 scope.go:117] "RemoveContainer" containerID="4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729" Mar 13 16:36:51 crc kubenswrapper[4786]: E0313 16:36:51.487682 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729\": container with ID starting with 4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729 not found: ID does not exist" containerID="4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.487714 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729"} err="failed to get container status \"4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729\": rpc error: code = NotFound desc = could not find container \"4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729\": container with ID starting with 4ac080093f906d1b11e0f47e216fff9b4f160d9b7e4cf17fc5b4b22fbce67729 not found: ID does not exist" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.551500 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:36:51 crc kubenswrapper[4786]: E0313 16:36:51.551717 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.635891 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 13 16:36:51 crc kubenswrapper[4786]: E0313 16:36:51.636201 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerName="init" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.636216 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerName="init" Mar 13 16:36:51 crc kubenswrapper[4786]: E0313 16:36:51.636239 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerName="dnsmasq-dns" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.636246 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerName="dnsmasq-dns" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.636385 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" containerName="dnsmasq-dns" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.636933 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.638837 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.645123 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.709571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtwnd\" (UniqueName: \"kubernetes.io/projected/afa86413-0653-43ff-9f79-327d16b5ea3c-kube-api-access-qtwnd\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.709631 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.709676 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/afa86413-0653-43ff-9f79-327d16b5ea3c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.811340 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtwnd\" (UniqueName: \"kubernetes.io/projected/afa86413-0653-43ff-9f79-327d16b5ea3c-kube-api-access-qtwnd\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.811392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.811435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/afa86413-0653-43ff-9f79-327d16b5ea3c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.816405 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.816451 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fabe6cf12e9cac1b30a4961832c3aa970eb0c8d3e3251c896703d078b8d38f14/globalmount\"" pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.823587 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/afa86413-0653-43ff-9f79-327d16b5ea3c-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.827581 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtwnd\" (UniqueName: \"kubernetes.io/projected/afa86413-0653-43ff-9f79-327d16b5ea3c-kube-api-access-qtwnd\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.839805 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6cdd4d8-fa96-4dd8-9aeb-56bff494152a\") pod \"ovn-copy-data\" (UID: \"afa86413-0653-43ff-9f79-327d16b5ea3c\") " pod="openstack/ovn-copy-data" Mar 13 16:36:51 crc kubenswrapper[4786]: I0313 16:36:51.961270 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 13 16:36:52 crc kubenswrapper[4786]: I0313 16:36:52.473517 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 13 16:36:52 crc kubenswrapper[4786]: W0313 16:36:52.481663 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafa86413_0653_43ff_9f79_327d16b5ea3c.slice/crio-377f8880382928b5e74d511a0099ea26c202b7c8c8a3e507ff4015e314b672da WatchSource:0}: Error finding container 377f8880382928b5e74d511a0099ea26c202b7c8c8a3e507ff4015e314b672da: Status 404 returned error can't find the container with id 377f8880382928b5e74d511a0099ea26c202b7c8c8a3e507ff4015e314b672da Mar 13 16:36:52 crc kubenswrapper[4786]: I0313 16:36:52.565075 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd81442-912c-4239-ad09-7b26b3f423ed" path="/var/lib/kubelet/pods/4bd81442-912c-4239-ad09-7b26b3f423ed/volumes" Mar 13 16:36:53 crc kubenswrapper[4786]: I0313 16:36:53.426067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"afa86413-0653-43ff-9f79-327d16b5ea3c","Type":"ContainerStarted","Data":"e78364f19771f537470159d5fbec939ad753c179ad57203cd4cd7c124d1d646e"} Mar 13 16:36:53 crc kubenswrapper[4786]: I0313 16:36:53.426366 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"afa86413-0653-43ff-9f79-327d16b5ea3c","Type":"ContainerStarted","Data":"377f8880382928b5e74d511a0099ea26c202b7c8c8a3e507ff4015e314b672da"} Mar 13 16:36:53 crc kubenswrapper[4786]: I0313 16:36:53.444648 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.937214142 podStartE2EDuration="3.444617906s" podCreationTimestamp="2026-03-13 16:36:50 +0000 UTC" firstStartedPulling="2026-03-13 16:36:52.486527815 +0000 UTC m=+5642.649739636" lastFinishedPulling="2026-03-13 
16:36:52.993931569 +0000 UTC m=+5643.157143400" observedRunningTime="2026-03-13 16:36:53.440471644 +0000 UTC m=+5643.603683455" watchObservedRunningTime="2026-03-13 16:36:53.444617906 +0000 UTC m=+5643.607829757" Mar 13 16:36:57 crc kubenswrapper[4786]: E0313 16:36:57.352948 4786 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.12:59570->38.102.83.12:40601: read tcp 38.102.83.12:59570->38.102.83.12:40601: read: connection reset by peer Mar 13 16:36:57 crc kubenswrapper[4786]: E0313 16:36:57.972726 4786 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.12:59576->38.102.83.12:40601: write tcp 38.102.83.12:59576->38.102.83.12:40601: write: broken pipe Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.663355 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.665664 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.668941 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.669125 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.669318 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.669346 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-drmtk" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.715166 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.741358 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87781344-d04e-483c-a281-8dfb63ec64b9-scripts\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.741434 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87781344-d04e-483c-a281-8dfb63ec64b9-config\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.741575 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.741648 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.741721 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.742301 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6wz\" (UniqueName: 
\"kubernetes.io/projected/87781344-d04e-483c-a281-8dfb63ec64b9-kube-api-access-fn6wz\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.742475 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87781344-d04e-483c-a281-8dfb63ec64b9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.842955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87781344-d04e-483c-a281-8dfb63ec64b9-scripts\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.842997 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87781344-d04e-483c-a281-8dfb63ec64b9-config\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.843027 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.843049 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc 
kubenswrapper[4786]: I0313 16:36:58.843073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.843112 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6wz\" (UniqueName: \"kubernetes.io/projected/87781344-d04e-483c-a281-8dfb63ec64b9-kube-api-access-fn6wz\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.843149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87781344-d04e-483c-a281-8dfb63ec64b9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.843870 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87781344-d04e-483c-a281-8dfb63ec64b9-scripts\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.844140 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87781344-d04e-483c-a281-8dfb63ec64b9-config\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.844143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87781344-d04e-483c-a281-8dfb63ec64b9-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.850510 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.850605 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.863147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87781344-d04e-483c-a281-8dfb63ec64b9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:58 crc kubenswrapper[4786]: I0313 16:36:58.875087 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6wz\" (UniqueName: \"kubernetes.io/projected/87781344-d04e-483c-a281-8dfb63ec64b9-kube-api-access-fn6wz\") pod \"ovn-northd-0\" (UID: \"87781344-d04e-483c-a281-8dfb63ec64b9\") " pod="openstack/ovn-northd-0" Mar 13 16:36:59 crc kubenswrapper[4786]: I0313 16:36:59.003490 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 16:36:59 crc kubenswrapper[4786]: I0313 16:36:59.446031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" Mar 13 16:36:59 crc kubenswrapper[4786]: I0313 16:36:59.502712 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-bsvld"] Mar 13 16:36:59 crc kubenswrapper[4786]: I0313 16:36:59.503023 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" podUID="d61bb448-910e-4e1c-a869-c053109566fd" containerName="dnsmasq-dns" containerID="cri-o://a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376" gracePeriod=10 Mar 13 16:36:59 crc kubenswrapper[4786]: I0313 16:36:59.622947 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 16:36:59 crc kubenswrapper[4786]: W0313 16:36:59.642921 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87781344_d04e_483c_a281_8dfb63ec64b9.slice/crio-d2a3afda5798bbde1ea057bd322fca95a65874553e03fa2edbb4f1b8732605dc WatchSource:0}: Error finding container d2a3afda5798bbde1ea057bd322fca95a65874553e03fa2edbb4f1b8732605dc: Status 404 returned error can't find the container with id d2a3afda5798bbde1ea057bd322fca95a65874553e03fa2edbb4f1b8732605dc Mar 13 16:36:59 crc kubenswrapper[4786]: I0313 16:36:59.979441 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.171004 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-dns-svc\") pod \"d61bb448-910e-4e1c-a869-c053109566fd\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.171376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-config\") pod \"d61bb448-910e-4e1c-a869-c053109566fd\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.171471 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbhmm\" (UniqueName: \"kubernetes.io/projected/d61bb448-910e-4e1c-a869-c053109566fd-kube-api-access-tbhmm\") pod \"d61bb448-910e-4e1c-a869-c053109566fd\" (UID: \"d61bb448-910e-4e1c-a869-c053109566fd\") " Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.176416 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61bb448-910e-4e1c-a869-c053109566fd-kube-api-access-tbhmm" (OuterVolumeSpecName: "kube-api-access-tbhmm") pod "d61bb448-910e-4e1c-a869-c053109566fd" (UID: "d61bb448-910e-4e1c-a869-c053109566fd"). InnerVolumeSpecName "kube-api-access-tbhmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.209143 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d61bb448-910e-4e1c-a869-c053109566fd" (UID: "d61bb448-910e-4e1c-a869-c053109566fd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.220290 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-config" (OuterVolumeSpecName: "config") pod "d61bb448-910e-4e1c-a869-c053109566fd" (UID: "d61bb448-910e-4e1c-a869-c053109566fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.273339 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.273375 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61bb448-910e-4e1c-a869-c053109566fd-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.273390 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbhmm\" (UniqueName: \"kubernetes.io/projected/d61bb448-910e-4e1c-a869-c053109566fd-kube-api-access-tbhmm\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.502951 4786 generic.go:334] "Generic (PLEG): container finished" podID="d61bb448-910e-4e1c-a869-c053109566fd" containerID="a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376" exitCode=0 Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.503041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" event={"ID":"d61bb448-910e-4e1c-a869-c053109566fd","Type":"ContainerDied","Data":"a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376"} Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.503077 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" 
event={"ID":"d61bb448-910e-4e1c-a869-c053109566fd","Type":"ContainerDied","Data":"99602dfef1c06d537dc47312731993359f4d64f0b49f0d7c36ebad702ccd925e"} Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.503099 4786 scope.go:117] "RemoveContainer" containerID="a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.503120 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684c864bc9-bsvld" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.505551 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87781344-d04e-483c-a281-8dfb63ec64b9","Type":"ContainerStarted","Data":"12db6c3db52815f599adafb25ea23905b5649870d9919b45c7af3cfc10fff7c6"} Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.505621 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87781344-d04e-483c-a281-8dfb63ec64b9","Type":"ContainerStarted","Data":"bc41d7774ef415342c23370196e47ea4548c5b70cc0ac6e5dbe6574edbc27485"} Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.505648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87781344-d04e-483c-a281-8dfb63ec64b9","Type":"ContainerStarted","Data":"d2a3afda5798bbde1ea057bd322fca95a65874553e03fa2edbb4f1b8732605dc"} Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.506018 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.535131 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.535106408 podStartE2EDuration="2.535106408s" podCreationTimestamp="2026-03-13 16:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
16:37:00.528008743 +0000 UTC m=+5650.691220614" watchObservedRunningTime="2026-03-13 16:37:00.535106408 +0000 UTC m=+5650.698318219" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.544292 4786 scope.go:117] "RemoveContainer" containerID="c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.569775 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-bsvld"] Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.579131 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-684c864bc9-bsvld"] Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.581701 4786 scope.go:117] "RemoveContainer" containerID="a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376" Mar 13 16:37:00 crc kubenswrapper[4786]: E0313 16:37:00.582215 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376\": container with ID starting with a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376 not found: ID does not exist" containerID="a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.582278 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376"} err="failed to get container status \"a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376\": rpc error: code = NotFound desc = could not find container \"a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376\": container with ID starting with a3b7296407a5dcd9067f8400eb8137140ee03448694cf6dac9a00c3599277376 not found: ID does not exist" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.582322 4786 scope.go:117] "RemoveContainer" 
containerID="c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d" Mar 13 16:37:00 crc kubenswrapper[4786]: E0313 16:37:00.582783 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d\": container with ID starting with c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d not found: ID does not exist" containerID="c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d" Mar 13 16:37:00 crc kubenswrapper[4786]: I0313 16:37:00.582817 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d"} err="failed to get container status \"c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d\": rpc error: code = NotFound desc = could not find container \"c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d\": container with ID starting with c31e9388fab5fa2b58fd155466436cae2dd7830855f288fc59eacf1c40d7b44d not found: ID does not exist" Mar 13 16:37:02 crc kubenswrapper[4786]: I0313 16:37:02.571010 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61bb448-910e-4e1c-a869-c053109566fd" path="/var/lib/kubelet/pods/d61bb448-910e-4e1c-a869-c053109566fd/volumes" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.055382 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gztgw"] Mar 13 16:37:04 crc kubenswrapper[4786]: E0313 16:37:04.056216 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61bb448-910e-4e1c-a869-c053109566fd" containerName="dnsmasq-dns" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.056236 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61bb448-910e-4e1c-a869-c053109566fd" containerName="dnsmasq-dns" Mar 13 16:37:04 crc kubenswrapper[4786]: E0313 16:37:04.056260 
4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61bb448-910e-4e1c-a869-c053109566fd" containerName="init" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.056267 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61bb448-910e-4e1c-a869-c053109566fd" containerName="init" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.056457 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61bb448-910e-4e1c-a869-c053109566fd" containerName="dnsmasq-dns" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.056970 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.063044 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-51aa-account-create-update-k7cfk"] Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.064371 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.065787 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.074477 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-51aa-account-create-update-k7cfk"] Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.112227 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gztgw"] Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.146232 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qktkz\" (UniqueName: \"kubernetes.io/projected/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-kube-api-access-qktkz\") pod \"keystone-db-create-gztgw\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc 
kubenswrapper[4786]: I0313 16:37:04.146606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgcrk\" (UniqueName: \"kubernetes.io/projected/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-kube-api-access-rgcrk\") pod \"keystone-51aa-account-create-update-k7cfk\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.146693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-operator-scripts\") pod \"keystone-51aa-account-create-update-k7cfk\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.146835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-operator-scripts\") pod \"keystone-db-create-gztgw\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.248525 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-operator-scripts\") pod \"keystone-db-create-gztgw\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.248657 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qktkz\" (UniqueName: \"kubernetes.io/projected/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-kube-api-access-qktkz\") pod \"keystone-db-create-gztgw\" (UID: 
\"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.248731 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgcrk\" (UniqueName: \"kubernetes.io/projected/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-kube-api-access-rgcrk\") pod \"keystone-51aa-account-create-update-k7cfk\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.248767 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-operator-scripts\") pod \"keystone-51aa-account-create-update-k7cfk\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.249495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-operator-scripts\") pod \"keystone-db-create-gztgw\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.249755 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-operator-scripts\") pod \"keystone-51aa-account-create-update-k7cfk\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.272474 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qktkz\" (UniqueName: \"kubernetes.io/projected/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-kube-api-access-qktkz\") pod 
\"keystone-db-create-gztgw\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.276945 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgcrk\" (UniqueName: \"kubernetes.io/projected/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-kube-api-access-rgcrk\") pod \"keystone-51aa-account-create-update-k7cfk\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.377938 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.396840 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.892134 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gztgw"] Mar 13 16:37:04 crc kubenswrapper[4786]: W0313 16:37:04.893101 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64219e76_8dbd_4a87_8b6e_a6cc0818f23e.slice/crio-be284c37097c9acf3c2b561cd669ad0853568195a6f64d7bb5286bfbd8ab539f WatchSource:0}: Error finding container be284c37097c9acf3c2b561cd669ad0853568195a6f64d7bb5286bfbd8ab539f: Status 404 returned error can't find the container with id be284c37097c9acf3c2b561cd669ad0853568195a6f64d7bb5286bfbd8ab539f Mar 13 16:37:04 crc kubenswrapper[4786]: I0313 16:37:04.947579 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-51aa-account-create-update-k7cfk"] Mar 13 16:37:04 crc kubenswrapper[4786]: W0313 16:37:04.954443 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba347180_a21a_4e7b_be37_99d9f2ce2cdb.slice/crio-5a37a8413242817cd40b0b7f9190684325c9b838026924c56c914ddb98190249 WatchSource:0}: Error finding container 5a37a8413242817cd40b0b7f9190684325c9b838026924c56c914ddb98190249: Status 404 returned error can't find the container with id 5a37a8413242817cd40b0b7f9190684325c9b838026924c56c914ddb98190249 Mar 13 16:37:05 crc kubenswrapper[4786]: I0313 16:37:05.552310 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:37:05 crc kubenswrapper[4786]: E0313 16:37:05.552677 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:37:05 crc kubenswrapper[4786]: I0313 16:37:05.570102 4786 generic.go:334] "Generic (PLEG): container finished" podID="64219e76-8dbd-4a87-8b6e-a6cc0818f23e" containerID="cc0e2d7576cc4e62a498528df1756ac070f13283e5a8e61a6fb19cdebe4040b1" exitCode=0 Mar 13 16:37:05 crc kubenswrapper[4786]: I0313 16:37:05.570176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gztgw" event={"ID":"64219e76-8dbd-4a87-8b6e-a6cc0818f23e","Type":"ContainerDied","Data":"cc0e2d7576cc4e62a498528df1756ac070f13283e5a8e61a6fb19cdebe4040b1"} Mar 13 16:37:05 crc kubenswrapper[4786]: I0313 16:37:05.570199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gztgw" event={"ID":"64219e76-8dbd-4a87-8b6e-a6cc0818f23e","Type":"ContainerStarted","Data":"be284c37097c9acf3c2b561cd669ad0853568195a6f64d7bb5286bfbd8ab539f"} Mar 13 16:37:05 crc 
kubenswrapper[4786]: I0313 16:37:05.572366 4786 generic.go:334] "Generic (PLEG): container finished" podID="ba347180-a21a-4e7b-be37-99d9f2ce2cdb" containerID="fdf08bab267ccb7f6152d257e3ce4bf692ff63df2a66cf568de98dd900984de6" exitCode=0 Mar 13 16:37:05 crc kubenswrapper[4786]: I0313 16:37:05.572423 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-51aa-account-create-update-k7cfk" event={"ID":"ba347180-a21a-4e7b-be37-99d9f2ce2cdb","Type":"ContainerDied","Data":"fdf08bab267ccb7f6152d257e3ce4bf692ff63df2a66cf568de98dd900984de6"} Mar 13 16:37:05 crc kubenswrapper[4786]: I0313 16:37:05.572458 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-51aa-account-create-update-k7cfk" event={"ID":"ba347180-a21a-4e7b-be37-99d9f2ce2cdb","Type":"ContainerStarted","Data":"5a37a8413242817cd40b0b7f9190684325c9b838026924c56c914ddb98190249"} Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.050407 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.066805 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.106544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-operator-scripts\") pod \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.106948 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-operator-scripts\") pod \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.106988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64219e76-8dbd-4a87-8b6e-a6cc0818f23e" (UID: "64219e76-8dbd-4a87-8b6e-a6cc0818f23e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.107032 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgcrk\" (UniqueName: \"kubernetes.io/projected/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-kube-api-access-rgcrk\") pod \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\" (UID: \"ba347180-a21a-4e7b-be37-99d9f2ce2cdb\") " Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.107068 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qktkz\" (UniqueName: \"kubernetes.io/projected/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-kube-api-access-qktkz\") pod \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\" (UID: \"64219e76-8dbd-4a87-8b6e-a6cc0818f23e\") " Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.107453 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.107467 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba347180-a21a-4e7b-be37-99d9f2ce2cdb" (UID: "ba347180-a21a-4e7b-be37-99d9f2ce2cdb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.117342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-kube-api-access-qktkz" (OuterVolumeSpecName: "kube-api-access-qktkz") pod "64219e76-8dbd-4a87-8b6e-a6cc0818f23e" (UID: "64219e76-8dbd-4a87-8b6e-a6cc0818f23e"). InnerVolumeSpecName "kube-api-access-qktkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.118182 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-kube-api-access-rgcrk" (OuterVolumeSpecName: "kube-api-access-rgcrk") pod "ba347180-a21a-4e7b-be37-99d9f2ce2cdb" (UID: "ba347180-a21a-4e7b-be37-99d9f2ce2cdb"). InnerVolumeSpecName "kube-api-access-rgcrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.209805 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgcrk\" (UniqueName: \"kubernetes.io/projected/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-kube-api-access-rgcrk\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.209853 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qktkz\" (UniqueName: \"kubernetes.io/projected/64219e76-8dbd-4a87-8b6e-a6cc0818f23e-kube-api-access-qktkz\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.209894 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba347180-a21a-4e7b-be37-99d9f2ce2cdb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.593220 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gztgw" event={"ID":"64219e76-8dbd-4a87-8b6e-a6cc0818f23e","Type":"ContainerDied","Data":"be284c37097c9acf3c2b561cd669ad0853568195a6f64d7bb5286bfbd8ab539f"} Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.593254 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gztgw" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.593269 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be284c37097c9acf3c2b561cd669ad0853568195a6f64d7bb5286bfbd8ab539f" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.597248 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-51aa-account-create-update-k7cfk" Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.597231 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-51aa-account-create-update-k7cfk" event={"ID":"ba347180-a21a-4e7b-be37-99d9f2ce2cdb","Type":"ContainerDied","Data":"5a37a8413242817cd40b0b7f9190684325c9b838026924c56c914ddb98190249"} Mar 13 16:37:07 crc kubenswrapper[4786]: I0313 16:37:07.597415 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a37a8413242817cd40b0b7f9190684325c9b838026924c56c914ddb98190249" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.098184 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.627022 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-chmdg"] Mar 13 16:37:09 crc kubenswrapper[4786]: E0313 16:37:09.627412 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba347180-a21a-4e7b-be37-99d9f2ce2cdb" containerName="mariadb-account-create-update" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.627436 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba347180-a21a-4e7b-be37-99d9f2ce2cdb" containerName="mariadb-account-create-update" Mar 13 16:37:09 crc kubenswrapper[4786]: E0313 16:37:09.627469 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64219e76-8dbd-4a87-8b6e-a6cc0818f23e" containerName="mariadb-database-create" Mar 13 
16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.627478 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="64219e76-8dbd-4a87-8b6e-a6cc0818f23e" containerName="mariadb-database-create" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.627651 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="64219e76-8dbd-4a87-8b6e-a6cc0818f23e" containerName="mariadb-database-create" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.627680 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba347180-a21a-4e7b-be37-99d9f2ce2cdb" containerName="mariadb-account-create-update" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.628273 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.630335 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.630461 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9rcs9" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.630473 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.630533 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.654327 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-chmdg"] Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.756568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2shk\" (UniqueName: \"kubernetes.io/projected/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-kube-api-access-c2shk\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " 
pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.757484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-combined-ca-bundle\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.757829 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-config-data\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.859743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2shk\" (UniqueName: \"kubernetes.io/projected/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-kube-api-access-c2shk\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.860056 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-combined-ca-bundle\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.860290 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-config-data\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc 
kubenswrapper[4786]: I0313 16:37:09.867024 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-combined-ca-bundle\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.868500 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-config-data\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.876901 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2shk\" (UniqueName: \"kubernetes.io/projected/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-kube-api-access-c2shk\") pod \"keystone-db-sync-chmdg\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:09 crc kubenswrapper[4786]: I0313 16:37:09.950339 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:10 crc kubenswrapper[4786]: I0313 16:37:10.412379 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-chmdg"] Mar 13 16:37:10 crc kubenswrapper[4786]: I0313 16:37:10.621157 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-chmdg" event={"ID":"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91","Type":"ContainerStarted","Data":"3838c659a549c7e7c045b5cd9d636f886abfeb6f13aa9ce9e948cc7bf18c231c"} Mar 13 16:37:10 crc kubenswrapper[4786]: I0313 16:37:10.621200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-chmdg" event={"ID":"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91","Type":"ContainerStarted","Data":"b11df7f9df43411300a3d5bcd23473ee4c388041888be10aeb3e9305f9208835"} Mar 13 16:37:10 crc kubenswrapper[4786]: I0313 16:37:10.638843 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-chmdg" podStartSLOduration=1.638826596 podStartE2EDuration="1.638826596s" podCreationTimestamp="2026-03-13 16:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:37:10.634555401 +0000 UTC m=+5660.797767252" watchObservedRunningTime="2026-03-13 16:37:10.638826596 +0000 UTC m=+5660.802038407" Mar 13 16:37:12 crc kubenswrapper[4786]: I0313 16:37:12.640996 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" containerID="3838c659a549c7e7c045b5cd9d636f886abfeb6f13aa9ce9e948cc7bf18c231c" exitCode=0 Mar 13 16:37:12 crc kubenswrapper[4786]: I0313 16:37:12.641079 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-chmdg" event={"ID":"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91","Type":"ContainerDied","Data":"3838c659a549c7e7c045b5cd9d636f886abfeb6f13aa9ce9e948cc7bf18c231c"} Mar 13 16:37:14 crc 
kubenswrapper[4786]: I0313 16:37:14.065485 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.240973 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-combined-ca-bundle\") pod \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.241140 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-config-data\") pod \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.241240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2shk\" (UniqueName: \"kubernetes.io/projected/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-kube-api-access-c2shk\") pod \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\" (UID: \"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91\") " Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.254190 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-kube-api-access-c2shk" (OuterVolumeSpecName: "kube-api-access-c2shk") pod "2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" (UID: "2ea63697-eb03-42fe-b25f-cb6b5e2d7a91"). InnerVolumeSpecName "kube-api-access-c2shk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.273165 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" (UID: "2ea63697-eb03-42fe-b25f-cb6b5e2d7a91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.315400 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-config-data" (OuterVolumeSpecName: "config-data") pod "2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" (UID: "2ea63697-eb03-42fe-b25f-cb6b5e2d7a91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.345019 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.345241 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2shk\" (UniqueName: \"kubernetes.io/projected/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-kube-api-access-c2shk\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.346797 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.422318 4786 scope.go:117] "RemoveContainer" containerID="794477728eded07d73ed0fdd30b7ab2a4dc285d047896b7e45c9b348b0629cc6" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.664911 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-chmdg" event={"ID":"2ea63697-eb03-42fe-b25f-cb6b5e2d7a91","Type":"ContainerDied","Data":"b11df7f9df43411300a3d5bcd23473ee4c388041888be10aeb3e9305f9208835"} Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.665271 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11df7f9df43411300a3d5bcd23473ee4c388041888be10aeb3e9305f9208835" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.665348 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-chmdg" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.937090 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f78d9d695-wqfkf"] Mar 13 16:37:14 crc kubenswrapper[4786]: E0313 16:37:14.937481 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" containerName="keystone-db-sync" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.937502 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" containerName="keystone-db-sync" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.937707 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" containerName="keystone-db-sync" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.938794 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.963614 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f78d9d695-wqfkf"] Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.980092 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-scvvm"] Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.981510 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.985605 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.985922 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.986190 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9rcs9" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.987122 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.987291 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 16:37:14 crc kubenswrapper[4786]: I0313 16:37:14.993923 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-scvvm"] Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.065797 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-sb\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 
16:37:15.065903 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-config\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.065944 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-dns-svc\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.065978 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-nb\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.066021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n4jf\" (UniqueName: \"kubernetes.io/projected/c22e3434-1abb-49b3-916d-0d36fa09b794-kube-api-access-7n4jf\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167233 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-sb\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 
16:37:15.167305 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqj2p\" (UniqueName: \"kubernetes.io/projected/031268dd-996f-41fa-8c05-661cb674991f-kube-api-access-vqj2p\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167346 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-config\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167389 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-dns-svc\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167413 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-combined-ca-bundle\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167627 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-fernet-keys\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167765 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-nb\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167838 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-config-data\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.167948 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-scripts\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.168020 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-credential-keys\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.168095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n4jf\" (UniqueName: \"kubernetes.io/projected/c22e3434-1abb-49b3-916d-0d36fa09b794-kube-api-access-7n4jf\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.168374 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-dns-svc\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.168452 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-sb\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.168469 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-config\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.168475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-nb\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.201819 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n4jf\" (UniqueName: \"kubernetes.io/projected/c22e3434-1abb-49b3-916d-0d36fa09b794-kube-api-access-7n4jf\") pod \"dnsmasq-dns-f78d9d695-wqfkf\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.263965 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.269213 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-fernet-keys\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.269255 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-combined-ca-bundle\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.269297 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-config-data\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.269340 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-scripts\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.269398 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-credential-keys\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 
16:37:15.269478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqj2p\" (UniqueName: \"kubernetes.io/projected/031268dd-996f-41fa-8c05-661cb674991f-kube-api-access-vqj2p\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.274707 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-config-data\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.279336 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-credential-keys\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.279573 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-scripts\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.289006 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-combined-ca-bundle\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.294479 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqj2p\" (UniqueName: 
\"kubernetes.io/projected/031268dd-996f-41fa-8c05-661cb674991f-kube-api-access-vqj2p\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.294837 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-fernet-keys\") pod \"keystone-bootstrap-scvvm\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.335334 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.805053 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f78d9d695-wqfkf"] Mar 13 16:37:15 crc kubenswrapper[4786]: I0313 16:37:15.817324 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-scvvm"] Mar 13 16:37:16 crc kubenswrapper[4786]: I0313 16:37:16.683005 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-scvvm" event={"ID":"031268dd-996f-41fa-8c05-661cb674991f","Type":"ContainerStarted","Data":"d8fa9696b8f2e0decbc7c6db8476fd4b4c3fa2c6de75f1eb17187c3a2c4440ed"} Mar 13 16:37:16 crc kubenswrapper[4786]: I0313 16:37:16.683316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-scvvm" event={"ID":"031268dd-996f-41fa-8c05-661cb674991f","Type":"ContainerStarted","Data":"c2d14dfe3fd90bc406319056851603558d325709bbcc26d19ab853ac695d1ee7"} Mar 13 16:37:16 crc kubenswrapper[4786]: I0313 16:37:16.687427 4786 generic.go:334] "Generic (PLEG): container finished" podID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerID="a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98" exitCode=0 Mar 13 16:37:16 crc kubenswrapper[4786]: 
I0313 16:37:16.687480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" event={"ID":"c22e3434-1abb-49b3-916d-0d36fa09b794","Type":"ContainerDied","Data":"a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98"} Mar 13 16:37:16 crc kubenswrapper[4786]: I0313 16:37:16.687576 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" event={"ID":"c22e3434-1abb-49b3-916d-0d36fa09b794","Type":"ContainerStarted","Data":"21f9f6d85aee3e1133d85cd6982e48cb968cf85f02773f6b4fc69bfe18dc3ae7"} Mar 13 16:37:16 crc kubenswrapper[4786]: I0313 16:37:16.709442 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-scvvm" podStartSLOduration=2.709419746 podStartE2EDuration="2.709419746s" podCreationTimestamp="2026-03-13 16:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:37:16.69862691 +0000 UTC m=+5666.861838761" watchObservedRunningTime="2026-03-13 16:37:16.709419746 +0000 UTC m=+5666.872631567" Mar 13 16:37:17 crc kubenswrapper[4786]: I0313 16:37:17.700993 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" event={"ID":"c22e3434-1abb-49b3-916d-0d36fa09b794","Type":"ContainerStarted","Data":"375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567"} Mar 13 16:37:17 crc kubenswrapper[4786]: I0313 16:37:17.701272 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:37:17 crc kubenswrapper[4786]: I0313 16:37:17.719706 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" podStartSLOduration=3.7196712610000002 podStartE2EDuration="3.719671261s" podCreationTimestamp="2026-03-13 16:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:37:17.719168928 +0000 UTC m=+5667.882380749" watchObservedRunningTime="2026-03-13 16:37:17.719671261 +0000 UTC m=+5667.882883112" Mar 13 16:37:19 crc kubenswrapper[4786]: I0313 16:37:19.551618 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:37:19 crc kubenswrapper[4786]: E0313 16:37:19.610129 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031268dd_996f_41fa_8c05_661cb674991f.slice/crio-d8fa9696b8f2e0decbc7c6db8476fd4b4c3fa2c6de75f1eb17187c3a2c4440ed.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031268dd_996f_41fa_8c05_661cb674991f.slice/crio-conmon-d8fa9696b8f2e0decbc7c6db8476fd4b4c3fa2c6de75f1eb17187c3a2c4440ed.scope\": RecentStats: unable to find data in memory cache]" Mar 13 16:37:19 crc kubenswrapper[4786]: I0313 16:37:19.731839 4786 generic.go:334] "Generic (PLEG): container finished" podID="031268dd-996f-41fa-8c05-661cb674991f" containerID="d8fa9696b8f2e0decbc7c6db8476fd4b4c3fa2c6de75f1eb17187c3a2c4440ed" exitCode=0 Mar 13 16:37:19 crc kubenswrapper[4786]: I0313 16:37:19.731897 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-scvvm" event={"ID":"031268dd-996f-41fa-8c05-661cb674991f","Type":"ContainerDied","Data":"d8fa9696b8f2e0decbc7c6db8476fd4b4c3fa2c6de75f1eb17187c3a2c4440ed"} Mar 13 16:37:20 crc kubenswrapper[4786]: I0313 16:37:20.745952 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"06f63c372c8088421f902d282e4d63c4055021fcedd22c0ec607e784bab36af6"} Mar 13 16:37:21 crc 
kubenswrapper[4786]: I0313 16:37:21.185774 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.305572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-combined-ca-bundle\") pod \"031268dd-996f-41fa-8c05-661cb674991f\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.306051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqj2p\" (UniqueName: \"kubernetes.io/projected/031268dd-996f-41fa-8c05-661cb674991f-kube-api-access-vqj2p\") pod \"031268dd-996f-41fa-8c05-661cb674991f\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.306102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-scripts\") pod \"031268dd-996f-41fa-8c05-661cb674991f\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.306177 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-fernet-keys\") pod \"031268dd-996f-41fa-8c05-661cb674991f\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.307666 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-config-data\") pod \"031268dd-996f-41fa-8c05-661cb674991f\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.307703 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-credential-keys\") pod \"031268dd-996f-41fa-8c05-661cb674991f\" (UID: \"031268dd-996f-41fa-8c05-661cb674991f\") " Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.314099 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "031268dd-996f-41fa-8c05-661cb674991f" (UID: "031268dd-996f-41fa-8c05-661cb674991f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.314229 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "031268dd-996f-41fa-8c05-661cb674991f" (UID: "031268dd-996f-41fa-8c05-661cb674991f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.314454 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031268dd-996f-41fa-8c05-661cb674991f-kube-api-access-vqj2p" (OuterVolumeSpecName: "kube-api-access-vqj2p") pod "031268dd-996f-41fa-8c05-661cb674991f" (UID: "031268dd-996f-41fa-8c05-661cb674991f"). InnerVolumeSpecName "kube-api-access-vqj2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.315273 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-scripts" (OuterVolumeSpecName: "scripts") pod "031268dd-996f-41fa-8c05-661cb674991f" (UID: "031268dd-996f-41fa-8c05-661cb674991f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.331500 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-config-data" (OuterVolumeSpecName: "config-data") pod "031268dd-996f-41fa-8c05-661cb674991f" (UID: "031268dd-996f-41fa-8c05-661cb674991f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.349388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "031268dd-996f-41fa-8c05-661cb674991f" (UID: "031268dd-996f-41fa-8c05-661cb674991f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.410577 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.410636 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.410652 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.410667 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqj2p\" (UniqueName: \"kubernetes.io/projected/031268dd-996f-41fa-8c05-661cb674991f-kube-api-access-vqj2p\") on node \"crc\" 
DevicePath \"\"" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.410695 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.410713 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031268dd-996f-41fa-8c05-661cb674991f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.763914 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-scvvm" event={"ID":"031268dd-996f-41fa-8c05-661cb674991f","Type":"ContainerDied","Data":"c2d14dfe3fd90bc406319056851603558d325709bbcc26d19ab853ac695d1ee7"} Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.764267 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d14dfe3fd90bc406319056851603558d325709bbcc26d19ab853ac695d1ee7" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.763988 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-scvvm" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.855204 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-scvvm"] Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.863215 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-scvvm"] Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.926810 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7m2q"] Mar 13 16:37:21 crc kubenswrapper[4786]: E0313 16:37:21.929189 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031268dd-996f-41fa-8c05-661cb674991f" containerName="keystone-bootstrap" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.929391 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="031268dd-996f-41fa-8c05-661cb674991f" containerName="keystone-bootstrap" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.931771 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="031268dd-996f-41fa-8c05-661cb674991f" containerName="keystone-bootstrap" Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.949887 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7m2q"] Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.950120 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.955080 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.955964 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9rcs9"
Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.958589 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.958884 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 16:37:21 crc kubenswrapper[4786]: I0313 16:37:21.959028 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.026156 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-scripts\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.026256 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-fernet-keys\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.026275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-credential-keys\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.026455 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwcf\" (UniqueName: \"kubernetes.io/projected/add6353c-ce2d-41c9-a724-d5c834495653-kube-api-access-xxwcf\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.026533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-config-data\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.026579 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-combined-ca-bundle\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.128325 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-scripts\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.128387 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-fernet-keys\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.128404 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-credential-keys\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.128483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwcf\" (UniqueName: \"kubernetes.io/projected/add6353c-ce2d-41c9-a724-d5c834495653-kube-api-access-xxwcf\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.128511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-config-data\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.128541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-combined-ca-bundle\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.134060 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-combined-ca-bundle\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.134269 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-fernet-keys\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.134918 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-scripts\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.135759 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-credential-keys\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.136975 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-config-data\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.155136 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwcf\" (UniqueName: \"kubernetes.io/projected/add6353c-ce2d-41c9-a724-d5c834495653-kube-api-access-xxwcf\") pod \"keystone-bootstrap-s7m2q\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") " pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.275091 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.565771 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031268dd-996f-41fa-8c05-661cb674991f" path="/var/lib/kubelet/pods/031268dd-996f-41fa-8c05-661cb674991f/volumes"
Mar 13 16:37:22 crc kubenswrapper[4786]: I0313 16:37:22.778308 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7m2q"]
Mar 13 16:37:22 crc kubenswrapper[4786]: W0313 16:37:22.794309 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd6353c_ce2d_41c9_a724_d5c834495653.slice/crio-35b87d88e2fccfbe5332e9c04284dfd8d54e5313d1d6f952bfe4c107db4df370 WatchSource:0}: Error finding container 35b87d88e2fccfbe5332e9c04284dfd8d54e5313d1d6f952bfe4c107db4df370: Status 404 returned error can't find the container with id 35b87d88e2fccfbe5332e9c04284dfd8d54e5313d1d6f952bfe4c107db4df370
Mar 13 16:37:23 crc kubenswrapper[4786]: I0313 16:37:23.799414 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m2q" event={"ID":"add6353c-ce2d-41c9-a724-d5c834495653","Type":"ContainerStarted","Data":"5f594217d322ddbad8b5f3d6331cb156e8fedf101e5c7c7b31b080746007d084"}
Mar 13 16:37:23 crc kubenswrapper[4786]: I0313 16:37:23.799882 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m2q" event={"ID":"add6353c-ce2d-41c9-a724-d5c834495653","Type":"ContainerStarted","Data":"35b87d88e2fccfbe5332e9c04284dfd8d54e5313d1d6f952bfe4c107db4df370"}
Mar 13 16:37:23 crc kubenswrapper[4786]: I0313 16:37:23.839571 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7m2q" podStartSLOduration=2.839541192 podStartE2EDuration="2.839541192s" podCreationTimestamp="2026-03-13 16:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:37:23.817976481 +0000 UTC m=+5673.981188342" watchObservedRunningTime="2026-03-13 16:37:23.839541192 +0000 UTC m=+5674.002753043"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.265758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.348679 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56ff768df5-zt69v"]
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.349040 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerName="dnsmasq-dns" containerID="cri-o://3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe" gracePeriod=10
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.823496 4786 generic.go:334] "Generic (PLEG): container finished" podID="add6353c-ce2d-41c9-a724-d5c834495653" containerID="5f594217d322ddbad8b5f3d6331cb156e8fedf101e5c7c7b31b080746007d084" exitCode=0
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.823723 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m2q" event={"ID":"add6353c-ce2d-41c9-a724-d5c834495653","Type":"ContainerDied","Data":"5f594217d322ddbad8b5f3d6331cb156e8fedf101e5c7c7b31b080746007d084"}
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.828680 4786 generic.go:334] "Generic (PLEG): container finished" podID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerID="3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe" exitCode=0
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.828700 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56ff768df5-zt69v"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.828765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" event={"ID":"b6119049-0a0d-4811-8d7f-bfbfe13882c8","Type":"ContainerDied","Data":"3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe"}
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.828947 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ff768df5-zt69v" event={"ID":"b6119049-0a0d-4811-8d7f-bfbfe13882c8","Type":"ContainerDied","Data":"f0f29df52c4f00ca46a2c115c25992ecd22a5b58c7a70bac6ec99735b59d843f"}
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.828978 4786 scope.go:117] "RemoveContainer" containerID="3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.848204 4786 scope.go:117] "RemoveContainer" containerID="bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.873667 4786 scope.go:117] "RemoveContainer" containerID="3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe"
Mar 13 16:37:25 crc kubenswrapper[4786]: E0313 16:37:25.874388 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe\": container with ID starting with 3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe not found: ID does not exist" containerID="3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.874437 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe"} err="failed to get container status \"3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe\": rpc error: code = NotFound desc = could not find container \"3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe\": container with ID starting with 3ae54edc631a57a5effb08bd30e0e67c6086f6968bad07ba90d09a948b7559fe not found: ID does not exist"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.874467 4786 scope.go:117] "RemoveContainer" containerID="bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792"
Mar 13 16:37:25 crc kubenswrapper[4786]: E0313 16:37:25.874733 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792\": container with ID starting with bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792 not found: ID does not exist" containerID="bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.874771 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792"} err="failed to get container status \"bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792\": rpc error: code = NotFound desc = could not find container \"bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792\": container with ID starting with bde53839d0c18a7e47bae615bc8e99c1328927c51adc95980e4b743a6303e792 not found: ID does not exist"
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.990910 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rplk\" (UniqueName: \"kubernetes.io/projected/b6119049-0a0d-4811-8d7f-bfbfe13882c8-kube-api-access-2rplk\") pod \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") "
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.990972 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-config\") pod \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") "
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.991083 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-sb\") pod \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") "
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.991144 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-dns-svc\") pod \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") "
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.991177 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-nb\") pod \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\" (UID: \"b6119049-0a0d-4811-8d7f-bfbfe13882c8\") "
Mar 13 16:37:25 crc kubenswrapper[4786]: I0313 16:37:25.996927 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6119049-0a0d-4811-8d7f-bfbfe13882c8-kube-api-access-2rplk" (OuterVolumeSpecName: "kube-api-access-2rplk") pod "b6119049-0a0d-4811-8d7f-bfbfe13882c8" (UID: "b6119049-0a0d-4811-8d7f-bfbfe13882c8"). InnerVolumeSpecName "kube-api-access-2rplk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.036780 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6119049-0a0d-4811-8d7f-bfbfe13882c8" (UID: "b6119049-0a0d-4811-8d7f-bfbfe13882c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.046195 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-config" (OuterVolumeSpecName: "config") pod "b6119049-0a0d-4811-8d7f-bfbfe13882c8" (UID: "b6119049-0a0d-4811-8d7f-bfbfe13882c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.052765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6119049-0a0d-4811-8d7f-bfbfe13882c8" (UID: "b6119049-0a0d-4811-8d7f-bfbfe13882c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.054600 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6119049-0a0d-4811-8d7f-bfbfe13882c8" (UID: "b6119049-0a0d-4811-8d7f-bfbfe13882c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.093167 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.093213 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.093227 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.093242 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rplk\" (UniqueName: \"kubernetes.io/projected/b6119049-0a0d-4811-8d7f-bfbfe13882c8-kube-api-access-2rplk\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.093255 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6119049-0a0d-4811-8d7f-bfbfe13882c8-config\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.843960 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56ff768df5-zt69v"
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.888412 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56ff768df5-zt69v"]
Mar 13 16:37:26 crc kubenswrapper[4786]: I0313 16:37:26.897600 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56ff768df5-zt69v"]
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.304349 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.417995 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-config-data\") pod \"add6353c-ce2d-41c9-a724-d5c834495653\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") "
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.418047 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-credential-keys\") pod \"add6353c-ce2d-41c9-a724-d5c834495653\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") "
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.418080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-fernet-keys\") pod \"add6353c-ce2d-41c9-a724-d5c834495653\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") "
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.418145 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-combined-ca-bundle\") pod \"add6353c-ce2d-41c9-a724-d5c834495653\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") "
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.418202 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-scripts\") pod \"add6353c-ce2d-41c9-a724-d5c834495653\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") "
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.418244 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwcf\" (UniqueName: \"kubernetes.io/projected/add6353c-ce2d-41c9-a724-d5c834495653-kube-api-access-xxwcf\") pod \"add6353c-ce2d-41c9-a724-d5c834495653\" (UID: \"add6353c-ce2d-41c9-a724-d5c834495653\") "
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.421769 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add6353c-ce2d-41c9-a724-d5c834495653-kube-api-access-xxwcf" (OuterVolumeSpecName: "kube-api-access-xxwcf") pod "add6353c-ce2d-41c9-a724-d5c834495653" (UID: "add6353c-ce2d-41c9-a724-d5c834495653"). InnerVolumeSpecName "kube-api-access-xxwcf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.422342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "add6353c-ce2d-41c9-a724-d5c834495653" (UID: "add6353c-ce2d-41c9-a724-d5c834495653"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.422878 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-scripts" (OuterVolumeSpecName: "scripts") pod "add6353c-ce2d-41c9-a724-d5c834495653" (UID: "add6353c-ce2d-41c9-a724-d5c834495653"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.423688 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "add6353c-ce2d-41c9-a724-d5c834495653" (UID: "add6353c-ce2d-41c9-a724-d5c834495653"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.437120 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-config-data" (OuterVolumeSpecName: "config-data") pod "add6353c-ce2d-41c9-a724-d5c834495653" (UID: "add6353c-ce2d-41c9-a724-d5c834495653"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.439687 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "add6353c-ce2d-41c9-a724-d5c834495653" (UID: "add6353c-ce2d-41c9-a724-d5c834495653"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.520271 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.520313 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwcf\" (UniqueName: \"kubernetes.io/projected/add6353c-ce2d-41c9-a724-d5c834495653-kube-api-access-xxwcf\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.520329 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.520343 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.520355 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.520366 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add6353c-ce2d-41c9-a724-d5c834495653-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.855293 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m2q" event={"ID":"add6353c-ce2d-41c9-a724-d5c834495653","Type":"ContainerDied","Data":"35b87d88e2fccfbe5332e9c04284dfd8d54e5313d1d6f952bfe4c107db4df370"}
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.855344 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35b87d88e2fccfbe5332e9c04284dfd8d54e5313d1d6f952bfe4c107db4df370"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.855542 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m2q"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.939986 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d9c4d76b9-54mn5"]
Mar 13 16:37:27 crc kubenswrapper[4786]: E0313 16:37:27.940289 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerName="dnsmasq-dns"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.940299 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerName="dnsmasq-dns"
Mar 13 16:37:27 crc kubenswrapper[4786]: E0313 16:37:27.940317 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerName="init"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.940322 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerName="init"
Mar 13 16:37:27 crc kubenswrapper[4786]: E0313 16:37:27.940343 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add6353c-ce2d-41c9-a724-d5c834495653" containerName="keystone-bootstrap"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.940349 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="add6353c-ce2d-41c9-a724-d5c834495653" containerName="keystone-bootstrap"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.940488 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="add6353c-ce2d-41c9-a724-d5c834495653" containerName="keystone-bootstrap"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.940503 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" containerName="dnsmasq-dns"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.941082 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.944600 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.945098 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.945347 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.945561 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9rcs9"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.945604 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.945668 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 13 16:37:27 crc kubenswrapper[4786]: I0313 16:37:27.961652 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d9c4d76b9-54mn5"]
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129664 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-config-data\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-combined-ca-bundle\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129763 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-internal-tls-certs\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129789 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-fernet-keys\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129834 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cmv\" (UniqueName: \"kubernetes.io/projected/1549fff2-8323-404e-a727-80d3bb71f7c8-kube-api-access-74cmv\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-public-tls-certs\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.129971 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-scripts\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.130018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-credential-keys\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.232834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-scripts\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.233012 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-credential-keys\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.233091 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-config-data\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.234168 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-combined-ca-bundle\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.234231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-internal-tls-certs\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.234277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-fernet-keys\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.234413 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cmv\" (UniqueName: \"kubernetes.io/projected/1549fff2-8323-404e-a727-80d3bb71f7c8-kube-api-access-74cmv\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.234554 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-public-tls-certs\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.237589 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-scripts\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.237994 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-config-data\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.237993 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-credential-keys\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.239110 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-internal-tls-certs\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.240470 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-combined-ca-bundle\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.240536 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-fernet-keys\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.250435 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1549fff2-8323-404e-a727-80d3bb71f7c8-public-tls-certs\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.262675 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cmv\" (UniqueName: \"kubernetes.io/projected/1549fff2-8323-404e-a727-80d3bb71f7c8-kube-api-access-74cmv\") pod \"keystone-5d9c4d76b9-54mn5\" (UID: \"1549fff2-8323-404e-a727-80d3bb71f7c8\") " pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.561723 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d9c4d76b9-54mn5"
Mar 13 16:37:28 crc kubenswrapper[4786]: I0313 16:37:28.584536 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6119049-0a0d-4811-8d7f-bfbfe13882c8" path="/var/lib/kubelet/pods/b6119049-0a0d-4811-8d7f-bfbfe13882c8/volumes"
Mar 13 16:37:29 crc kubenswrapper[4786]: I0313 16:37:29.049842 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d9c4d76b9-54mn5"]
Mar 13 16:37:29 crc kubenswrapper[4786]: I0313 16:37:29.873724 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d9c4d76b9-54mn5" event={"ID":"1549fff2-8323-404e-a727-80d3bb71f7c8","Type":"ContainerStarted","Data":"86306af39d0103eed3f3d4a1fe87f7434c348810ea08efb23ed3178939b7c8ce"}
Mar 13 16:37:29 crc kubenswrapper[4786]: I0313 16:37:29.873763 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d9c4d76b9-54mn5" event={"ID":"1549fff2-8323-404e-a727-80d3bb71f7c8","Type":"ContainerStarted","Data":"23d8c0c5b4672475733bb0873f2ff3d9ab61a56335ca589d57610bbf04a91a98"}
Mar 13 16:37:29 crc 
kubenswrapper[4786]: I0313 16:37:29.873920 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d9c4d76b9-54mn5" Mar 13 16:37:29 crc kubenswrapper[4786]: I0313 16:37:29.900302 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d9c4d76b9-54mn5" podStartSLOduration=2.9002800239999997 podStartE2EDuration="2.900280024s" podCreationTimestamp="2026-03-13 16:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:37:29.896892258 +0000 UTC m=+5680.060104089" watchObservedRunningTime="2026-03-13 16:37:29.900280024 +0000 UTC m=+5680.063491845" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.307159 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gffvj"] Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.310230 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.318480 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gffvj"] Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.491159 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-utilities\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.491235 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-catalog-content\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.491619 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtsv8\" (UniqueName: \"kubernetes.io/projected/c16335b4-a7fc-441a-b147-511dc0a49dd2-kube-api-access-qtsv8\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.593915 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-utilities\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.593962 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-catalog-content\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.593984 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtsv8\" (UniqueName: \"kubernetes.io/projected/c16335b4-a7fc-441a-b147-511dc0a49dd2-kube-api-access-qtsv8\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.594466 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-utilities\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.594501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-catalog-content\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.612921 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtsv8\" (UniqueName: \"kubernetes.io/projected/c16335b4-a7fc-441a-b147-511dc0a49dd2-kube-api-access-qtsv8\") pod \"redhat-operators-gffvj\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:31 crc kubenswrapper[4786]: I0313 16:37:31.636811 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:32 crc kubenswrapper[4786]: I0313 16:37:32.072254 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gffvj"] Mar 13 16:37:32 crc kubenswrapper[4786]: W0313 16:37:32.077242 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc16335b4_a7fc_441a_b147_511dc0a49dd2.slice/crio-05c8dbff6233777afc8116ed1c003b0c79798aadb4288da32682ca8ea8c4546d WatchSource:0}: Error finding container 05c8dbff6233777afc8116ed1c003b0c79798aadb4288da32682ca8ea8c4546d: Status 404 returned error can't find the container with id 05c8dbff6233777afc8116ed1c003b0c79798aadb4288da32682ca8ea8c4546d Mar 13 16:37:32 crc kubenswrapper[4786]: I0313 16:37:32.902336 4786 generic.go:334] "Generic (PLEG): container finished" podID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerID="08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0" exitCode=0 Mar 13 16:37:32 crc kubenswrapper[4786]: I0313 16:37:32.902446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gffvj" event={"ID":"c16335b4-a7fc-441a-b147-511dc0a49dd2","Type":"ContainerDied","Data":"08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0"} Mar 13 16:37:32 crc kubenswrapper[4786]: I0313 16:37:32.902648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gffvj" event={"ID":"c16335b4-a7fc-441a-b147-511dc0a49dd2","Type":"ContainerStarted","Data":"05c8dbff6233777afc8116ed1c003b0c79798aadb4288da32682ca8ea8c4546d"} Mar 13 16:37:34 crc kubenswrapper[4786]: I0313 16:37:34.924468 4786 generic.go:334] "Generic (PLEG): container finished" podID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerID="863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec" exitCode=0 Mar 13 16:37:34 crc kubenswrapper[4786]: I0313 16:37:34.924530 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gffvj" event={"ID":"c16335b4-a7fc-441a-b147-511dc0a49dd2","Type":"ContainerDied","Data":"863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec"} Mar 13 16:37:35 crc kubenswrapper[4786]: I0313 16:37:35.941375 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gffvj" event={"ID":"c16335b4-a7fc-441a-b147-511dc0a49dd2","Type":"ContainerStarted","Data":"c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853"} Mar 13 16:37:35 crc kubenswrapper[4786]: I0313 16:37:35.973390 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gffvj" podStartSLOduration=2.533602763 podStartE2EDuration="4.973372334s" podCreationTimestamp="2026-03-13 16:37:31 +0000 UTC" firstStartedPulling="2026-03-13 16:37:32.90428837 +0000 UTC m=+5683.067500181" lastFinishedPulling="2026-03-13 16:37:35.344057931 +0000 UTC m=+5685.507269752" observedRunningTime="2026-03-13 16:37:35.965675241 +0000 UTC m=+5686.128887092" watchObservedRunningTime="2026-03-13 16:37:35.973372334 +0000 UTC m=+5686.136584155" Mar 13 16:37:41 crc kubenswrapper[4786]: I0313 16:37:41.637887 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:41 crc kubenswrapper[4786]: I0313 16:37:41.638542 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:42 crc kubenswrapper[4786]: I0313 16:37:42.715134 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gffvj" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="registry-server" probeResult="failure" output=< Mar 13 16:37:42 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:37:42 crc kubenswrapper[4786]: > Mar 
13 16:37:51 crc kubenswrapper[4786]: I0313 16:37:51.682723 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:51 crc kubenswrapper[4786]: I0313 16:37:51.742104 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:52 crc kubenswrapper[4786]: I0313 16:37:52.889758 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gffvj"] Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.108020 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gffvj" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="registry-server" containerID="cri-o://c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853" gracePeriod=2 Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.555520 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.703736 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-utilities\") pod \"c16335b4-a7fc-441a-b147-511dc0a49dd2\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.703799 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtsv8\" (UniqueName: \"kubernetes.io/projected/c16335b4-a7fc-441a-b147-511dc0a49dd2-kube-api-access-qtsv8\") pod \"c16335b4-a7fc-441a-b147-511dc0a49dd2\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.703906 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-catalog-content\") pod \"c16335b4-a7fc-441a-b147-511dc0a49dd2\" (UID: \"c16335b4-a7fc-441a-b147-511dc0a49dd2\") " Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.706765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-utilities" (OuterVolumeSpecName: "utilities") pod "c16335b4-a7fc-441a-b147-511dc0a49dd2" (UID: "c16335b4-a7fc-441a-b147-511dc0a49dd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.711079 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c16335b4-a7fc-441a-b147-511dc0a49dd2-kube-api-access-qtsv8" (OuterVolumeSpecName: "kube-api-access-qtsv8") pod "c16335b4-a7fc-441a-b147-511dc0a49dd2" (UID: "c16335b4-a7fc-441a-b147-511dc0a49dd2"). InnerVolumeSpecName "kube-api-access-qtsv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.806980 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.807024 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtsv8\" (UniqueName: \"kubernetes.io/projected/c16335b4-a7fc-441a-b147-511dc0a49dd2-kube-api-access-qtsv8\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.866637 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c16335b4-a7fc-441a-b147-511dc0a49dd2" (UID: "c16335b4-a7fc-441a-b147-511dc0a49dd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:37:53 crc kubenswrapper[4786]: I0313 16:37:53.908451 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c16335b4-a7fc-441a-b147-511dc0a49dd2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.127891 4786 generic.go:334] "Generic (PLEG): container finished" podID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerID="c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853" exitCode=0 Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.127931 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gffvj" event={"ID":"c16335b4-a7fc-441a-b147-511dc0a49dd2","Type":"ContainerDied","Data":"c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853"} Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.127954 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-gffvj" event={"ID":"c16335b4-a7fc-441a-b147-511dc0a49dd2","Type":"ContainerDied","Data":"05c8dbff6233777afc8116ed1c003b0c79798aadb4288da32682ca8ea8c4546d"} Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.127972 4786 scope.go:117] "RemoveContainer" containerID="c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.128032 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gffvj" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.162881 4786 scope.go:117] "RemoveContainer" containerID="863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.173212 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gffvj"] Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.184505 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gffvj"] Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.186625 4786 scope.go:117] "RemoveContainer" containerID="08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.211610 4786 scope.go:117] "RemoveContainer" containerID="c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853" Mar 13 16:37:54 crc kubenswrapper[4786]: E0313 16:37:54.212215 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853\": container with ID starting with c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853 not found: ID does not exist" containerID="c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.212271 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853"} err="failed to get container status \"c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853\": rpc error: code = NotFound desc = could not find container \"c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853\": container with ID starting with c0c97f1260e06b2d21a295e3b1b002683fc8cad47eb1054983f5640fd5557853 not found: ID does not exist" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.212301 4786 scope.go:117] "RemoveContainer" containerID="863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec" Mar 13 16:37:54 crc kubenswrapper[4786]: E0313 16:37:54.212760 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec\": container with ID starting with 863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec not found: ID does not exist" containerID="863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.212804 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec"} err="failed to get container status \"863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec\": rpc error: code = NotFound desc = could not find container \"863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec\": container with ID starting with 863abb36f8b1bf29effa7d1953f208c6206a8d9fcdb3cc3c9c0f3dbcc7578aec not found: ID does not exist" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.212830 4786 scope.go:117] "RemoveContainer" containerID="08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0" Mar 13 16:37:54 crc kubenswrapper[4786]: E0313 
16:37:54.213185 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0\": container with ID starting with 08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0 not found: ID does not exist" containerID="08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.213218 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0"} err="failed to get container status \"08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0\": rpc error: code = NotFound desc = could not find container \"08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0\": container with ID starting with 08170e3dbe237815ea23ab662f2bb0bd0c09046c2c57e1aeb1b73fcb2f4686a0 not found: ID does not exist" Mar 13 16:37:54 crc kubenswrapper[4786]: I0313 16:37:54.568080 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" path="/var/lib/kubelet/pods/c16335b4-a7fc-441a-b147-511dc0a49dd2/volumes" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.022124 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d9c4d76b9-54mn5" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.201505 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556998-r25pl"] Mar 13 16:38:00 crc kubenswrapper[4786]: E0313 16:38:00.201948 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="registry-server" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.201971 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" 
containerName="registry-server" Mar 13 16:38:00 crc kubenswrapper[4786]: E0313 16:38:00.201989 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="extract-content" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.201997 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="extract-content" Mar 13 16:38:00 crc kubenswrapper[4786]: E0313 16:38:00.202012 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="extract-utilities" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.202020 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="extract-utilities" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.202203 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16335b4-a7fc-441a-b147-511dc0a49dd2" containerName="registry-server" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.202846 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.212809 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556998-r25pl"] Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.215233 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.215409 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.215533 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.332249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhfj\" (UniqueName: \"kubernetes.io/projected/75dae8f7-1cd2-4d03-873e-76a806f00915-kube-api-access-2dhfj\") pod \"auto-csr-approver-29556998-r25pl\" (UID: \"75dae8f7-1cd2-4d03-873e-76a806f00915\") " pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.433698 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhfj\" (UniqueName: \"kubernetes.io/projected/75dae8f7-1cd2-4d03-873e-76a806f00915-kube-api-access-2dhfj\") pod \"auto-csr-approver-29556998-r25pl\" (UID: \"75dae8f7-1cd2-4d03-873e-76a806f00915\") " pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.452437 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhfj\" (UniqueName: \"kubernetes.io/projected/75dae8f7-1cd2-4d03-873e-76a806f00915-kube-api-access-2dhfj\") pod \"auto-csr-approver-29556998-r25pl\" (UID: \"75dae8f7-1cd2-4d03-873e-76a806f00915\") " 
pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.534910 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:00 crc kubenswrapper[4786]: I0313 16:38:00.962195 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556998-r25pl"] Mar 13 16:38:01 crc kubenswrapper[4786]: I0313 16:38:01.199898 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556998-r25pl" event={"ID":"75dae8f7-1cd2-4d03-873e-76a806f00915","Type":"ContainerStarted","Data":"44b1154085a73692d37a19a0fc565e7368dbc71488ed6e31f371895e5d09bad4"} Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.222004 4786 generic.go:334] "Generic (PLEG): container finished" podID="75dae8f7-1cd2-4d03-873e-76a806f00915" containerID="9b397e3b818451fe5004ef34c11806826c4e40f95728dcc8d622a36d8f55ecc8" exitCode=0 Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.222109 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556998-r25pl" event={"ID":"75dae8f7-1cd2-4d03-873e-76a806f00915","Type":"ContainerDied","Data":"9b397e3b818451fe5004ef34c11806826c4e40f95728dcc8d622a36d8f55ecc8"} Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.976896 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.978457 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.980764 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-t72lx" Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.980771 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.981586 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 16:38:03 crc kubenswrapper[4786]: I0313 16:38:03.999527 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.037881 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:04 crc kubenswrapper[4786]: E0313 16:38:04.038581 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-zz7xb openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.046115 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.062450 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.064812 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.071419 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.127530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.127675 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.127712 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.127734 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7xb\" (UniqueName: \"kubernetes.io/projected/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-kube-api-access-zz7xb\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229686 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7xb\" (UniqueName: 
\"kubernetes.io/projected/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-kube-api-access-zz7xb\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229719 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229773 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config-secret\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229874 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-596mc\" (UniqueName: \"kubernetes.io/projected/4771f9ed-a556-43de-b0b2-5f91c6b02768-kube-api-access-596mc\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.229956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.230990 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.231001 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: E0313 16:38:04.234210 4786 projected.go:194] Error preparing data for projected volume kube-api-access-zz7xb for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0) does not match the UID in record. 
The object might have been deleted and then recreated Mar 13 16:38:04 crc kubenswrapper[4786]: E0313 16:38:04.234474 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-kube-api-access-zz7xb podName:9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0 nodeName:}" failed. No retries permitted until 2026-03-13 16:38:04.73445246 +0000 UTC m=+5714.897664281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zz7xb" (UniqueName: "kubernetes.io/projected/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-kube-api-access-zz7xb") pod "openstackclient" (UID: "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0) does not match the UID in record. The object might have been deleted and then recreated Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.236203 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.236882 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config-secret\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.241604 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 
16:38:04.301326 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.330891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-596mc\" (UniqueName: \"kubernetes.io/projected/4771f9ed-a556-43de-b0b2-5f91c6b02768-kube-api-access-596mc\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.331042 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config-secret\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.331130 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.331162 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.335033 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc 
kubenswrapper[4786]: I0313 16:38:04.335427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.339749 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config-secret\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.360813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-596mc\" (UniqueName: \"kubernetes.io/projected/4771f9ed-a556-43de-b0b2-5f91c6b02768-kube-api-access-596mc\") pod \"openstackclient\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.384607 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.435610 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-combined-ca-bundle\") pod \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.435701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config-secret\") pod \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.435823 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config\") pod \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\" (UID: \"9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0\") " Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.436241 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz7xb\" (UniqueName: \"kubernetes.io/projected/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-kube-api-access-zz7xb\") on node \"crc\" DevicePath \"\"" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.436689 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" (UID: "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.439298 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" (UID: "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.446035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" (UID: "9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.490544 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.538500 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.538917 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.538933 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.564373 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" path="/var/lib/kubelet/pods/9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0/volumes" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.640334 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dhfj\" (UniqueName: \"kubernetes.io/projected/75dae8f7-1cd2-4d03-873e-76a806f00915-kube-api-access-2dhfj\") pod \"75dae8f7-1cd2-4d03-873e-76a806f00915\" (UID: \"75dae8f7-1cd2-4d03-873e-76a806f00915\") " Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.643935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dae8f7-1cd2-4d03-873e-76a806f00915-kube-api-access-2dhfj" (OuterVolumeSpecName: "kube-api-access-2dhfj") pod "75dae8f7-1cd2-4d03-873e-76a806f00915" (UID: "75dae8f7-1cd2-4d03-873e-76a806f00915"). InnerVolumeSpecName "kube-api-access-2dhfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.742232 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dhfj\" (UniqueName: \"kubernetes.io/projected/75dae8f7-1cd2-4d03-873e-76a806f00915-kube-api-access-2dhfj\") on node \"crc\" DevicePath \"\"" Mar 13 16:38:04 crc kubenswrapper[4786]: W0313 16:38:04.864737 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4771f9ed_a556_43de_b0b2_5f91c6b02768.slice/crio-dcc281c0adfeeab3d7b4c300473c549b9297ddd578a474c32a919c23205faf0d WatchSource:0}: Error finding container dcc281c0adfeeab3d7b4c300473c549b9297ddd578a474c32a919c23205faf0d: Status 404 returned error can't find the container with id dcc281c0adfeeab3d7b4c300473c549b9297ddd578a474c32a919c23205faf0d Mar 13 16:38:04 crc kubenswrapper[4786]: I0313 16:38:04.865802 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.240972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556998-r25pl" event={"ID":"75dae8f7-1cd2-4d03-873e-76a806f00915","Type":"ContainerDied","Data":"44b1154085a73692d37a19a0fc565e7368dbc71488ed6e31f371895e5d09bad4"} Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.242010 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b1154085a73692d37a19a0fc565e7368dbc71488ed6e31f371895e5d09bad4" Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.240993 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556998-r25pl" Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.242952 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.244774 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4771f9ed-a556-43de-b0b2-5f91c6b02768","Type":"ContainerStarted","Data":"73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560"} Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.244819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4771f9ed-a556-43de-b0b2-5f91c6b02768","Type":"ContainerStarted","Data":"dcc281c0adfeeab3d7b4c300473c549b9297ddd578a474c32a919c23205faf0d"} Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.263946 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.263908217 podStartE2EDuration="1.263908217s" podCreationTimestamp="2026-03-13 16:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:38:05.258719137 +0000 UTC m=+5715.421930948" watchObservedRunningTime="2026-03-13 16:38:05.263908217 +0000 UTC m=+5715.427120028" Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.307295 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9d51a4d8-fd27-4e34-848d-0b7c2b94a4f0" podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.578255 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556992-mjb92"] Mar 13 16:38:05 crc kubenswrapper[4786]: I0313 16:38:05.587034 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556992-mjb92"] Mar 13 16:38:06 crc kubenswrapper[4786]: I0313 16:38:06.565614 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f65251d4-c3a2-4aab-8e06-790dfff1ef83" path="/var/lib/kubelet/pods/f65251d4-c3a2-4aab-8e06-790dfff1ef83/volumes" Mar 13 16:38:14 crc kubenswrapper[4786]: I0313 16:38:14.525652 4786 scope.go:117] "RemoveContainer" containerID="7e484f1a72de0c7f13c8b1e84c50a6bd2f41ce9d30245628de9f99a2d2ee8a37" Mar 13 16:39:30 crc kubenswrapper[4786]: E0313 16:39:30.777009 4786 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.12:57110->38.102.83.12:40601: write tcp 38.102.83.12:57110->38.102.83.12:40601: write: broken pipe Mar 13 16:39:37 crc kubenswrapper[4786]: I0313 16:39:37.869017 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:39:37 crc kubenswrapper[4786]: I0313 16:39:37.869544 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.891736 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tdp5t"] Mar 13 16:39:46 crc kubenswrapper[4786]: E0313 16:39:46.892832 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dae8f7-1cd2-4d03-873e-76a806f00915" containerName="oc" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.892873 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dae8f7-1cd2-4d03-873e-76a806f00915" containerName="oc" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.893096 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="75dae8f7-1cd2-4d03-873e-76a806f00915" 
containerName="oc" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.893769 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.899257 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-63da-account-create-update-l2b6z"] Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.900415 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.902510 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.911969 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tdp5t"] Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.921070 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-63da-account-create-update-l2b6z"] Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.988742 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c25059-cab8-4a39-9e13-4cf776e9177b-operator-scripts\") pod \"neutron-db-create-tdp5t\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:46 crc kubenswrapper[4786]: I0313 16:39:46.989021 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxz8\" (UniqueName: \"kubernetes.io/projected/06c25059-cab8-4a39-9e13-4cf776e9177b-kube-api-access-4rxz8\") pod \"neutron-db-create-tdp5t\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.091196 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c25059-cab8-4a39-9e13-4cf776e9177b-operator-scripts\") pod \"neutron-db-create-tdp5t\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.091303 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f53a8f-18fc-493d-99ad-66de081d79ca-operator-scripts\") pod \"neutron-63da-account-create-update-l2b6z\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.091334 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxz8\" (UniqueName: \"kubernetes.io/projected/06c25059-cab8-4a39-9e13-4cf776e9177b-kube-api-access-4rxz8\") pod \"neutron-db-create-tdp5t\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.091379 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-787sj\" (UniqueName: \"kubernetes.io/projected/67f53a8f-18fc-493d-99ad-66de081d79ca-kube-api-access-787sj\") pod \"neutron-63da-account-create-update-l2b6z\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.092577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c25059-cab8-4a39-9e13-4cf776e9177b-operator-scripts\") pod \"neutron-db-create-tdp5t\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 
16:39:47.113985 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxz8\" (UniqueName: \"kubernetes.io/projected/06c25059-cab8-4a39-9e13-4cf776e9177b-kube-api-access-4rxz8\") pod \"neutron-db-create-tdp5t\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.193005 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f53a8f-18fc-493d-99ad-66de081d79ca-operator-scripts\") pod \"neutron-63da-account-create-update-l2b6z\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.193341 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-787sj\" (UniqueName: \"kubernetes.io/projected/67f53a8f-18fc-493d-99ad-66de081d79ca-kube-api-access-787sj\") pod \"neutron-63da-account-create-update-l2b6z\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.193932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f53a8f-18fc-493d-99ad-66de081d79ca-operator-scripts\") pod \"neutron-63da-account-create-update-l2b6z\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.219375 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-787sj\" (UniqueName: \"kubernetes.io/projected/67f53a8f-18fc-493d-99ad-66de081d79ca-kube-api-access-787sj\") pod \"neutron-63da-account-create-update-l2b6z\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " 
pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.221424 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.234313 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.712297 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tdp5t"] Mar 13 16:39:47 crc kubenswrapper[4786]: W0313 16:39:47.718625 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06c25059_cab8_4a39_9e13_4cf776e9177b.slice/crio-0cc405274bf07c5b1f9d9ad9169248946ce505b754191d1e90a9b2eb700282a9 WatchSource:0}: Error finding container 0cc405274bf07c5b1f9d9ad9169248946ce505b754191d1e90a9b2eb700282a9: Status 404 returned error can't find the container with id 0cc405274bf07c5b1f9d9ad9169248946ce505b754191d1e90a9b2eb700282a9 Mar 13 16:39:47 crc kubenswrapper[4786]: I0313 16:39:47.771475 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-63da-account-create-update-l2b6z"] Mar 13 16:39:47 crc kubenswrapper[4786]: W0313 16:39:47.777251 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67f53a8f_18fc_493d_99ad_66de081d79ca.slice/crio-e4d579f5d5b72073889595bb0b6bdddc29b7d49aa09d9f4e870a0d587c8206bc WatchSource:0}: Error finding container e4d579f5d5b72073889595bb0b6bdddc29b7d49aa09d9f4e870a0d587c8206bc: Status 404 returned error can't find the container with id e4d579f5d5b72073889595bb0b6bdddc29b7d49aa09d9f4e870a0d587c8206bc Mar 13 16:39:48 crc kubenswrapper[4786]: I0313 16:39:48.244563 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="67f53a8f-18fc-493d-99ad-66de081d79ca" containerID="9f6d6f94fbc1513d679e0254b54d529ececcf48eb2b324723474242458ee2a6b" exitCode=0 Mar 13 16:39:48 crc kubenswrapper[4786]: I0313 16:39:48.244716 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-63da-account-create-update-l2b6z" event={"ID":"67f53a8f-18fc-493d-99ad-66de081d79ca","Type":"ContainerDied","Data":"9f6d6f94fbc1513d679e0254b54d529ececcf48eb2b324723474242458ee2a6b"} Mar 13 16:39:48 crc kubenswrapper[4786]: I0313 16:39:48.244970 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-63da-account-create-update-l2b6z" event={"ID":"67f53a8f-18fc-493d-99ad-66de081d79ca","Type":"ContainerStarted","Data":"e4d579f5d5b72073889595bb0b6bdddc29b7d49aa09d9f4e870a0d587c8206bc"} Mar 13 16:39:48 crc kubenswrapper[4786]: I0313 16:39:48.248265 4786 generic.go:334] "Generic (PLEG): container finished" podID="06c25059-cab8-4a39-9e13-4cf776e9177b" containerID="0dc6495f7cca92af457f8bb053b6071890b93a8e43773c4e9177ba4e3ad02528" exitCode=0 Mar 13 16:39:48 crc kubenswrapper[4786]: I0313 16:39:48.248451 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tdp5t" event={"ID":"06c25059-cab8-4a39-9e13-4cf776e9177b","Type":"ContainerDied","Data":"0dc6495f7cca92af457f8bb053b6071890b93a8e43773c4e9177ba4e3ad02528"} Mar 13 16:39:48 crc kubenswrapper[4786]: I0313 16:39:48.248527 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tdp5t" event={"ID":"06c25059-cab8-4a39-9e13-4cf776e9177b","Type":"ContainerStarted","Data":"0cc405274bf07c5b1f9d9ad9169248946ce505b754191d1e90a9b2eb700282a9"} Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.776044 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.782648 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.948274 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-787sj\" (UniqueName: \"kubernetes.io/projected/67f53a8f-18fc-493d-99ad-66de081d79ca-kube-api-access-787sj\") pod \"67f53a8f-18fc-493d-99ad-66de081d79ca\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.948364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c25059-cab8-4a39-9e13-4cf776e9177b-operator-scripts\") pod \"06c25059-cab8-4a39-9e13-4cf776e9177b\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.948409 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f53a8f-18fc-493d-99ad-66de081d79ca-operator-scripts\") pod \"67f53a8f-18fc-493d-99ad-66de081d79ca\" (UID: \"67f53a8f-18fc-493d-99ad-66de081d79ca\") " Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.948536 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxz8\" (UniqueName: \"kubernetes.io/projected/06c25059-cab8-4a39-9e13-4cf776e9177b-kube-api-access-4rxz8\") pod \"06c25059-cab8-4a39-9e13-4cf776e9177b\" (UID: \"06c25059-cab8-4a39-9e13-4cf776e9177b\") " Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.949269 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06c25059-cab8-4a39-9e13-4cf776e9177b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06c25059-cab8-4a39-9e13-4cf776e9177b" (UID: "06c25059-cab8-4a39-9e13-4cf776e9177b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.949487 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f53a8f-18fc-493d-99ad-66de081d79ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67f53a8f-18fc-493d-99ad-66de081d79ca" (UID: "67f53a8f-18fc-493d-99ad-66de081d79ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.956140 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f53a8f-18fc-493d-99ad-66de081d79ca-kube-api-access-787sj" (OuterVolumeSpecName: "kube-api-access-787sj") pod "67f53a8f-18fc-493d-99ad-66de081d79ca" (UID: "67f53a8f-18fc-493d-99ad-66de081d79ca"). InnerVolumeSpecName "kube-api-access-787sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:39:49 crc kubenswrapper[4786]: I0313 16:39:49.956932 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c25059-cab8-4a39-9e13-4cf776e9177b-kube-api-access-4rxz8" (OuterVolumeSpecName: "kube-api-access-4rxz8") pod "06c25059-cab8-4a39-9e13-4cf776e9177b" (UID: "06c25059-cab8-4a39-9e13-4cf776e9177b"). InnerVolumeSpecName "kube-api-access-4rxz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.050805 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rxz8\" (UniqueName: \"kubernetes.io/projected/06c25059-cab8-4a39-9e13-4cf776e9177b-kube-api-access-4rxz8\") on node \"crc\" DevicePath \"\"" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.050975 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-787sj\" (UniqueName: \"kubernetes.io/projected/67f53a8f-18fc-493d-99ad-66de081d79ca-kube-api-access-787sj\") on node \"crc\" DevicePath \"\"" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.051012 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c25059-cab8-4a39-9e13-4cf776e9177b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.051039 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f53a8f-18fc-493d-99ad-66de081d79ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.279533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-63da-account-create-update-l2b6z" event={"ID":"67f53a8f-18fc-493d-99ad-66de081d79ca","Type":"ContainerDied","Data":"e4d579f5d5b72073889595bb0b6bdddc29b7d49aa09d9f4e870a0d587c8206bc"} Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.279791 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d579f5d5b72073889595bb0b6bdddc29b7d49aa09d9f4e870a0d587c8206bc" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.279615 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-63da-account-create-update-l2b6z" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.281077 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tdp5t" event={"ID":"06c25059-cab8-4a39-9e13-4cf776e9177b","Type":"ContainerDied","Data":"0cc405274bf07c5b1f9d9ad9169248946ce505b754191d1e90a9b2eb700282a9"} Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.281134 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cc405274bf07c5b1f9d9ad9169248946ce505b754191d1e90a9b2eb700282a9" Mar 13 16:39:50 crc kubenswrapper[4786]: I0313 16:39:50.281101 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tdp5t" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.246786 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bqxwn"] Mar 13 16:39:52 crc kubenswrapper[4786]: E0313 16:39:52.247359 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f53a8f-18fc-493d-99ad-66de081d79ca" containerName="mariadb-account-create-update" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.247371 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f53a8f-18fc-493d-99ad-66de081d79ca" containerName="mariadb-account-create-update" Mar 13 16:39:52 crc kubenswrapper[4786]: E0313 16:39:52.247389 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c25059-cab8-4a39-9e13-4cf776e9177b" containerName="mariadb-database-create" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.247395 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c25059-cab8-4a39-9e13-4cf776e9177b" containerName="mariadb-database-create" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.247553 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f53a8f-18fc-493d-99ad-66de081d79ca" 
containerName="mariadb-account-create-update" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.247564 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c25059-cab8-4a39-9e13-4cf776e9177b" containerName="mariadb-database-create" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.248054 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.251264 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.252160 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.252387 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k7f72" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.269667 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bqxwn"] Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.300114 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-combined-ca-bundle\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.300203 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kf52\" (UniqueName: \"kubernetes.io/projected/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-kube-api-access-5kf52\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.300223 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-config\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.401156 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-config\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.401272 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-combined-ca-bundle\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.401340 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kf52\" (UniqueName: \"kubernetes.io/projected/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-kube-api-access-5kf52\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.413642 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-combined-ca-bundle\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.414394 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-config\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.415733 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kf52\" (UniqueName: \"kubernetes.io/projected/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-kube-api-access-5kf52\") pod \"neutron-db-sync-bqxwn\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:52 crc kubenswrapper[4786]: I0313 16:39:52.566977 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:39:53 crc kubenswrapper[4786]: I0313 16:39:53.038854 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bqxwn"] Mar 13 16:39:53 crc kubenswrapper[4786]: I0313 16:39:53.318080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bqxwn" event={"ID":"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d","Type":"ContainerStarted","Data":"76700b42cdfa8cbc032b073013f4c5584eeaccc5adbe7b096b47eb5e69a5528e"} Mar 13 16:39:54 crc kubenswrapper[4786]: I0313 16:39:54.330019 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bqxwn" event={"ID":"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d","Type":"ContainerStarted","Data":"65715ca4bd8466cd622ab3526dc772011016a62406ce4f7f6492d2ff0262a82a"} Mar 13 16:39:54 crc kubenswrapper[4786]: I0313 16:39:54.352230 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bqxwn" podStartSLOduration=2.35220793 podStartE2EDuration="2.35220793s" podCreationTimestamp="2026-03-13 16:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:39:54.342755432 +0000 UTC m=+5824.505967303" 
watchObservedRunningTime="2026-03-13 16:39:54.35220793 +0000 UTC m=+5824.515419751" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.142256 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557000-q87vh"] Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.144531 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.147374 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.147767 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.147941 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.150790 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557000-q87vh"] Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.175473 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k6n2\" (UniqueName: \"kubernetes.io/projected/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3-kube-api-access-9k6n2\") pod \"auto-csr-approver-29557000-q87vh\" (UID: \"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3\") " pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.276900 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k6n2\" (UniqueName: \"kubernetes.io/projected/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3-kube-api-access-9k6n2\") pod \"auto-csr-approver-29557000-q87vh\" (UID: \"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3\") " 
pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.303302 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k6n2\" (UniqueName: \"kubernetes.io/projected/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3-kube-api-access-9k6n2\") pod \"auto-csr-approver-29557000-q87vh\" (UID: \"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3\") " pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.467837 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.944412 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557000-q87vh"] Mar 13 16:40:00 crc kubenswrapper[4786]: I0313 16:40:00.953102 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 16:40:01 crc kubenswrapper[4786]: I0313 16:40:01.398953 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557000-q87vh" event={"ID":"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3","Type":"ContainerStarted","Data":"a6560a7ae61805d651470cbeb053a86c34395be1ff057c17a526b9baf87bb15e"} Mar 13 16:40:03 crc kubenswrapper[4786]: I0313 16:40:03.417306 4786 generic.go:334] "Generic (PLEG): container finished" podID="6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3" containerID="eb7386a704c52262879074bdbc92fe97e0437efc339d35aada03b0b5b97614f8" exitCode=0 Mar 13 16:40:03 crc kubenswrapper[4786]: I0313 16:40:03.417366 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557000-q87vh" event={"ID":"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3","Type":"ContainerDied","Data":"eb7386a704c52262879074bdbc92fe97e0437efc339d35aada03b0b5b97614f8"} Mar 13 16:40:04 crc kubenswrapper[4786]: I0313 16:40:04.431177 4786 generic.go:334] "Generic (PLEG): 
container finished" podID="e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" containerID="65715ca4bd8466cd622ab3526dc772011016a62406ce4f7f6492d2ff0262a82a" exitCode=0 Mar 13 16:40:04 crc kubenswrapper[4786]: I0313 16:40:04.431293 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bqxwn" event={"ID":"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d","Type":"ContainerDied","Data":"65715ca4bd8466cd622ab3526dc772011016a62406ce4f7f6492d2ff0262a82a"} Mar 13 16:40:04 crc kubenswrapper[4786]: I0313 16:40:04.933063 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:04 crc kubenswrapper[4786]: I0313 16:40:04.978164 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k6n2\" (UniqueName: \"kubernetes.io/projected/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3-kube-api-access-9k6n2\") pod \"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3\" (UID: \"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3\") " Mar 13 16:40:04 crc kubenswrapper[4786]: I0313 16:40:04.986101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3-kube-api-access-9k6n2" (OuterVolumeSpecName: "kube-api-access-9k6n2") pod "6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3" (UID: "6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3"). InnerVolumeSpecName "kube-api-access-9k6n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.081475 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k6n2\" (UniqueName: \"kubernetes.io/projected/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3-kube-api-access-9k6n2\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.447144 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557000-q87vh" event={"ID":"6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3","Type":"ContainerDied","Data":"a6560a7ae61805d651470cbeb053a86c34395be1ff057c17a526b9baf87bb15e"} Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.447221 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6560a7ae61805d651470cbeb053a86c34395be1ff057c17a526b9baf87bb15e" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.447206 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557000-q87vh" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.813794 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.896717 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-combined-ca-bundle\") pod \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.896884 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-config\") pod \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.897013 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kf52\" (UniqueName: \"kubernetes.io/projected/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-kube-api-access-5kf52\") pod \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\" (UID: \"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d\") " Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.904244 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-kube-api-access-5kf52" (OuterVolumeSpecName: "kube-api-access-5kf52") pod "e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" (UID: "e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d"). InnerVolumeSpecName "kube-api-access-5kf52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.942166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-config" (OuterVolumeSpecName: "config") pod "e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" (UID: "e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:05 crc kubenswrapper[4786]: I0313 16:40:05.942672 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" (UID: "e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.000210 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.000263 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.000282 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kf52\" (UniqueName: \"kubernetes.io/projected/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d-kube-api-access-5kf52\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.031241 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556994-nwtzb"] Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.043660 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556994-nwtzb"] Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.460913 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bqxwn" event={"ID":"e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d","Type":"ContainerDied","Data":"76700b42cdfa8cbc032b073013f4c5584eeaccc5adbe7b096b47eb5e69a5528e"} Mar 13 16:40:06 crc kubenswrapper[4786]: 
I0313 16:40:06.461337 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76700b42cdfa8cbc032b073013f4c5584eeaccc5adbe7b096b47eb5e69a5528e" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.461010 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bqxwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.578721 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad766d3-3c99-4eaa-a277-640201cf3c94" path="/var/lib/kubelet/pods/9ad766d3-3c99-4eaa-a277-640201cf3c94/volumes" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.690968 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dddd86fbf-vjdp4"] Mar 13 16:40:06 crc kubenswrapper[4786]: E0313 16:40:06.691814 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" containerName="neutron-db-sync" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.691835 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" containerName="neutron-db-sync" Mar 13 16:40:06 crc kubenswrapper[4786]: E0313 16:40:06.691899 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3" containerName="oc" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.691906 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3" containerName="oc" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.692284 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" containerName="neutron-db-sync" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.692317 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3" containerName="oc" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.693923 4786 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.711333 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-sb\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.712437 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-nb\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.712540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tsj\" (UniqueName: \"kubernetes.io/projected/34ae2a6c-c21f-475d-950c-e23b1f2722ce-kube-api-access-v6tsj\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.712586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-config\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.712692 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-dns-svc\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.725054 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54b6c58848-h4dwn"] Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.742232 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dddd86fbf-vjdp4"] Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.742337 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.745192 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.745466 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.745594 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k7f72" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.748767 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.753379 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54b6c58848-h4dwn"] Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814273 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-nb\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814334 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-combined-ca-bundle\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814380 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-config\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814401 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tsj\" (UniqueName: \"kubernetes.io/projected/34ae2a6c-c21f-475d-950c-e23b1f2722ce-kube-api-access-v6tsj\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814422 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmk5b\" (UniqueName: \"kubernetes.io/projected/bd883d40-6b86-4af6-a1e9-165bed4db882-kube-api-access-bmk5b\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-ovndb-tls-certs\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814462 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-config\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814488 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-httpd-config\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814515 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-dns-svc\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.814563 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-sb\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.815596 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-sb\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.815754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-config\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.816054 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-nb\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.816171 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-dns-svc\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:06 crc kubenswrapper[4786]: I0313 16:40:06.829522 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tsj\" (UniqueName: \"kubernetes.io/projected/34ae2a6c-c21f-475d-950c-e23b1f2722ce-kube-api-access-v6tsj\") pod \"dnsmasq-dns-6dddd86fbf-vjdp4\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.500503 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.503260 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-combined-ca-bundle\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.503315 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-config\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.503341 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmk5b\" (UniqueName: \"kubernetes.io/projected/bd883d40-6b86-4af6-a1e9-165bed4db882-kube-api-access-bmk5b\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.503362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-ovndb-tls-certs\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.503394 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-httpd-config\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc 
kubenswrapper[4786]: I0313 16:40:07.508655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-httpd-config\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.510265 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-config\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.524462 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-combined-ca-bundle\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.533516 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmk5b\" (UniqueName: \"kubernetes.io/projected/bd883d40-6b86-4af6-a1e9-165bed4db882-kube-api-access-bmk5b\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.537822 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-ovndb-tls-certs\") pod \"neutron-54b6c58848-h4dwn\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.661456 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.869134 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:40:07 crc kubenswrapper[4786]: I0313 16:40:07.869570 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.015897 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dddd86fbf-vjdp4"] Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.224018 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54b6c58848-h4dwn"] Mar 13 16:40:08 crc kubenswrapper[4786]: W0313 16:40:08.225239 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd883d40_6b86_4af6_a1e9_165bed4db882.slice/crio-ce880a2ad88978b33bdbbd81453ea191f3fbadbb29c618a44041ad56b3076f23 WatchSource:0}: Error finding container ce880a2ad88978b33bdbbd81453ea191f3fbadbb29c618a44041ad56b3076f23: Status 404 returned error can't find the container with id ce880a2ad88978b33bdbbd81453ea191f3fbadbb29c618a44041ad56b3076f23 Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.559602 4786 generic.go:334] "Generic (PLEG): container finished" podID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerID="cebef6588e079c2814e8e2dbabd7cd153356e73613eb16e3c80e85842b0f4cd1" exitCode=0 Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.565710 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" event={"ID":"34ae2a6c-c21f-475d-950c-e23b1f2722ce","Type":"ContainerDied","Data":"cebef6588e079c2814e8e2dbabd7cd153356e73613eb16e3c80e85842b0f4cd1"} Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.565761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" event={"ID":"34ae2a6c-c21f-475d-950c-e23b1f2722ce","Type":"ContainerStarted","Data":"8743a719fae08f25cf3aa33ca3f52a1431e570ca662603dd1ed75af2ea0e1e86"} Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.566170 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54b6c58848-h4dwn" event={"ID":"bd883d40-6b86-4af6-a1e9-165bed4db882","Type":"ContainerStarted","Data":"75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0"} Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.566255 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.566306 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54b6c58848-h4dwn" event={"ID":"bd883d40-6b86-4af6-a1e9-165bed4db882","Type":"ContainerStarted","Data":"331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4"} Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.566321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54b6c58848-h4dwn" event={"ID":"bd883d40-6b86-4af6-a1e9-165bed4db882","Type":"ContainerStarted","Data":"ce880a2ad88978b33bdbbd81453ea191f3fbadbb29c618a44041ad56b3076f23"} Mar 13 16:40:08 crc kubenswrapper[4786]: I0313 16:40:08.617420 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54b6c58848-h4dwn" podStartSLOduration=2.6174006199999997 podStartE2EDuration="2.61740062s" podCreationTimestamp="2026-03-13 16:40:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:08.611165913 +0000 UTC m=+5838.774377734" watchObservedRunningTime="2026-03-13 16:40:08.61740062 +0000 UTC m=+5838.780612431" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.429377 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f5c6db85-hdk8q"] Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.431428 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.433740 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.433995 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.447268 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f5c6db85-hdk8q"] Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537032 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-internal-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537103 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-public-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537147 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwxd\" (UniqueName: \"kubernetes.io/projected/d57b6476-92e9-4c2c-8577-6627254ae198-kube-api-access-wpwxd\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-combined-ca-bundle\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-httpd-config\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537286 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-ovndb-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.537336 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-config\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.578468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" event={"ID":"34ae2a6c-c21f-475d-950c-e23b1f2722ce","Type":"ContainerStarted","Data":"3a37ee914c9f7f113fdfb17dc05b287ee3134aa606c950fffa6a7e2aa0172409"} Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.578526 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.600046 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" podStartSLOduration=3.6000281469999997 podStartE2EDuration="3.600028147s" podCreationTimestamp="2026-03-13 16:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:09.597495823 +0000 UTC m=+5839.760707624" watchObservedRunningTime="2026-03-13 16:40:09.600028147 +0000 UTC m=+5839.763239958" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-internal-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639291 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-public-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639350 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwxd\" (UniqueName: \"kubernetes.io/projected/d57b6476-92e9-4c2c-8577-6627254ae198-kube-api-access-wpwxd\") pod 
\"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639430 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-combined-ca-bundle\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639487 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-httpd-config\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639559 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-ovndb-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.639621 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-config\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.645550 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-public-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc 
kubenswrapper[4786]: I0313 16:40:09.645555 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-httpd-config\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.646821 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-internal-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.647082 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-config\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.647370 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-ovndb-tls-certs\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.652399 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57b6476-92e9-4c2c-8577-6627254ae198-combined-ca-bundle\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.672122 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwxd\" (UniqueName: 
\"kubernetes.io/projected/d57b6476-92e9-4c2c-8577-6627254ae198-kube-api-access-wpwxd\") pod \"neutron-f5c6db85-hdk8q\" (UID: \"d57b6476-92e9-4c2c-8577-6627254ae198\") " pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:09 crc kubenswrapper[4786]: I0313 16:40:09.748892 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:10 crc kubenswrapper[4786]: I0313 16:40:10.296974 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f5c6db85-hdk8q"] Mar 13 16:40:10 crc kubenswrapper[4786]: W0313 16:40:10.297978 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd57b6476_92e9_4c2c_8577_6627254ae198.slice/crio-c15e0270640817dbc1989a261a113f8d161f93691614a44f4eb19e07fa33fb54 WatchSource:0}: Error finding container c15e0270640817dbc1989a261a113f8d161f93691614a44f4eb19e07fa33fb54: Status 404 returned error can't find the container with id c15e0270640817dbc1989a261a113f8d161f93691614a44f4eb19e07fa33fb54 Mar 13 16:40:10 crc kubenswrapper[4786]: I0313 16:40:10.587035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5c6db85-hdk8q" event={"ID":"d57b6476-92e9-4c2c-8577-6627254ae198","Type":"ContainerStarted","Data":"212d8f73fbbcfa430f324079b1a6dae41296f9bccc8b605fcb8c89944efff5e4"} Mar 13 16:40:10 crc kubenswrapper[4786]: I0313 16:40:10.587277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5c6db85-hdk8q" event={"ID":"d57b6476-92e9-4c2c-8577-6627254ae198","Type":"ContainerStarted","Data":"c15e0270640817dbc1989a261a113f8d161f93691614a44f4eb19e07fa33fb54"} Mar 13 16:40:11 crc kubenswrapper[4786]: I0313 16:40:11.596542 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f5c6db85-hdk8q" 
event={"ID":"d57b6476-92e9-4c2c-8577-6627254ae198","Type":"ContainerStarted","Data":"7c02dff497754468774881ca9be6c27ba8d82ea6814fb90f4a239d6a2c548cfa"} Mar 13 16:40:11 crc kubenswrapper[4786]: I0313 16:40:11.632566 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f5c6db85-hdk8q" podStartSLOduration=2.632546687 podStartE2EDuration="2.632546687s" podCreationTimestamp="2026-03-13 16:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:11.621475408 +0000 UTC m=+5841.784687229" watchObservedRunningTime="2026-03-13 16:40:11.632546687 +0000 UTC m=+5841.795758508" Mar 13 16:40:12 crc kubenswrapper[4786]: I0313 16:40:12.610672 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:14 crc kubenswrapper[4786]: I0313 16:40:14.720773 4786 scope.go:117] "RemoveContainer" containerID="869bbac45baba2516d13326299b71472b7e835267e0871603b9baf630008ea2a" Mar 13 16:40:17 crc kubenswrapper[4786]: I0313 16:40:17.504166 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:40:17 crc kubenswrapper[4786]: I0313 16:40:17.582509 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f78d9d695-wqfkf"] Mar 13 16:40:17 crc kubenswrapper[4786]: I0313 16:40:17.582972 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerName="dnsmasq-dns" containerID="cri-o://375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567" gracePeriod=10 Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.126276 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.298042 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-sb\") pod \"c22e3434-1abb-49b3-916d-0d36fa09b794\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.298129 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-dns-svc\") pod \"c22e3434-1abb-49b3-916d-0d36fa09b794\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.298185 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-config\") pod \"c22e3434-1abb-49b3-916d-0d36fa09b794\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.298332 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n4jf\" (UniqueName: \"kubernetes.io/projected/c22e3434-1abb-49b3-916d-0d36fa09b794-kube-api-access-7n4jf\") pod \"c22e3434-1abb-49b3-916d-0d36fa09b794\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.298367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-nb\") pod \"c22e3434-1abb-49b3-916d-0d36fa09b794\" (UID: \"c22e3434-1abb-49b3-916d-0d36fa09b794\") " Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.307728 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c22e3434-1abb-49b3-916d-0d36fa09b794-kube-api-access-7n4jf" (OuterVolumeSpecName: "kube-api-access-7n4jf") pod "c22e3434-1abb-49b3-916d-0d36fa09b794" (UID: "c22e3434-1abb-49b3-916d-0d36fa09b794"). InnerVolumeSpecName "kube-api-access-7n4jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.339125 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c22e3434-1abb-49b3-916d-0d36fa09b794" (UID: "c22e3434-1abb-49b3-916d-0d36fa09b794"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.348229 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-config" (OuterVolumeSpecName: "config") pod "c22e3434-1abb-49b3-916d-0d36fa09b794" (UID: "c22e3434-1abb-49b3-916d-0d36fa09b794"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.356098 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c22e3434-1abb-49b3-916d-0d36fa09b794" (UID: "c22e3434-1abb-49b3-916d-0d36fa09b794"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.361146 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c22e3434-1abb-49b3-916d-0d36fa09b794" (UID: "c22e3434-1abb-49b3-916d-0d36fa09b794"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.400036 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n4jf\" (UniqueName: \"kubernetes.io/projected/c22e3434-1abb-49b3-916d-0d36fa09b794-kube-api-access-7n4jf\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.400221 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.400312 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.400388 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.400510 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c22e3434-1abb-49b3-916d-0d36fa09b794-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.677623 4786 generic.go:334] "Generic (PLEG): container finished" podID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerID="375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567" exitCode=0 Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.677714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" event={"ID":"c22e3434-1abb-49b3-916d-0d36fa09b794","Type":"ContainerDied","Data":"375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567"} Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 
16:40:18.677733 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.677795 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f78d9d695-wqfkf" event={"ID":"c22e3434-1abb-49b3-916d-0d36fa09b794","Type":"ContainerDied","Data":"21f9f6d85aee3e1133d85cd6982e48cb968cf85f02773f6b4fc69bfe18dc3ae7"} Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.677840 4786 scope.go:117] "RemoveContainer" containerID="375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.728249 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f78d9d695-wqfkf"] Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.737881 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f78d9d695-wqfkf"] Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.777000 4786 scope.go:117] "RemoveContainer" containerID="a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.823374 4786 scope.go:117] "RemoveContainer" containerID="375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567" Mar 13 16:40:18 crc kubenswrapper[4786]: E0313 16:40:18.825458 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567\": container with ID starting with 375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567 not found: ID does not exist" containerID="375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.825520 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567"} err="failed to get container status \"375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567\": rpc error: code = NotFound desc = could not find container \"375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567\": container with ID starting with 375eeb2181f58c45ff1caa4c4f6c01c105efebea2891524e4be991f6268e0567 not found: ID does not exist" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.825556 4786 scope.go:117] "RemoveContainer" containerID="a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98" Mar 13 16:40:18 crc kubenswrapper[4786]: E0313 16:40:18.826222 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98\": container with ID starting with a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98 not found: ID does not exist" containerID="a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98" Mar 13 16:40:18 crc kubenswrapper[4786]: I0313 16:40:18.826255 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98"} err="failed to get container status \"a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98\": rpc error: code = NotFound desc = could not find container \"a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98\": container with ID starting with a897b99d618f7b293aeb65ca1c3bf5e5e264ac4fc85ccff8b6c54cb33ea4fa98 not found: ID does not exist" Mar 13 16:40:20 crc kubenswrapper[4786]: I0313 16:40:20.568942 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" path="/var/lib/kubelet/pods/c22e3434-1abb-49b3-916d-0d36fa09b794/volumes" Mar 13 16:40:31 crc kubenswrapper[4786]: I0313 
16:40:31.085186 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5b9pv"] Mar 13 16:40:31 crc kubenswrapper[4786]: I0313 16:40:31.097174 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5b9pv"] Mar 13 16:40:32 crc kubenswrapper[4786]: I0313 16:40:32.570500 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3935245-a5fb-49b8-8b34-d474d92ba6fe" path="/var/lib/kubelet/pods/d3935245-a5fb-49b8-8b34-d474d92ba6fe/volumes" Mar 13 16:40:37 crc kubenswrapper[4786]: I0313 16:40:37.677633 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:37 crc kubenswrapper[4786]: I0313 16:40:37.868967 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:40:37 crc kubenswrapper[4786]: I0313 16:40:37.869595 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:40:37 crc kubenswrapper[4786]: I0313 16:40:37.869730 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:40:37 crc kubenswrapper[4786]: I0313 16:40:37.870435 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06f63c372c8088421f902d282e4d63c4055021fcedd22c0ec607e784bab36af6"} 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:40:37 crc kubenswrapper[4786]: I0313 16:40:37.870584 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://06f63c372c8088421f902d282e4d63c4055021fcedd22c0ec607e784bab36af6" gracePeriod=600 Mar 13 16:40:38 crc kubenswrapper[4786]: I0313 16:40:38.919816 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="06f63c372c8088421f902d282e4d63c4055021fcedd22c0ec607e784bab36af6" exitCode=0 Mar 13 16:40:38 crc kubenswrapper[4786]: I0313 16:40:38.919910 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"06f63c372c8088421f902d282e4d63c4055021fcedd22c0ec607e784bab36af6"} Mar 13 16:40:38 crc kubenswrapper[4786]: I0313 16:40:38.920313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"} Mar 13 16:40:38 crc kubenswrapper[4786]: I0313 16:40:38.920366 4786 scope.go:117] "RemoveContainer" containerID="3008d936c1f7910cfe6429aabfe620e4d8b123b092e07a4e117c97a6dfd4fade" Mar 13 16:40:39 crc kubenswrapper[4786]: I0313 16:40:39.764310 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f5c6db85-hdk8q" Mar 13 16:40:39 crc kubenswrapper[4786]: I0313 16:40:39.863825 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54b6c58848-h4dwn"] 
Mar 13 16:40:39 crc kubenswrapper[4786]: I0313 16:40:39.864391 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54b6c58848-h4dwn" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-api" containerID="cri-o://331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4" gracePeriod=30 Mar 13 16:40:39 crc kubenswrapper[4786]: I0313 16:40:39.864511 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54b6c58848-h4dwn" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-httpd" containerID="cri-o://75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0" gracePeriod=30 Mar 13 16:40:40 crc kubenswrapper[4786]: I0313 16:40:40.950343 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerID="75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0" exitCode=0 Mar 13 16:40:40 crc kubenswrapper[4786]: I0313 16:40:40.950440 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54b6c58848-h4dwn" event={"ID":"bd883d40-6b86-4af6-a1e9-165bed4db882","Type":"ContainerDied","Data":"75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0"} Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.691123 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.843207 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-ovndb-tls-certs\") pod \"bd883d40-6b86-4af6-a1e9-165bed4db882\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.843257 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmk5b\" (UniqueName: \"kubernetes.io/projected/bd883d40-6b86-4af6-a1e9-165bed4db882-kube-api-access-bmk5b\") pod \"bd883d40-6b86-4af6-a1e9-165bed4db882\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.843304 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-httpd-config\") pod \"bd883d40-6b86-4af6-a1e9-165bed4db882\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.843343 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-combined-ca-bundle\") pod \"bd883d40-6b86-4af6-a1e9-165bed4db882\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.843422 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-config\") pod \"bd883d40-6b86-4af6-a1e9-165bed4db882\" (UID: \"bd883d40-6b86-4af6-a1e9-165bed4db882\") " Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.851822 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bd883d40-6b86-4af6-a1e9-165bed4db882" (UID: "bd883d40-6b86-4af6-a1e9-165bed4db882"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.862956 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd883d40-6b86-4af6-a1e9-165bed4db882-kube-api-access-bmk5b" (OuterVolumeSpecName: "kube-api-access-bmk5b") pod "bd883d40-6b86-4af6-a1e9-165bed4db882" (UID: "bd883d40-6b86-4af6-a1e9-165bed4db882"). InnerVolumeSpecName "kube-api-access-bmk5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.894136 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-config" (OuterVolumeSpecName: "config") pod "bd883d40-6b86-4af6-a1e9-165bed4db882" (UID: "bd883d40-6b86-4af6-a1e9-165bed4db882"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.932655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd883d40-6b86-4af6-a1e9-165bed4db882" (UID: "bd883d40-6b86-4af6-a1e9-165bed4db882"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.945793 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.945827 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmk5b\" (UniqueName: \"kubernetes.io/projected/bd883d40-6b86-4af6-a1e9-165bed4db882-kube-api-access-bmk5b\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.945845 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.945879 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.951158 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bd883d40-6b86-4af6-a1e9-165bed4db882" (UID: "bd883d40-6b86-4af6-a1e9-165bed4db882"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.982729 4786 generic.go:334] "Generic (PLEG): container finished" podID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerID="331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4" exitCode=0 Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.982781 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54b6c58848-h4dwn" event={"ID":"bd883d40-6b86-4af6-a1e9-165bed4db882","Type":"ContainerDied","Data":"331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4"} Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.982816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54b6c58848-h4dwn" event={"ID":"bd883d40-6b86-4af6-a1e9-165bed4db882","Type":"ContainerDied","Data":"ce880a2ad88978b33bdbbd81453ea191f3fbadbb29c618a44041ad56b3076f23"} Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.982804 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54b6c58848-h4dwn" Mar 13 16:40:43 crc kubenswrapper[4786]: I0313 16:40:43.982830 4786 scope.go:117] "RemoveContainer" containerID="75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0" Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.015157 4786 scope.go:117] "RemoveContainer" containerID="331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4" Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.021225 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54b6c58848-h4dwn"] Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.030221 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-54b6c58848-h4dwn"] Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.037704 4786 scope.go:117] "RemoveContainer" containerID="75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0" Mar 13 16:40:44 crc kubenswrapper[4786]: E0313 16:40:44.038362 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0\": container with ID starting with 75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0 not found: ID does not exist" containerID="75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0" Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.038405 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0"} err="failed to get container status \"75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0\": rpc error: code = NotFound desc = could not find container \"75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0\": container with ID starting with 75b8d92af217ca463e836b6c91a19f36c9b05e865a0831d08cba5e0e92978bc0 not found: ID does not exist" Mar 13 16:40:44 
crc kubenswrapper[4786]: I0313 16:40:44.038433 4786 scope.go:117] "RemoveContainer" containerID="331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4" Mar 13 16:40:44 crc kubenswrapper[4786]: E0313 16:40:44.038812 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4\": container with ID starting with 331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4 not found: ID does not exist" containerID="331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4" Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.038871 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4"} err="failed to get container status \"331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4\": rpc error: code = NotFound desc = could not find container \"331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4\": container with ID starting with 331c674fcb2ebdbabdbc27fe8048549bd90b2681ad06553d8d80906446d50fc4 not found: ID does not exist" Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.047363 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd883d40-6b86-4af6-a1e9-165bed4db882-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:44 crc kubenswrapper[4786]: E0313 16:40:44.078699 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd883d40_6b86_4af6_a1e9_165bed4db882.slice\": RecentStats: unable to find data in memory cache]" Mar 13 16:40:44 crc kubenswrapper[4786]: I0313 16:40:44.568983 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" path="/var/lib/kubelet/pods/bd883d40-6b86-4af6-a1e9-165bed4db882/volumes" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.291592 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nh76t"] Mar 13 16:40:52 crc kubenswrapper[4786]: E0313 16:40:52.292414 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerName="init" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292428 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerName="init" Mar 13 16:40:52 crc kubenswrapper[4786]: E0313 16:40:52.292443 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerName="dnsmasq-dns" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292449 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" containerName="dnsmasq-dns" Mar 13 16:40:52 crc kubenswrapper[4786]: E0313 16:40:52.292472 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-api" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292486 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-api" Mar 13 16:40:52 crc kubenswrapper[4786]: E0313 16:40:52.292500 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-httpd" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292505 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-httpd" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292653 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22e3434-1abb-49b3-916d-0d36fa09b794" 
containerName="dnsmasq-dns" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292668 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-httpd" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.292687 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd883d40-6b86-4af6-a1e9-165bed4db882" containerName="neutron-api" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.293250 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.295574 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.295808 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.296026 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.296150 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.296193 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2nbfs" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.307121 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nh76t"] Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.364225 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bc846dbf-cslfh"] Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.365746 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.401945 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402005 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-dns-svc\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbn7r\" (UniqueName: \"kubernetes.io/projected/d1cf7d67-e486-4be5-b1be-9a6465598de1-kube-api-access-jbn7r\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402083 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8cn7\" (UniqueName: \"kubernetes.io/projected/e53156aa-ccc2-4554-8ac4-718db21b1ca8-kube-api-access-m8cn7\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-config\") pod 
\"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402221 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-ring-data-devices\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-combined-ca-bundle\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e53156aa-ccc2-4554-8ac4-718db21b1ca8-etc-swift\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402344 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-swiftconf\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402370 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-scripts\") pod 
\"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.402416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-dispersionconf\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.410258 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc846dbf-cslfh"] Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505104 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-ring-data-devices\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505143 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-combined-ca-bundle\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505187 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e53156aa-ccc2-4554-8ac4-718db21b1ca8-etc-swift\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-swiftconf\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505229 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-scripts\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505245 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505263 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-dispersionconf\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505285 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505303 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-dns-svc\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbn7r\" (UniqueName: \"kubernetes.io/projected/d1cf7d67-e486-4be5-b1be-9a6465598de1-kube-api-access-jbn7r\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505349 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8cn7\" (UniqueName: \"kubernetes.io/projected/e53156aa-ccc2-4554-8ac4-718db21b1ca8-kube-api-access-m8cn7\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.505376 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-config\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.506177 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-config\") pod 
\"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.506675 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.508055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-scripts\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.508456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-ring-data-devices\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.508540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.508702 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e53156aa-ccc2-4554-8ac4-718db21b1ca8-etc-swift\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" 
Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.509764 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-dns-svc\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.513047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-dispersionconf\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.520523 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-combined-ca-bundle\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.533226 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-swiftconf\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.535339 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8cn7\" (UniqueName: \"kubernetes.io/projected/e53156aa-ccc2-4554-8ac4-718db21b1ca8-kube-api-access-m8cn7\") pod \"swift-ring-rebalance-nh76t\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.542428 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jbn7r\" (UniqueName: \"kubernetes.io/projected/d1cf7d67-e486-4be5-b1be-9a6465598de1-kube-api-access-jbn7r\") pod \"dnsmasq-dns-7bc846dbf-cslfh\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") " pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.619534 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:52 crc kubenswrapper[4786]: I0313 16:40:52.689256 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:53 crc kubenswrapper[4786]: I0313 16:40:53.021253 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nh76t"] Mar 13 16:40:53 crc kubenswrapper[4786]: I0313 16:40:53.081564 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nh76t" event={"ID":"e53156aa-ccc2-4554-8ac4-718db21b1ca8","Type":"ContainerStarted","Data":"68ab7e500c1d1f242c59b8ec182809aed7a69a5f26b2fc605116cd8b66738cf5"} Mar 13 16:40:53 crc kubenswrapper[4786]: I0313 16:40:53.237686 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bc846dbf-cslfh"] Mar 13 16:40:53 crc kubenswrapper[4786]: W0313 16:40:53.241471 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1cf7d67_e486_4be5_b1be_9a6465598de1.slice/crio-50397e6d0883954e835b18c00a308ea5b8a3f50481b4f02de0f724e31e64a843 WatchSource:0}: Error finding container 50397e6d0883954e835b18c00a308ea5b8a3f50481b4f02de0f724e31e64a843: Status 404 returned error can't find the container with id 50397e6d0883954e835b18c00a308ea5b8a3f50481b4f02de0f724e31e64a843 Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.090226 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nh76t" 
event={"ID":"e53156aa-ccc2-4554-8ac4-718db21b1ca8","Type":"ContainerStarted","Data":"5e0ff3bf3870c7393d8eada6c2c402e42e6c43c44ca8ff9dc3fdae5223db1013"} Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.092281 4786 generic.go:334] "Generic (PLEG): container finished" podID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerID="0d195be89c4091a19dff7648bca97c371c58acdd25ab0ad27bd491e120fcb3fa" exitCode=0 Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.092319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" event={"ID":"d1cf7d67-e486-4be5-b1be-9a6465598de1","Type":"ContainerDied","Data":"0d195be89c4091a19dff7648bca97c371c58acdd25ab0ad27bd491e120fcb3fa"} Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.092342 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" event={"ID":"d1cf7d67-e486-4be5-b1be-9a6465598de1","Type":"ContainerStarted","Data":"50397e6d0883954e835b18c00a308ea5b8a3f50481b4f02de0f724e31e64a843"} Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.119231 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nh76t" podStartSLOduration=2.119212636 podStartE2EDuration="2.119212636s" podCreationTimestamp="2026-03-13 16:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:54.117496103 +0000 UTC m=+5884.280707904" watchObservedRunningTime="2026-03-13 16:40:54.119212636 +0000 UTC m=+5884.282424447" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.370189 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-74858c9bf6-mgk7p"] Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.372022 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.374613 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.392315 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74858c9bf6-mgk7p"] Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.455971 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-log-httpd\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.456091 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-config-data\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.456140 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbkt\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-kube-api-access-8gbkt\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.456368 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-run-httpd\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " 
pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.456492 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-combined-ca-bundle\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.456564 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-etc-swift\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.557909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-combined-ca-bundle\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.557951 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-etc-swift\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.558016 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-log-httpd\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " 
pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.558058 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-config-data\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.558091 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbkt\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-kube-api-access-8gbkt\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.558141 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-run-httpd\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.558593 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-run-httpd\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.558648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-log-httpd\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 
16:40:54.564017 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-combined-ca-bundle\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.564517 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-config-data\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.564960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-etc-swift\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.584495 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbkt\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-kube-api-access-8gbkt\") pod \"swift-proxy-74858c9bf6-mgk7p\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:54 crc kubenswrapper[4786]: I0313 16:40:54.686777 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:55 crc kubenswrapper[4786]: I0313 16:40:55.100683 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" event={"ID":"d1cf7d67-e486-4be5-b1be-9a6465598de1","Type":"ContainerStarted","Data":"12e93a7229724590eb99135af19d0d755824d9626ede8f3a29b08175b0cbf57f"} Mar 13 16:40:55 crc kubenswrapper[4786]: I0313 16:40:55.101025 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:40:55 crc kubenswrapper[4786]: I0313 16:40:55.122718 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" podStartSLOduration=3.1226974690000002 podStartE2EDuration="3.122697469s" podCreationTimestamp="2026-03-13 16:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:55.122688379 +0000 UTC m=+5885.285900190" watchObservedRunningTime="2026-03-13 16:40:55.122697469 +0000 UTC m=+5885.285909280" Mar 13 16:40:55 crc kubenswrapper[4786]: I0313 16:40:55.343571 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74858c9bf6-mgk7p"] Mar 13 16:40:55 crc kubenswrapper[4786]: W0313 16:40:55.395519 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ef85918_087d_4f24_9f74_91ff228f884f.slice/crio-8f62d9ce7238daada7be269d2723a8882d72cfaa587b1a2a17a0ddcf75e8fdf7 WatchSource:0}: Error finding container 8f62d9ce7238daada7be269d2723a8882d72cfaa587b1a2a17a0ddcf75e8fdf7: Status 404 returned error can't find the container with id 8f62d9ce7238daada7be269d2723a8882d72cfaa587b1a2a17a0ddcf75e8fdf7 Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.109338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-74858c9bf6-mgk7p" event={"ID":"2ef85918-087d-4f24-9f74-91ff228f884f","Type":"ContainerStarted","Data":"9ca5846c34dee28ed39e0151a839d2d4015d760ee6fb731a9426ebadbd155c8a"} Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.109970 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74858c9bf6-mgk7p" event={"ID":"2ef85918-087d-4f24-9f74-91ff228f884f","Type":"ContainerStarted","Data":"5296dec61e1f5c2f54429883df9a3091fb2afce4b6445ef3ab04f44837b2375a"} Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.109997 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.110007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74858c9bf6-mgk7p" event={"ID":"2ef85918-087d-4f24-9f74-91ff228f884f","Type":"ContainerStarted","Data":"8f62d9ce7238daada7be269d2723a8882d72cfaa587b1a2a17a0ddcf75e8fdf7"} Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.132155 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-74858c9bf6-mgk7p" podStartSLOduration=2.132139141 podStartE2EDuration="2.132139141s" podCreationTimestamp="2026-03-13 16:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:56.126885999 +0000 UTC m=+5886.290097810" watchObservedRunningTime="2026-03-13 16:40:56.132139141 +0000 UTC m=+5886.295350952" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.914638 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-85f65c99b4-74hxg"] Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.916573 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.918989 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.933252 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.940938 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85f65c99b4-74hxg"] Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.998785 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-run-httpd\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.998874 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-combined-ca-bundle\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.998898 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-log-httpd\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.998916 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-public-tls-certs\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.998933 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-internal-tls-certs\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.998965 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvrbk\" (UniqueName: \"kubernetes.io/projected/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-kube-api-access-kvrbk\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.999092 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-etc-swift\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:56 crc kubenswrapper[4786]: I0313 16:40:56.999153 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-config-data\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101094 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-internal-tls-certs\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101158 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvrbk\" (UniqueName: \"kubernetes.io/projected/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-kube-api-access-kvrbk\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101181 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-etc-swift\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101205 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-config-data\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101278 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-run-httpd\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-combined-ca-bundle\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101341 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-log-httpd\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.101357 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-public-tls-certs\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.106326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-log-httpd\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.106591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-run-httpd\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.111335 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-public-tls-certs\") pod 
\"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.111957 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-etc-swift\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.120263 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-internal-tls-certs\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.124888 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-combined-ca-bundle\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.127590 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-config-data\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.143054 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.145553 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvrbk\" (UniqueName: 
\"kubernetes.io/projected/b0cf0344-26b6-4d4f-919a-a478a51ffa7f-kube-api-access-kvrbk\") pod \"swift-proxy-85f65c99b4-74hxg\" (UID: \"b0cf0344-26b6-4d4f-919a-a478a51ffa7f\") " pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.234362 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:57 crc kubenswrapper[4786]: I0313 16:40:57.875996 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85f65c99b4-74hxg"] Mar 13 16:40:58 crc kubenswrapper[4786]: I0313 16:40:58.154395 4786 generic.go:334] "Generic (PLEG): container finished" podID="e53156aa-ccc2-4554-8ac4-718db21b1ca8" containerID="5e0ff3bf3870c7393d8eada6c2c402e42e6c43c44ca8ff9dc3fdae5223db1013" exitCode=0 Mar 13 16:40:58 crc kubenswrapper[4786]: I0313 16:40:58.154480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nh76t" event={"ID":"e53156aa-ccc2-4554-8ac4-718db21b1ca8","Type":"ContainerDied","Data":"5e0ff3bf3870c7393d8eada6c2c402e42e6c43c44ca8ff9dc3fdae5223db1013"} Mar 13 16:40:58 crc kubenswrapper[4786]: I0313 16:40:58.159152 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f65c99b4-74hxg" event={"ID":"b0cf0344-26b6-4d4f-919a-a478a51ffa7f","Type":"ContainerStarted","Data":"7609a38abf93eb14b76e6df00f68c80665ac38ec545f3e8ac1e17e86c5a95d84"} Mar 13 16:40:58 crc kubenswrapper[4786]: I0313 16:40:58.159188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f65c99b4-74hxg" event={"ID":"b0cf0344-26b6-4d4f-919a-a478a51ffa7f","Type":"ContainerStarted","Data":"744c7b7a6c615cd3d82a97463ee49d3a5b2ec7f284d42c4d0927a7cd016999c5"} Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.169970 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f65c99b4-74hxg" 
event={"ID":"b0cf0344-26b6-4d4f-919a-a478a51ffa7f","Type":"ContainerStarted","Data":"38e0850410cca691a29f0a1fe80fe8086189ab0e15c58324e22bbde2af549012"} Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.170255 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.170289 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.189196 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-85f65c99b4-74hxg" podStartSLOduration=3.189182092 podStartE2EDuration="3.189182092s" podCreationTimestamp="2026-03-13 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:40:59.187387207 +0000 UTC m=+5889.350599018" watchObservedRunningTime="2026-03-13 16:40:59.189182092 +0000 UTC m=+5889.352393903" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.582878 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-dispersionconf\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709630 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-ring-data-devices\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709651 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-combined-ca-bundle\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709668 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-swiftconf\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709687 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8cn7\" (UniqueName: \"kubernetes.io/projected/e53156aa-ccc2-4554-8ac4-718db21b1ca8-kube-api-access-m8cn7\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-scripts\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.709759 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e53156aa-ccc2-4554-8ac4-718db21b1ca8-etc-swift\") pod \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\" (UID: \"e53156aa-ccc2-4554-8ac4-718db21b1ca8\") " Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.710693 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.710871 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e53156aa-ccc2-4554-8ac4-718db21b1ca8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.711446 4786 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.711488 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e53156aa-ccc2-4554-8ac4-718db21b1ca8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.718344 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.722009 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53156aa-ccc2-4554-8ac4-718db21b1ca8-kube-api-access-m8cn7" (OuterVolumeSpecName: "kube-api-access-m8cn7") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "kube-api-access-m8cn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.731429 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-scripts" (OuterVolumeSpecName: "scripts") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.736498 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.740038 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e53156aa-ccc2-4554-8ac4-718db21b1ca8" (UID: "e53156aa-ccc2-4554-8ac4-718db21b1ca8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.815439 4786 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.815756 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.815937 4786 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e53156aa-ccc2-4554-8ac4-718db21b1ca8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.816061 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8cn7\" (UniqueName: \"kubernetes.io/projected/e53156aa-ccc2-4554-8ac4-718db21b1ca8-kube-api-access-m8cn7\") on node \"crc\" DevicePath \"\"" 
Mar 13 16:40:59 crc kubenswrapper[4786]: I0313 16:40:59.816176 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e53156aa-ccc2-4554-8ac4-718db21b1ca8-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:00 crc kubenswrapper[4786]: I0313 16:41:00.180935 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nh76t" Mar 13 16:41:00 crc kubenswrapper[4786]: I0313 16:41:00.180936 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nh76t" event={"ID":"e53156aa-ccc2-4554-8ac4-718db21b1ca8","Type":"ContainerDied","Data":"68ab7e500c1d1f242c59b8ec182809aed7a69a5f26b2fc605116cd8b66738cf5"} Mar 13 16:41:00 crc kubenswrapper[4786]: I0313 16:41:00.181023 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ab7e500c1d1f242c59b8ec182809aed7a69a5f26b2fc605116cd8b66738cf5" Mar 13 16:41:02 crc kubenswrapper[4786]: I0313 16:41:02.691158 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" Mar 13 16:41:02 crc kubenswrapper[4786]: I0313 16:41:02.782255 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dddd86fbf-vjdp4"] Mar 13 16:41:02 crc kubenswrapper[4786]: I0313 16:41:02.782569 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerName="dnsmasq-dns" containerID="cri-o://3a37ee914c9f7f113fdfb17dc05b287ee3134aa606c950fffa6a7e2aa0172409" gracePeriod=10 Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.207445 4786 generic.go:334] "Generic (PLEG): container finished" podID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerID="3a37ee914c9f7f113fdfb17dc05b287ee3134aa606c950fffa6a7e2aa0172409" exitCode=0 Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.207485 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" event={"ID":"34ae2a6c-c21f-475d-950c-e23b1f2722ce","Type":"ContainerDied","Data":"3a37ee914c9f7f113fdfb17dc05b287ee3134aa606c950fffa6a7e2aa0172409"} Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.310824 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.409539 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-dns-svc\") pod \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.409626 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-sb\") pod \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.409761 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-nb\") pod \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.409887 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tsj\" (UniqueName: \"kubernetes.io/projected/34ae2a6c-c21f-475d-950c-e23b1f2722ce-kube-api-access-v6tsj\") pod \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.409924 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-config\") pod \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\" (UID: \"34ae2a6c-c21f-475d-950c-e23b1f2722ce\") " Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.429208 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ae2a6c-c21f-475d-950c-e23b1f2722ce-kube-api-access-v6tsj" (OuterVolumeSpecName: "kube-api-access-v6tsj") pod "34ae2a6c-c21f-475d-950c-e23b1f2722ce" (UID: "34ae2a6c-c21f-475d-950c-e23b1f2722ce"). InnerVolumeSpecName "kube-api-access-v6tsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.473241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34ae2a6c-c21f-475d-950c-e23b1f2722ce" (UID: "34ae2a6c-c21f-475d-950c-e23b1f2722ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.494745 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34ae2a6c-c21f-475d-950c-e23b1f2722ce" (UID: "34ae2a6c-c21f-475d-950c-e23b1f2722ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.497420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34ae2a6c-c21f-475d-950c-e23b1f2722ce" (UID: "34ae2a6c-c21f-475d-950c-e23b1f2722ce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.511956 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.512054 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.512112 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.512173 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6tsj\" (UniqueName: \"kubernetes.io/projected/34ae2a6c-c21f-475d-950c-e23b1f2722ce-kube-api-access-v6tsj\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.512915 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-config" (OuterVolumeSpecName: "config") pod "34ae2a6c-c21f-475d-950c-e23b1f2722ce" (UID: "34ae2a6c-c21f-475d-950c-e23b1f2722ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:41:03 crc kubenswrapper[4786]: I0313 16:41:03.613107 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae2a6c-c21f-475d-950c-e23b1f2722ce-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.218882 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" event={"ID":"34ae2a6c-c21f-475d-950c-e23b1f2722ce","Type":"ContainerDied","Data":"8743a719fae08f25cf3aa33ca3f52a1431e570ca662603dd1ed75af2ea0e1e86"} Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.219263 4786 scope.go:117] "RemoveContainer" containerID="3a37ee914c9f7f113fdfb17dc05b287ee3134aa606c950fffa6a7e2aa0172409" Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.218994 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dddd86fbf-vjdp4" Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.263147 4786 scope.go:117] "RemoveContainer" containerID="cebef6588e079c2814e8e2dbabd7cd153356e73613eb16e3c80e85842b0f4cd1" Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.264304 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dddd86fbf-vjdp4"] Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.271554 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dddd86fbf-vjdp4"] Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.568838 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" path="/var/lib/kubelet/pods/34ae2a6c-c21f-475d-950c-e23b1f2722ce/volumes" Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.690326 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:41:04 crc kubenswrapper[4786]: I0313 16:41:04.690574 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:41:07 crc kubenswrapper[4786]: I0313 16:41:07.242664 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:41:07 crc kubenswrapper[4786]: I0313 16:41:07.247771 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85f65c99b4-74hxg" Mar 13 16:41:07 crc kubenswrapper[4786]: I0313 16:41:07.367057 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-74858c9bf6-mgk7p"] Mar 13 16:41:07 crc kubenswrapper[4786]: I0313 16:41:07.367379 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-74858c9bf6-mgk7p" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-httpd" containerID="cri-o://5296dec61e1f5c2f54429883df9a3091fb2afce4b6445ef3ab04f44837b2375a" gracePeriod=30 Mar 13 16:41:07 crc kubenswrapper[4786]: I0313 16:41:07.367499 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-74858c9bf6-mgk7p" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-server" containerID="cri-o://9ca5846c34dee28ed39e0151a839d2d4015d760ee6fb731a9426ebadbd155c8a" gracePeriod=30 Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.254064 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ef85918-087d-4f24-9f74-91ff228f884f" containerID="9ca5846c34dee28ed39e0151a839d2d4015d760ee6fb731a9426ebadbd155c8a" exitCode=0 Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.254307 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ef85918-087d-4f24-9f74-91ff228f884f" containerID="5296dec61e1f5c2f54429883df9a3091fb2afce4b6445ef3ab04f44837b2375a" exitCode=0 Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.255311 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-74858c9bf6-mgk7p" event={"ID":"2ef85918-087d-4f24-9f74-91ff228f884f","Type":"ContainerDied","Data":"9ca5846c34dee28ed39e0151a839d2d4015d760ee6fb731a9426ebadbd155c8a"} Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.255346 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74858c9bf6-mgk7p" event={"ID":"2ef85918-087d-4f24-9f74-91ff228f884f","Type":"ContainerDied","Data":"5296dec61e1f5c2f54429883df9a3091fb2afce4b6445ef3ab04f44837b2375a"} Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.532908 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.611139 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-etc-swift\") pod \"2ef85918-087d-4f24-9f74-91ff228f884f\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.611213 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gbkt\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-kube-api-access-8gbkt\") pod \"2ef85918-087d-4f24-9f74-91ff228f884f\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.611239 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-log-httpd\") pod \"2ef85918-087d-4f24-9f74-91ff228f884f\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.611260 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-combined-ca-bundle\") pod \"2ef85918-087d-4f24-9f74-91ff228f884f\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.611280 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-config-data\") pod \"2ef85918-087d-4f24-9f74-91ff228f884f\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.611329 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-run-httpd\") pod \"2ef85918-087d-4f24-9f74-91ff228f884f\" (UID: \"2ef85918-087d-4f24-9f74-91ff228f884f\") " Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.612385 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ef85918-087d-4f24-9f74-91ff228f884f" (UID: "2ef85918-087d-4f24-9f74-91ff228f884f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.612470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ef85918-087d-4f24-9f74-91ff228f884f" (UID: "2ef85918-087d-4f24-9f74-91ff228f884f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.613318 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.613337 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ef85918-087d-4f24-9f74-91ff228f884f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.624274 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2ef85918-087d-4f24-9f74-91ff228f884f" (UID: "2ef85918-087d-4f24-9f74-91ff228f884f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.624986 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-kube-api-access-8gbkt" (OuterVolumeSpecName: "kube-api-access-8gbkt") pod "2ef85918-087d-4f24-9f74-91ff228f884f" (UID: "2ef85918-087d-4f24-9f74-91ff228f884f"). InnerVolumeSpecName "kube-api-access-8gbkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.660583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ef85918-087d-4f24-9f74-91ff228f884f" (UID: "2ef85918-087d-4f24-9f74-91ff228f884f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.669574 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-config-data" (OuterVolumeSpecName: "config-data") pod "2ef85918-087d-4f24-9f74-91ff228f884f" (UID: "2ef85918-087d-4f24-9f74-91ff228f884f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.715339 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.715535 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gbkt\" (UniqueName: \"kubernetes.io/projected/2ef85918-087d-4f24-9f74-91ff228f884f-kube-api-access-8gbkt\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.715611 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:08 crc kubenswrapper[4786]: I0313 16:41:08.715694 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef85918-087d-4f24-9f74-91ff228f884f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:09 crc kubenswrapper[4786]: I0313 16:41:09.263896 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74858c9bf6-mgk7p" event={"ID":"2ef85918-087d-4f24-9f74-91ff228f884f","Type":"ContainerDied","Data":"8f62d9ce7238daada7be269d2723a8882d72cfaa587b1a2a17a0ddcf75e8fdf7"} Mar 13 16:41:09 crc kubenswrapper[4786]: I0313 16:41:09.263951 4786 scope.go:117] "RemoveContainer" 
containerID="9ca5846c34dee28ed39e0151a839d2d4015d760ee6fb731a9426ebadbd155c8a" Mar 13 16:41:09 crc kubenswrapper[4786]: I0313 16:41:09.264090 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74858c9bf6-mgk7p" Mar 13 16:41:09 crc kubenswrapper[4786]: I0313 16:41:09.293803 4786 scope.go:117] "RemoveContainer" containerID="5296dec61e1f5c2f54429883df9a3091fb2afce4b6445ef3ab04f44837b2375a" Mar 13 16:41:09 crc kubenswrapper[4786]: I0313 16:41:09.302906 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-74858c9bf6-mgk7p"] Mar 13 16:41:09 crc kubenswrapper[4786]: I0313 16:41:09.315829 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-74858c9bf6-mgk7p"] Mar 13 16:41:10 crc kubenswrapper[4786]: I0313 16:41:10.562479 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" path="/var/lib/kubelet/pods/2ef85918-087d-4f24-9f74-91ff228f884f/volumes" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.406256 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7f99v"] Mar 13 16:41:13 crc kubenswrapper[4786]: E0313 16:41:13.406887 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53156aa-ccc2-4554-8ac4-718db21b1ca8" containerName="swift-ring-rebalance" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.406900 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53156aa-ccc2-4554-8ac4-718db21b1ca8" containerName="swift-ring-rebalance" Mar 13 16:41:13 crc kubenswrapper[4786]: E0313 16:41:13.406911 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-server" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.406917 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-server" Mar 13 16:41:13 crc 
kubenswrapper[4786]: E0313 16:41:13.406930 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerName="dnsmasq-dns" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.406937 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerName="dnsmasq-dns" Mar 13 16:41:13 crc kubenswrapper[4786]: E0313 16:41:13.406952 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-httpd" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.406957 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-httpd" Mar 13 16:41:13 crc kubenswrapper[4786]: E0313 16:41:13.406976 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerName="init" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.406984 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerName="init" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.407152 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53156aa-ccc2-4554-8ac4-718db21b1ca8" containerName="swift-ring-rebalance" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.407167 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-server" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.407180 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef85918-087d-4f24-9f74-91ff228f884f" containerName="proxy-httpd" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.407190 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ae2a6c-c21f-475d-950c-e23b1f2722ce" containerName="dnsmasq-dns" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 
16:41:13.407693 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.416118 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7f99v"] Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.496787 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlszl\" (UniqueName: \"kubernetes.io/projected/155b42b2-bff4-4522-88b5-79a356cf2c9b-kube-api-access-nlszl\") pod \"cinder-db-create-7f99v\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") " pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.497426 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b42b2-bff4-4522-88b5-79a356cf2c9b-operator-scripts\") pod \"cinder-db-create-7f99v\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") " pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.500848 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a45a-account-create-update-fc84c"] Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.501812 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.503740 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.516006 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a45a-account-create-update-fc84c"] Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.598813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnwm\" (UniqueName: \"kubernetes.io/projected/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-kube-api-access-8tnwm\") pod \"cinder-a45a-account-create-update-fc84c\" (UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") " pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.598884 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlszl\" (UniqueName: \"kubernetes.io/projected/155b42b2-bff4-4522-88b5-79a356cf2c9b-kube-api-access-nlszl\") pod \"cinder-db-create-7f99v\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") " pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.599046 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b42b2-bff4-4522-88b5-79a356cf2c9b-operator-scripts\") pod \"cinder-db-create-7f99v\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") " pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.599088 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-operator-scripts\") pod \"cinder-a45a-account-create-update-fc84c\" (UID: 
\"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") " pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.599841 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b42b2-bff4-4522-88b5-79a356cf2c9b-operator-scripts\") pod \"cinder-db-create-7f99v\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") " pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.617799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlszl\" (UniqueName: \"kubernetes.io/projected/155b42b2-bff4-4522-88b5-79a356cf2c9b-kube-api-access-nlszl\") pod \"cinder-db-create-7f99v\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") " pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.700481 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-operator-scripts\") pod \"cinder-a45a-account-create-update-fc84c\" (UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") " pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.701401 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-operator-scripts\") pod \"cinder-a45a-account-create-update-fc84c\" (UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") " pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.701479 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnwm\" (UniqueName: \"kubernetes.io/projected/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-kube-api-access-8tnwm\") pod \"cinder-a45a-account-create-update-fc84c\" 
(UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") " pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.723357 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnwm\" (UniqueName: \"kubernetes.io/projected/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-kube-api-access-8tnwm\") pod \"cinder-a45a-account-create-update-fc84c\" (UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") " pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.728092 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7f99v" Mar 13 16:41:13 crc kubenswrapper[4786]: I0313 16:41:13.817376 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a45a-account-create-update-fc84c" Mar 13 16:41:14 crc kubenswrapper[4786]: W0313 16:41:14.003952 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod155b42b2_bff4_4522_88b5_79a356cf2c9b.slice/crio-73d43245c37377c108503653ec483aa802289e0c9e31baaeb6464aa24b6de4bc WatchSource:0}: Error finding container 73d43245c37377c108503653ec483aa802289e0c9e31baaeb6464aa24b6de4bc: Status 404 returned error can't find the container with id 73d43245c37377c108503653ec483aa802289e0c9e31baaeb6464aa24b6de4bc Mar 13 16:41:14 crc kubenswrapper[4786]: I0313 16:41:14.004991 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7f99v"] Mar 13 16:41:14 crc kubenswrapper[4786]: I0313 16:41:14.314831 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a45a-account-create-update-fc84c"] Mar 13 16:41:14 crc kubenswrapper[4786]: I0313 16:41:14.323926 4786 generic.go:334] "Generic (PLEG): container finished" podID="155b42b2-bff4-4522-88b5-79a356cf2c9b" 
containerID="15bb4362b12a5a94f398ac340672a96253d6cf023c83afd7ec81124664d2a4db" exitCode=0 Mar 13 16:41:14 crc kubenswrapper[4786]: I0313 16:41:14.323965 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7f99v" event={"ID":"155b42b2-bff4-4522-88b5-79a356cf2c9b","Type":"ContainerDied","Data":"15bb4362b12a5a94f398ac340672a96253d6cf023c83afd7ec81124664d2a4db"} Mar 13 16:41:14 crc kubenswrapper[4786]: I0313 16:41:14.324007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7f99v" event={"ID":"155b42b2-bff4-4522-88b5-79a356cf2c9b","Type":"ContainerStarted","Data":"73d43245c37377c108503653ec483aa802289e0c9e31baaeb6464aa24b6de4bc"} Mar 13 16:41:14 crc kubenswrapper[4786]: W0313 16:41:14.347805 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bcc1d5_7c5e_4848_a120_fd1438a9eefa.slice/crio-e19a3b60affc7b1ac2d8e423f9aa16376722206a176bfe1fe3b3556f14847aea WatchSource:0}: Error finding container e19a3b60affc7b1ac2d8e423f9aa16376722206a176bfe1fe3b3556f14847aea: Status 404 returned error can't find the container with id e19a3b60affc7b1ac2d8e423f9aa16376722206a176bfe1fe3b3556f14847aea Mar 13 16:41:14 crc kubenswrapper[4786]: E0313 16:41:14.741992 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bcc1d5_7c5e_4848_a120_fd1438a9eefa.slice/crio-conmon-3b1c82d2be967ff6faec2c05e8a6292934791fc3115b2186a5905385625879eb.scope\": RecentStats: unable to find data in memory cache]" Mar 13 16:41:14 crc kubenswrapper[4786]: I0313 16:41:14.825773 4786 scope.go:117] "RemoveContainer" containerID="dbeee43237c6aa5b8bf163df05c1d1a8f45a082222fe2a8a798c4a14327c53d6" Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.335371 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="81bcc1d5-7c5e-4848-a120-fd1438a9eefa" containerID="3b1c82d2be967ff6faec2c05e8a6292934791fc3115b2186a5905385625879eb" exitCode=0
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.335446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a45a-account-create-update-fc84c" event={"ID":"81bcc1d5-7c5e-4848-a120-fd1438a9eefa","Type":"ContainerDied","Data":"3b1c82d2be967ff6faec2c05e8a6292934791fc3115b2186a5905385625879eb"}
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.335480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a45a-account-create-update-fc84c" event={"ID":"81bcc1d5-7c5e-4848-a120-fd1438a9eefa","Type":"ContainerStarted","Data":"e19a3b60affc7b1ac2d8e423f9aa16376722206a176bfe1fe3b3556f14847aea"}
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.709889 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7f99v"
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.837107 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b42b2-bff4-4522-88b5-79a356cf2c9b-operator-scripts\") pod \"155b42b2-bff4-4522-88b5-79a356cf2c9b\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") "
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.837577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlszl\" (UniqueName: \"kubernetes.io/projected/155b42b2-bff4-4522-88b5-79a356cf2c9b-kube-api-access-nlszl\") pod \"155b42b2-bff4-4522-88b5-79a356cf2c9b\" (UID: \"155b42b2-bff4-4522-88b5-79a356cf2c9b\") "
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.837836 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/155b42b2-bff4-4522-88b5-79a356cf2c9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "155b42b2-bff4-4522-88b5-79a356cf2c9b" (UID: "155b42b2-bff4-4522-88b5-79a356cf2c9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.838184 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b42b2-bff4-4522-88b5-79a356cf2c9b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.845246 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155b42b2-bff4-4522-88b5-79a356cf2c9b-kube-api-access-nlszl" (OuterVolumeSpecName: "kube-api-access-nlszl") pod "155b42b2-bff4-4522-88b5-79a356cf2c9b" (UID: "155b42b2-bff4-4522-88b5-79a356cf2c9b"). InnerVolumeSpecName "kube-api-access-nlszl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:41:15 crc kubenswrapper[4786]: I0313 16:41:15.940088 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlszl\" (UniqueName: \"kubernetes.io/projected/155b42b2-bff4-4522-88b5-79a356cf2c9b-kube-api-access-nlszl\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.348021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7f99v" event={"ID":"155b42b2-bff4-4522-88b5-79a356cf2c9b","Type":"ContainerDied","Data":"73d43245c37377c108503653ec483aa802289e0c9e31baaeb6464aa24b6de4bc"}
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.348097 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d43245c37377c108503653ec483aa802289e0c9e31baaeb6464aa24b6de4bc"
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.348043 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7f99v"
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.744535 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a45a-account-create-update-fc84c"
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.855056 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-operator-scripts\") pod \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\" (UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") "
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.855265 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnwm\" (UniqueName: \"kubernetes.io/projected/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-kube-api-access-8tnwm\") pod \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\" (UID: \"81bcc1d5-7c5e-4848-a120-fd1438a9eefa\") "
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.855740 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81bcc1d5-7c5e-4848-a120-fd1438a9eefa" (UID: "81bcc1d5-7c5e-4848-a120-fd1438a9eefa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.863976 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-kube-api-access-8tnwm" (OuterVolumeSpecName: "kube-api-access-8tnwm") pod "81bcc1d5-7c5e-4848-a120-fd1438a9eefa" (UID: "81bcc1d5-7c5e-4848-a120-fd1438a9eefa"). InnerVolumeSpecName "kube-api-access-8tnwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.957635 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:16 crc kubenswrapper[4786]: I0313 16:41:16.957683 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnwm\" (UniqueName: \"kubernetes.io/projected/81bcc1d5-7c5e-4848-a120-fd1438a9eefa-kube-api-access-8tnwm\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:17 crc kubenswrapper[4786]: I0313 16:41:17.356548 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a45a-account-create-update-fc84c" event={"ID":"81bcc1d5-7c5e-4848-a120-fd1438a9eefa","Type":"ContainerDied","Data":"e19a3b60affc7b1ac2d8e423f9aa16376722206a176bfe1fe3b3556f14847aea"}
Mar 13 16:41:17 crc kubenswrapper[4786]: I0313 16:41:17.356594 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e19a3b60affc7b1ac2d8e423f9aa16376722206a176bfe1fe3b3556f14847aea"
Mar 13 16:41:17 crc kubenswrapper[4786]: I0313 16:41:17.356684 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a45a-account-create-update-fc84c"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.861423 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jvlvx"]
Mar 13 16:41:18 crc kubenswrapper[4786]: E0313 16:41:18.861773 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bcc1d5-7c5e-4848-a120-fd1438a9eefa" containerName="mariadb-account-create-update"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.861789 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bcc1d5-7c5e-4848-a120-fd1438a9eefa" containerName="mariadb-account-create-update"
Mar 13 16:41:18 crc kubenswrapper[4786]: E0313 16:41:18.861812 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155b42b2-bff4-4522-88b5-79a356cf2c9b" containerName="mariadb-database-create"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.861818 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b42b2-bff4-4522-88b5-79a356cf2c9b" containerName="mariadb-database-create"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.861975 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="155b42b2-bff4-4522-88b5-79a356cf2c9b" containerName="mariadb-database-create"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.861995 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="81bcc1d5-7c5e-4848-a120-fd1438a9eefa" containerName="mariadb-account-create-update"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.862502 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.864645 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.864880 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kwvcp"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.865422 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.880338 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jvlvx"]
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.895127 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dbd3a91-0884-41f5-804a-6407a7521819-etc-machine-id\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.895172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-db-sync-config-data\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.895201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldvs\" (UniqueName: \"kubernetes.io/projected/4dbd3a91-0884-41f5-804a-6407a7521819-kube-api-access-zldvs\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.895239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-config-data\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.895258 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-scripts\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.895294 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-combined-ca-bundle\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.996504 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-config-data\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.996572 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-scripts\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.996612 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-combined-ca-bundle\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.996693 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dbd3a91-0884-41f5-804a-6407a7521819-etc-machine-id\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.996710 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-db-sync-config-data\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.996732 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldvs\" (UniqueName: \"kubernetes.io/projected/4dbd3a91-0884-41f5-804a-6407a7521819-kube-api-access-zldvs\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:18 crc kubenswrapper[4786]: I0313 16:41:18.997415 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dbd3a91-0884-41f5-804a-6407a7521819-etc-machine-id\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.002183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-db-sync-config-data\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.003080 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-config-data\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.004587 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-combined-ca-bundle\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.008944 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-scripts\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.018544 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldvs\" (UniqueName: \"kubernetes.io/projected/4dbd3a91-0884-41f5-804a-6407a7521819-kube-api-access-zldvs\") pod \"cinder-db-sync-jvlvx\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") " pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.181100 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:19 crc kubenswrapper[4786]: I0313 16:41:19.686478 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jvlvx"]
Mar 13 16:41:20 crc kubenswrapper[4786]: I0313 16:41:20.408029 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jvlvx" event={"ID":"4dbd3a91-0884-41f5-804a-6407a7521819","Type":"ContainerStarted","Data":"46e422ce6f8fe6e7edeeb9bd139c5d8c1037f1be2b8b9aac5df840444ac19833"}
Mar 13 16:41:21 crc kubenswrapper[4786]: I0313 16:41:21.420877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jvlvx" event={"ID":"4dbd3a91-0884-41f5-804a-6407a7521819","Type":"ContainerStarted","Data":"f79b97c3ccaefcdb9b4d9b14d4b9f3c87a7a7ec9dbe88fcd0bbed4d1fd25eed6"}
Mar 13 16:41:21 crc kubenswrapper[4786]: I0313 16:41:21.438222 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jvlvx" podStartSLOduration=3.4382074449999998 podStartE2EDuration="3.438207445s" podCreationTimestamp="2026-03-13 16:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:41:21.437330323 +0000 UTC m=+5911.600542134" watchObservedRunningTime="2026-03-13 16:41:21.438207445 +0000 UTC m=+5911.601419256"
Mar 13 16:41:23 crc kubenswrapper[4786]: I0313 16:41:23.445257 4786 generic.go:334] "Generic (PLEG): container finished" podID="4dbd3a91-0884-41f5-804a-6407a7521819" containerID="f79b97c3ccaefcdb9b4d9b14d4b9f3c87a7a7ec9dbe88fcd0bbed4d1fd25eed6" exitCode=0
Mar 13 16:41:23 crc kubenswrapper[4786]: I0313 16:41:23.445368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jvlvx" event={"ID":"4dbd3a91-0884-41f5-804a-6407a7521819","Type":"ContainerDied","Data":"f79b97c3ccaefcdb9b4d9b14d4b9f3c87a7a7ec9dbe88fcd0bbed4d1fd25eed6"}
Mar 13 16:41:24 crc kubenswrapper[4786]: I0313 16:41:24.879252 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.007818 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-config-data\") pod \"4dbd3a91-0884-41f5-804a-6407a7521819\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") "
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008019 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-scripts\") pod \"4dbd3a91-0884-41f5-804a-6407a7521819\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") "
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008087 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-db-sync-config-data\") pod \"4dbd3a91-0884-41f5-804a-6407a7521819\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") "
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008113 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-combined-ca-bundle\") pod \"4dbd3a91-0884-41f5-804a-6407a7521819\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") "
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008230 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dbd3a91-0884-41f5-804a-6407a7521819-etc-machine-id\") pod \"4dbd3a91-0884-41f5-804a-6407a7521819\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") "
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008276 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zldvs\" (UniqueName: \"kubernetes.io/projected/4dbd3a91-0884-41f5-804a-6407a7521819-kube-api-access-zldvs\") pod \"4dbd3a91-0884-41f5-804a-6407a7521819\" (UID: \"4dbd3a91-0884-41f5-804a-6407a7521819\") "
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008353 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4dbd3a91-0884-41f5-804a-6407a7521819-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4dbd3a91-0884-41f5-804a-6407a7521819" (UID: "4dbd3a91-0884-41f5-804a-6407a7521819"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.008957 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dbd3a91-0884-41f5-804a-6407a7521819-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.013800 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4dbd3a91-0884-41f5-804a-6407a7521819" (UID: "4dbd3a91-0884-41f5-804a-6407a7521819"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.014553 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-scripts" (OuterVolumeSpecName: "scripts") pod "4dbd3a91-0884-41f5-804a-6407a7521819" (UID: "4dbd3a91-0884-41f5-804a-6407a7521819"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.023073 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbd3a91-0884-41f5-804a-6407a7521819-kube-api-access-zldvs" (OuterVolumeSpecName: "kube-api-access-zldvs") pod "4dbd3a91-0884-41f5-804a-6407a7521819" (UID: "4dbd3a91-0884-41f5-804a-6407a7521819"). InnerVolumeSpecName "kube-api-access-zldvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.033601 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dbd3a91-0884-41f5-804a-6407a7521819" (UID: "4dbd3a91-0884-41f5-804a-6407a7521819"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.066942 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-config-data" (OuterVolumeSpecName: "config-data") pod "4dbd3a91-0884-41f5-804a-6407a7521819" (UID: "4dbd3a91-0884-41f5-804a-6407a7521819"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.111322 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.111374 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.111391 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zldvs\" (UniqueName: \"kubernetes.io/projected/4dbd3a91-0884-41f5-804a-6407a7521819-kube-api-access-zldvs\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.111449 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.111465 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dbd3a91-0884-41f5-804a-6407a7521819-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.471277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jvlvx" event={"ID":"4dbd3a91-0884-41f5-804a-6407a7521819","Type":"ContainerDied","Data":"46e422ce6f8fe6e7edeeb9bd139c5d8c1037f1be2b8b9aac5df840444ac19833"}
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.471318 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e422ce6f8fe6e7edeeb9bd139c5d8c1037f1be2b8b9aac5df840444ac19833"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.471354 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jvlvx"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.961917 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bd7ddb875-t88qs"]
Mar 13 16:41:25 crc kubenswrapper[4786]: E0313 16:41:25.962209 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbd3a91-0884-41f5-804a-6407a7521819" containerName="cinder-db-sync"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.962220 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dbd3a91-0884-41f5-804a-6407a7521819" containerName="cinder-db-sync"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.962405 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbd3a91-0884-41f5-804a-6407a7521819" containerName="cinder-db-sync"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.963253 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:25 crc kubenswrapper[4786]: I0313 16:41:25.988464 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bd7ddb875-t88qs"]
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.026802 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2m7x\" (UniqueName: \"kubernetes.io/projected/37633ce3-d166-4549-a66b-d4696d0cb76d-kube-api-access-j2m7x\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.026854 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-dns-svc\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.026974 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-nb\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.027027 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-config\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.027112 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-sb\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.105759 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.108961 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.114340 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.114524 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kwvcp"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.114643 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.114810 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.122809 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.128894 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-sb\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.128960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2m7x\" (UniqueName: \"kubernetes.io/projected/37633ce3-d166-4549-a66b-d4696d0cb76d-kube-api-access-j2m7x\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.128988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-dns-svc\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.129057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-nb\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.129092 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-config\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.129803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-sb\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.129953 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-config\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.130226 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-nb\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.135269 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-dns-svc\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.152821 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2m7x\" (UniqueName: \"kubernetes.io/projected/37633ce3-d166-4549-a66b-d4696d0cb76d-kube-api-access-j2m7x\") pod \"dnsmasq-dns-5bd7ddb875-t88qs\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231128 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c5805a-b679-47ca-b261-503e8945fb66-logs\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231442 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data-custom\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231499 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231582 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-scripts\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231612 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3c5805a-b679-47ca-b261-503e8945fb66-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l56l4\" (UniqueName: \"kubernetes.io/projected/c3c5805a-b679-47ca-b261-503e8945fb66-kube-api-access-l56l4\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.231659 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.290023 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-scripts\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332702 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3c5805a-b679-47ca-b261-503e8945fb66-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l56l4\" (UniqueName: \"kubernetes.io/projected/c3c5805a-b679-47ca-b261-503e8945fb66-kube-api-access-l56l4\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332793 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c5805a-b679-47ca-b261-503e8945fb66-logs\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0"
Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332828 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\"
(UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data-custom\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.332878 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.333752 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3c5805a-b679-47ca-b261-503e8945fb66-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.334298 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c5805a-b679-47ca-b261-503e8945fb66-logs\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.337199 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-scripts\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.337520 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.337633 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data-custom\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.337737 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.353372 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l56l4\" (UniqueName: \"kubernetes.io/projected/c3c5805a-b679-47ca-b261-503e8945fb66-kube-api-access-l56l4\") pod \"cinder-api-0\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.423585 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.777553 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bd7ddb875-t88qs"] Mar 13 16:41:26 crc kubenswrapper[4786]: W0313 16:41:26.778432 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice/crio-10a8796d1b5f107cd29ed0f0b1bd174b2ba0a19f80e1e0a2e2e8e8c1b5d8bbf4 WatchSource:0}: Error finding container 10a8796d1b5f107cd29ed0f0b1bd174b2ba0a19f80e1e0a2e2e8e8c1b5d8bbf4: Status 404 returned error can't find the container with id 10a8796d1b5f107cd29ed0f0b1bd174b2ba0a19f80e1e0a2e2e8e8c1b5d8bbf4 Mar 13 16:41:26 crc kubenswrapper[4786]: I0313 16:41:26.924028 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 16:41:27 crc kubenswrapper[4786]: I0313 16:41:27.495446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3c5805a-b679-47ca-b261-503e8945fb66","Type":"ContainerStarted","Data":"1d6f30ec98ab1e40e1fb2bae7f197b7f683b43d549d744c4da5ddb31abe333ab"} Mar 13 16:41:27 crc kubenswrapper[4786]: I0313 16:41:27.499091 4786 generic.go:334] "Generic (PLEG): container finished" podID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerID="202ae79cb82f581e59ce6d2465ec8ae010cdc3207faec5fa00b1692a7d2815c8" exitCode=0 Mar 13 16:41:27 crc kubenswrapper[4786]: I0313 16:41:27.499116 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" event={"ID":"37633ce3-d166-4549-a66b-d4696d0cb76d","Type":"ContainerDied","Data":"202ae79cb82f581e59ce6d2465ec8ae010cdc3207faec5fa00b1692a7d2815c8"} Mar 13 16:41:27 crc kubenswrapper[4786]: I0313 16:41:27.499132 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" 
event={"ID":"37633ce3-d166-4549-a66b-d4696d0cb76d","Type":"ContainerStarted","Data":"10a8796d1b5f107cd29ed0f0b1bd174b2ba0a19f80e1e0a2e2e8e8c1b5d8bbf4"} Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.258664 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.512996 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3c5805a-b679-47ca-b261-503e8945fb66","Type":"ContainerStarted","Data":"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea"} Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.513063 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3c5805a-b679-47ca-b261-503e8945fb66","Type":"ContainerStarted","Data":"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880"} Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.513531 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.516048 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" event={"ID":"37633ce3-d166-4549-a66b-d4696d0cb76d","Type":"ContainerStarted","Data":"c152a135d84b04d752e27b57f8302098ae6505e6a0b268c565e5208190e92915"} Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.516209 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.546616 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.546598689 podStartE2EDuration="2.546598689s" podCreationTimestamp="2026-03-13 16:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:41:28.541480491 +0000 
UTC m=+5918.704692302" watchObservedRunningTime="2026-03-13 16:41:28.546598689 +0000 UTC m=+5918.709810500" Mar 13 16:41:28 crc kubenswrapper[4786]: I0313 16:41:28.578423 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" podStartSLOduration=3.577937438 podStartE2EDuration="3.577937438s" podCreationTimestamp="2026-03-13 16:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:41:28.574529463 +0000 UTC m=+5918.737741274" watchObservedRunningTime="2026-03-13 16:41:28.577937438 +0000 UTC m=+5918.741149249" Mar 13 16:41:29 crc kubenswrapper[4786]: I0313 16:41:29.522111 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api-log" containerID="cri-o://f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880" gracePeriod=30 Mar 13 16:41:29 crc kubenswrapper[4786]: I0313 16:41:29.522193 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api" containerID="cri-o://da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea" gracePeriod=30 Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.107971 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data-custom\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199408 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c5805a-b679-47ca-b261-503e8945fb66-logs\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199527 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l56l4\" (UniqueName: \"kubernetes.io/projected/c3c5805a-b679-47ca-b261-503e8945fb66-kube-api-access-l56l4\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199597 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-combined-ca-bundle\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199655 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199696 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-scripts\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.199747 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3c5805a-b679-47ca-b261-503e8945fb66-etc-machine-id\") pod \"c3c5805a-b679-47ca-b261-503e8945fb66\" (UID: \"c3c5805a-b679-47ca-b261-503e8945fb66\") " Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.200200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c5805a-b679-47ca-b261-503e8945fb66-logs" (OuterVolumeSpecName: "logs") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.200348 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3c5805a-b679-47ca-b261-503e8945fb66-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.204983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c5805a-b679-47ca-b261-503e8945fb66-kube-api-access-l56l4" (OuterVolumeSpecName: "kube-api-access-l56l4") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "kube-api-access-l56l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.205426 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.221145 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-scripts" (OuterVolumeSpecName: "scripts") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.223106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.254014 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data" (OuterVolumeSpecName: "config-data") pod "c3c5805a-b679-47ca-b261-503e8945fb66" (UID: "c3c5805a-b679-47ca-b261-503e8945fb66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.306920 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.306967 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c5805a-b679-47ca-b261-503e8945fb66-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.306986 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l56l4\" (UniqueName: \"kubernetes.io/projected/c3c5805a-b679-47ca-b261-503e8945fb66-kube-api-access-l56l4\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.307005 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.307020 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.307036 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c5805a-b679-47ca-b261-503e8945fb66-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.307051 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3c5805a-b679-47ca-b261-503e8945fb66-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538548 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="c3c5805a-b679-47ca-b261-503e8945fb66" containerID="da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea" exitCode=0 Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538588 4786 generic.go:334] "Generic (PLEG): container finished" podID="c3c5805a-b679-47ca-b261-503e8945fb66" containerID="f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880" exitCode=143 Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538620 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3c5805a-b679-47ca-b261-503e8945fb66","Type":"ContainerDied","Data":"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea"} Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538627 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538666 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3c5805a-b679-47ca-b261-503e8945fb66","Type":"ContainerDied","Data":"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880"} Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538688 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3c5805a-b679-47ca-b261-503e8945fb66","Type":"ContainerDied","Data":"1d6f30ec98ab1e40e1fb2bae7f197b7f683b43d549d744c4da5ddb31abe333ab"} Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.538714 4786 scope.go:117] "RemoveContainer" containerID="da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.596767 4786 scope.go:117] "RemoveContainer" containerID="f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.596930 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 
16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.605982 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.618220 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 16:41:30 crc kubenswrapper[4786]: E0313 16:41:30.622020 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.622044 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api" Mar 13 16:41:30 crc kubenswrapper[4786]: E0313 16:41:30.622086 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api-log" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.622093 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api-log" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.622258 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api-log" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.622278 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" containerName="cinder-api" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.623110 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.627106 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.627156 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.627374 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.627425 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.627553 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.627720 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.628150 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kwvcp" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.632176 4786 scope.go:117] "RemoveContainer" containerID="da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea" Mar 13 16:41:30 crc kubenswrapper[4786]: E0313 16:41:30.632728 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea\": container with ID starting with da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea not found: ID does not exist" containerID="da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.632778 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea"} err="failed to get container status \"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea\": rpc error: code = NotFound desc = could not find container \"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea\": container with ID starting with da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea not found: ID does not exist" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.632799 4786 scope.go:117] "RemoveContainer" containerID="f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880" Mar 13 16:41:30 crc kubenswrapper[4786]: E0313 16:41:30.633167 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880\": container with ID starting with f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880 not found: ID does not exist" containerID="f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.633201 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880"} err="failed to get container status \"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880\": rpc error: code = NotFound desc = could not find container \"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880\": container with ID starting with f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880 not found: ID does not exist" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.633231 4786 scope.go:117] "RemoveContainer" containerID="da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.633725 4786 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea"} err="failed to get container status \"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea\": rpc error: code = NotFound desc = could not find container \"da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea\": container with ID starting with da8307cc9a2787993ab9bb0acc7bfe8c8712396e75a95e2d84a7ec5bb2c3bcea not found: ID does not exist" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.633744 4786 scope.go:117] "RemoveContainer" containerID="f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.633994 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880"} err="failed to get container status \"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880\": rpc error: code = NotFound desc = could not find container \"f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880\": container with ID starting with f98adbc60aebc3d15f68af911ebde02c3bc8a04e0bb999beed750c976457f880 not found: ID does not exist" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.714067 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0" Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.714122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4r74\" (UniqueName: \"kubernetes.io/projected/864727bc-9673-4dbd-aef3-4b063a33c3d5-kube-api-access-c4r74\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " 
pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.714338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.714550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864727bc-9673-4dbd-aef3-4b063a33c3d5-logs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.714810 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.715151 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.715548 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864727bc-9673-4dbd-aef3-4b063a33c3d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.715925 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-scripts\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.716150 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.818477 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.818671 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864727bc-9673-4dbd-aef3-4b063a33c3d5-logs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.818781 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.818841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.818954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864727bc-9673-4dbd-aef3-4b063a33c3d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.819079 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-scripts\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.819126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.819201 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.819249 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4r74\" (UniqueName: \"kubernetes.io/projected/864727bc-9673-4dbd-aef3-4b063a33c3d5-kube-api-access-c4r74\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.819422 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864727bc-9673-4dbd-aef3-4b063a33c3d5-logs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.821099 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864727bc-9673-4dbd-aef3-4b063a33c3d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.824848 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-scripts\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.828089 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.828963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.829200 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.829980 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.833939 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.837975 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4r74\" (UniqueName: \"kubernetes.io/projected/864727bc-9673-4dbd-aef3-4b063a33c3d5-kube-api-access-c4r74\") pod \"cinder-api-0\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") " pod="openstack/cinder-api-0"
Mar 13 16:41:30 crc kubenswrapper[4786]: I0313 16:41:30.975830 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 16:41:31 crc kubenswrapper[4786]: W0313 16:41:31.474430 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod864727bc_9673_4dbd_aef3_4b063a33c3d5.slice/crio-c1af774f031101d988a607a14167955d61f61897164e2a1964629a5e1d27a2e7 WatchSource:0}: Error finding container c1af774f031101d988a607a14167955d61f61897164e2a1964629a5e1d27a2e7: Status 404 returned error can't find the container with id c1af774f031101d988a607a14167955d61f61897164e2a1964629a5e1d27a2e7
Mar 13 16:41:31 crc kubenswrapper[4786]: I0313 16:41:31.480200 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:41:31 crc kubenswrapper[4786]: I0313 16:41:31.560822 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"864727bc-9673-4dbd-aef3-4b063a33c3d5","Type":"ContainerStarted","Data":"c1af774f031101d988a607a14167955d61f61897164e2a1964629a5e1d27a2e7"}
Mar 13 16:41:32 crc kubenswrapper[4786]: I0313 16:41:32.568619 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c5805a-b679-47ca-b261-503e8945fb66" path="/var/lib/kubelet/pods/c3c5805a-b679-47ca-b261-503e8945fb66/volumes"
Mar 13 16:41:32 crc kubenswrapper[4786]: I0313 16:41:32.582182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"864727bc-9673-4dbd-aef3-4b063a33c3d5","Type":"ContainerStarted","Data":"82776681351c06889584fd98d2ec2d4ad966afed42a03732e224d8bb933c9ab6"}
Mar 13 16:41:33 crc kubenswrapper[4786]: I0313 16:41:33.597198 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"864727bc-9673-4dbd-aef3-4b063a33c3d5","Type":"ContainerStarted","Data":"5f9aab923a6dd1bb3f533889c6c8834a9fc5abd864325752cd6d68296e96d4bd"}
Mar 13 16:41:33 crc kubenswrapper[4786]: I0313 16:41:33.597627 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 13 16:41:33 crc kubenswrapper[4786]: I0313 16:41:33.631238 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.631212016 podStartE2EDuration="3.631212016s" podCreationTimestamp="2026-03-13 16:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:41:33.620911176 +0000 UTC m=+5923.784123017" watchObservedRunningTime="2026-03-13 16:41:33.631212016 +0000 UTC m=+5923.794423827"
Mar 13 16:41:36 crc kubenswrapper[4786]: I0313 16:41:36.292124 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs"
Mar 13 16:41:36 crc kubenswrapper[4786]: I0313 16:41:36.408728 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc846dbf-cslfh"]
Mar 13 16:41:36 crc kubenswrapper[4786]: I0313 16:41:36.409090 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerName="dnsmasq-dns" containerID="cri-o://12e93a7229724590eb99135af19d0d755824d9626ede8f3a29b08175b0cbf57f" gracePeriod=10
Mar 13 16:41:36 crc kubenswrapper[4786]: I0313 16:41:36.636164 4786 generic.go:334] "Generic (PLEG): container finished" podID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerID="12e93a7229724590eb99135af19d0d755824d9626ede8f3a29b08175b0cbf57f" exitCode=0
Mar 13 16:41:36 crc kubenswrapper[4786]: I0313 16:41:36.636253 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" event={"ID":"d1cf7d67-e486-4be5-b1be-9a6465598de1","Type":"ContainerDied","Data":"12e93a7229724590eb99135af19d0d755824d9626ede8f3a29b08175b0cbf57f"}
Mar 13 16:41:36 crc kubenswrapper[4786]: I0313 16:41:36.979049 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh"
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.054125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-nb\") pod \"d1cf7d67-e486-4be5-b1be-9a6465598de1\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") "
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.054196 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-dns-svc\") pod \"d1cf7d67-e486-4be5-b1be-9a6465598de1\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") "
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.054297 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbn7r\" (UniqueName: \"kubernetes.io/projected/d1cf7d67-e486-4be5-b1be-9a6465598de1-kube-api-access-jbn7r\") pod \"d1cf7d67-e486-4be5-b1be-9a6465598de1\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") "
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.054318 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-sb\") pod \"d1cf7d67-e486-4be5-b1be-9a6465598de1\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") "
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.054344 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-config\") pod \"d1cf7d67-e486-4be5-b1be-9a6465598de1\" (UID: \"d1cf7d67-e486-4be5-b1be-9a6465598de1\") "
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.072242 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1cf7d67-e486-4be5-b1be-9a6465598de1-kube-api-access-jbn7r" (OuterVolumeSpecName: "kube-api-access-jbn7r") pod "d1cf7d67-e486-4be5-b1be-9a6465598de1" (UID: "d1cf7d67-e486-4be5-b1be-9a6465598de1"). InnerVolumeSpecName "kube-api-access-jbn7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.112963 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-config" (OuterVolumeSpecName: "config") pod "d1cf7d67-e486-4be5-b1be-9a6465598de1" (UID: "d1cf7d67-e486-4be5-b1be-9a6465598de1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.121723 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1cf7d67-e486-4be5-b1be-9a6465598de1" (UID: "d1cf7d67-e486-4be5-b1be-9a6465598de1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.123373 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1cf7d67-e486-4be5-b1be-9a6465598de1" (UID: "d1cf7d67-e486-4be5-b1be-9a6465598de1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.133830 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1cf7d67-e486-4be5-b1be-9a6465598de1" (UID: "d1cf7d67-e486-4be5-b1be-9a6465598de1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.156698 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.156735 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.156746 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbn7r\" (UniqueName: \"kubernetes.io/projected/d1cf7d67-e486-4be5-b1be-9a6465598de1-kube-api-access-jbn7r\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.156758 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.156767 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cf7d67-e486-4be5-b1be-9a6465598de1-config\") on node \"crc\" DevicePath \"\""
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.651392 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh" event={"ID":"d1cf7d67-e486-4be5-b1be-9a6465598de1","Type":"ContainerDied","Data":"50397e6d0883954e835b18c00a308ea5b8a3f50481b4f02de0f724e31e64a843"}
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.651797 4786 scope.go:117] "RemoveContainer" containerID="12e93a7229724590eb99135af19d0d755824d9626ede8f3a29b08175b0cbf57f"
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.651489 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bc846dbf-cslfh"
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.709186 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bc846dbf-cslfh"]
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.714300 4786 scope.go:117] "RemoveContainer" containerID="0d195be89c4091a19dff7648bca97c371c58acdd25ab0ad27bd491e120fcb3fa"
Mar 13 16:41:37 crc kubenswrapper[4786]: I0313 16:41:37.718101 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bc846dbf-cslfh"]
Mar 13 16:41:38 crc kubenswrapper[4786]: I0313 16:41:38.570448 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" path="/var/lib/kubelet/pods/d1cf7d67-e486-4be5-b1be-9a6465598de1/volumes"
Mar 13 16:41:42 crc kubenswrapper[4786]: I0313 16:41:42.715009 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.090344 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-68djb"]
Mar 13 16:41:45 crc kubenswrapper[4786]: E0313 16:41:45.090985 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerName="dnsmasq-dns"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.090999 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerName="dnsmasq-dns"
Mar 13 16:41:45 crc kubenswrapper[4786]: E0313 16:41:45.091012 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerName="init"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.091018 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerName="init"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.091166 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1cf7d67-e486-4be5-b1be-9a6465598de1" containerName="dnsmasq-dns"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.092491 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.116355 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68djb"]
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.220519 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-utilities\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.220609 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-catalog-content\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.220644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2w8\" (UniqueName: \"kubernetes.io/projected/2ee7dfa4-45b7-4f8a-9e6f-234039712676-kube-api-access-rp2w8\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.322040 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-utilities\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.322176 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-catalog-content\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.322230 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2w8\" (UniqueName: \"kubernetes.io/projected/2ee7dfa4-45b7-4f8a-9e6f-234039712676-kube-api-access-rp2w8\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.322677 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-utilities\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.322807 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-catalog-content\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.345614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2w8\" (UniqueName: \"kubernetes.io/projected/2ee7dfa4-45b7-4f8a-9e6f-234039712676-kube-api-access-rp2w8\") pod \"certified-operators-68djb\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") " pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.423019 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:45 crc kubenswrapper[4786]: I0313 16:41:45.905762 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68djb"]
Mar 13 16:41:46 crc kubenswrapper[4786]: I0313 16:41:46.773348 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerID="9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049" exitCode=0
Mar 13 16:41:46 crc kubenswrapper[4786]: I0313 16:41:46.773718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerDied","Data":"9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049"}
Mar 13 16:41:46 crc kubenswrapper[4786]: I0313 16:41:46.774139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerStarted","Data":"5354ceb8ce29794caa364671a622ca1e7b4248a7e6d125b783e5032b94bd2438"}
Mar 13 16:41:47 crc kubenswrapper[4786]: I0313 16:41:47.787550 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerStarted","Data":"d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20"}
Mar 13 16:41:48 crc kubenswrapper[4786]: I0313 16:41:48.808766 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerID="d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20" exitCode=0
Mar 13 16:41:48 crc kubenswrapper[4786]: I0313 16:41:48.808828 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerDied","Data":"d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20"}
Mar 13 16:41:49 crc kubenswrapper[4786]: I0313 16:41:49.823158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerStarted","Data":"52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c"}
Mar 13 16:41:49 crc kubenswrapper[4786]: I0313 16:41:49.855319 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-68djb" podStartSLOduration=2.353507717 podStartE2EDuration="4.855299179s" podCreationTimestamp="2026-03-13 16:41:45 +0000 UTC" firstStartedPulling="2026-03-13 16:41:46.783445935 +0000 UTC m=+5936.946657776" lastFinishedPulling="2026-03-13 16:41:49.285237387 +0000 UTC m=+5939.448449238" observedRunningTime="2026-03-13 16:41:49.847309548 +0000 UTC m=+5940.010521369" watchObservedRunningTime="2026-03-13 16:41:49.855299179 +0000 UTC m=+5940.018511000"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.065056 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4fzfm"]
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.070832 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.084423 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fzfm"]
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.178189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmksf\" (UniqueName: \"kubernetes.io/projected/8ae944f8-6adc-4a54-8a13-7d971c2d503a-kube-api-access-wmksf\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.178259 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-catalog-content\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.178359 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-utilities\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.279779 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-catalog-content\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.280244 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-utilities\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.280319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-catalog-content\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.280393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmksf\" (UniqueName: \"kubernetes.io/projected/8ae944f8-6adc-4a54-8a13-7d971c2d503a-kube-api-access-wmksf\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.280629 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-utilities\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.300261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmksf\" (UniqueName: \"kubernetes.io/projected/8ae944f8-6adc-4a54-8a13-7d971c2d503a-kube-api-access-wmksf\") pod \"redhat-marketplace-4fzfm\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") " pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.403264 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:41:53 crc kubenswrapper[4786]: W0313 16:41:53.855697 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae944f8_6adc_4a54_8a13_7d971c2d503a.slice/crio-5ac0c236398acc4cdee66456aa3c82ab790ff953b91332fc776dc6409304f469 WatchSource:0}: Error finding container 5ac0c236398acc4cdee66456aa3c82ab790ff953b91332fc776dc6409304f469: Status 404 returned error can't find the container with id 5ac0c236398acc4cdee66456aa3c82ab790ff953b91332fc776dc6409304f469
Mar 13 16:41:53 crc kubenswrapper[4786]: I0313 16:41:53.860273 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fzfm"]
Mar 13 16:41:54 crc kubenswrapper[4786]: I0313 16:41:54.886115 4786 generic.go:334] "Generic (PLEG): container finished" podID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerID="0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15" exitCode=0
Mar 13 16:41:54 crc kubenswrapper[4786]: I0313 16:41:54.886542 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerDied","Data":"0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15"}
Mar 13 16:41:54 crc kubenswrapper[4786]: I0313 16:41:54.886718 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerStarted","Data":"5ac0c236398acc4cdee66456aa3c82ab790ff953b91332fc776dc6409304f469"}
Mar 13 16:41:55 crc kubenswrapper[4786]: I0313 16:41:55.424081 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:55 crc kubenswrapper[4786]: I0313 16:41:55.424555 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:55 crc kubenswrapper[4786]: I0313 16:41:55.515773 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:55 crc kubenswrapper[4786]: I0313 16:41:55.899024 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerStarted","Data":"f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e"}
Mar 13 16:41:55 crc kubenswrapper[4786]: I0313 16:41:55.970493 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:56 crc kubenswrapper[4786]: I0313 16:41:56.911206 4786 generic.go:334] "Generic (PLEG): container finished" podID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerID="f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e" exitCode=0
Mar 13 16:41:56 crc kubenswrapper[4786]: I0313 16:41:56.911312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerDied","Data":"f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e"}
Mar 13 16:41:57 crc kubenswrapper[4786]: I0313 16:41:57.838559 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68djb"]
Mar 13 16:41:57 crc kubenswrapper[4786]: I0313 16:41:57.936310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerStarted","Data":"5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34"}
Mar 13 16:41:57 crc kubenswrapper[4786]: I0313 16:41:57.936464 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-68djb" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="registry-server" containerID="cri-o://52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c" gracePeriod=2
Mar 13 16:41:57 crc kubenswrapper[4786]: I0313 16:41:57.976690 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4fzfm" podStartSLOduration=2.555234626 podStartE2EDuration="4.976656996s" podCreationTimestamp="2026-03-13 16:41:53 +0000 UTC" firstStartedPulling="2026-03-13 16:41:54.888268535 +0000 UTC m=+5945.051480386" lastFinishedPulling="2026-03-13 16:41:57.309690905 +0000 UTC m=+5947.472902756" observedRunningTime="2026-03-13 16:41:57.964039178 +0000 UTC m=+5948.127250999" watchObservedRunningTime="2026-03-13 16:41:57.976656996 +0000 UTC m=+5948.139868837"
Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.495478 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68djb"
Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.603928 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-utilities\") pod \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") "
Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.604018 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp2w8\" (UniqueName: \"kubernetes.io/projected/2ee7dfa4-45b7-4f8a-9e6f-234039712676-kube-api-access-rp2w8\") pod \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") "
Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.604148 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-catalog-content\") pod \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\" (UID: \"2ee7dfa4-45b7-4f8a-9e6f-234039712676\") "
Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.605062 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-utilities" (OuterVolumeSpecName: "utilities") pod "2ee7dfa4-45b7-4f8a-9e6f-234039712676" (UID: "2ee7dfa4-45b7-4f8a-9e6f-234039712676"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.617219 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee7dfa4-45b7-4f8a-9e6f-234039712676-kube-api-access-rp2w8" (OuterVolumeSpecName: "kube-api-access-rp2w8") pod "2ee7dfa4-45b7-4f8a-9e6f-234039712676" (UID: "2ee7dfa4-45b7-4f8a-9e6f-234039712676"). InnerVolumeSpecName "kube-api-access-rp2w8".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.706429 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.706465 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp2w8\" (UniqueName: \"kubernetes.io/projected/2ee7dfa4-45b7-4f8a-9e6f-234039712676-kube-api-access-rp2w8\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.949152 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerID="52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c" exitCode=0 Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.949251 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerDied","Data":"52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c"} Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.949538 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68djb" event={"ID":"2ee7dfa4-45b7-4f8a-9e6f-234039712676","Type":"ContainerDied","Data":"5354ceb8ce29794caa364671a622ca1e7b4248a7e6d125b783e5032b94bd2438"} Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.949280 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-68djb" Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.949589 4786 scope.go:117] "RemoveContainer" containerID="52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c" Mar 13 16:41:58 crc kubenswrapper[4786]: I0313 16:41:58.973223 4786 scope.go:117] "RemoveContainer" containerID="d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.000064 4786 scope.go:117] "RemoveContainer" containerID="9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.030937 4786 scope.go:117] "RemoveContainer" containerID="52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c" Mar 13 16:41:59 crc kubenswrapper[4786]: E0313 16:41:59.031438 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c\": container with ID starting with 52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c not found: ID does not exist" containerID="52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.031485 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c"} err="failed to get container status \"52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c\": rpc error: code = NotFound desc = could not find container \"52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c\": container with ID starting with 52641354f9194a8610d93ac4a58a78d950f90875739ebf4a76c5f2f028c75e1c not found: ID does not exist" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.031511 4786 scope.go:117] "RemoveContainer" 
containerID="d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20" Mar 13 16:41:59 crc kubenswrapper[4786]: E0313 16:41:59.031802 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20\": container with ID starting with d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20 not found: ID does not exist" containerID="d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.031841 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20"} err="failed to get container status \"d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20\": rpc error: code = NotFound desc = could not find container \"d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20\": container with ID starting with d69d62a5d46982e202815dc1eace880db0826bde08903e286d9c9eaa6cbdea20 not found: ID does not exist" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.031896 4786 scope.go:117] "RemoveContainer" containerID="9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049" Mar 13 16:41:59 crc kubenswrapper[4786]: E0313 16:41:59.032119 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049\": container with ID starting with 9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049 not found: ID does not exist" containerID="9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.032146 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049"} err="failed to get container status \"9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049\": rpc error: code = NotFound desc = could not find container \"9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049\": container with ID starting with 9874ccad927658928859218a35d5e4c9579285d7809725a6786d046459b14049 not found: ID does not exist" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.047490 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ee7dfa4-45b7-4f8a-9e6f-234039712676" (UID: "2ee7dfa4-45b7-4f8a-9e6f-234039712676"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.114532 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ee7dfa4-45b7-4f8a-9e6f-234039712676-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.283669 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68djb"] Mar 13 16:41:59 crc kubenswrapper[4786]: I0313 16:41:59.307434 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-68djb"] Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.091794 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:00 crc kubenswrapper[4786]: E0313 16:42:00.092306 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="extract-utilities" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.092327 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="extract-utilities" Mar 13 16:42:00 crc kubenswrapper[4786]: E0313 16:42:00.092364 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="extract-content" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.092373 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="extract-content" Mar 13 16:42:00 crc kubenswrapper[4786]: E0313 16:42:00.092409 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="registry-server" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.092421 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="registry-server" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.092669 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" containerName="registry-server" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.093947 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.101262 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.124482 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.176766 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.176991 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9118d024-0154-4f26-9d7d-fc8516b90904-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.177076 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvk62\" (UniqueName: \"kubernetes.io/projected/9118d024-0154-4f26-9d7d-fc8516b90904-kube-api-access-jvk62\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.177204 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 
16:42:00.177372 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.177515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-scripts\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.193409 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557002-jfkcc"] Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.194477 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.196763 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.196777 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.197688 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.203674 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557002-jfkcc"] Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.279886 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9118d024-0154-4f26-9d7d-fc8516b90904-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvk62\" (UniqueName: \"kubernetes.io/projected/9118d024-0154-4f26-9d7d-fc8516b90904-kube-api-access-jvk62\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280171 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280211 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drk7\" (UniqueName: \"kubernetes.io/projected/ec517d8e-ab19-4446-b4a4-17bd55010656-kube-api-access-2drk7\") pod \"auto-csr-approver-29557002-jfkcc\" (UID: \"ec517d8e-ab19-4446-b4a4-17bd55010656\") " pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280216 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9118d024-0154-4f26-9d7d-fc8516b90904-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280301 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.280361 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-scripts\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.285740 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.286548 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.294546 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.296312 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-scripts\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.305699 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvk62\" (UniqueName: \"kubernetes.io/projected/9118d024-0154-4f26-9d7d-fc8516b90904-kube-api-access-jvk62\") pod \"cinder-scheduler-0\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.381741 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2drk7\" (UniqueName: \"kubernetes.io/projected/ec517d8e-ab19-4446-b4a4-17bd55010656-kube-api-access-2drk7\") pod \"auto-csr-approver-29557002-jfkcc\" (UID: \"ec517d8e-ab19-4446-b4a4-17bd55010656\") " 
pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.396703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drk7\" (UniqueName: \"kubernetes.io/projected/ec517d8e-ab19-4446-b4a4-17bd55010656-kube-api-access-2drk7\") pod \"auto-csr-approver-29557002-jfkcc\" (UID: \"ec517d8e-ab19-4446-b4a4-17bd55010656\") " pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.424545 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.509521 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.571298 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee7dfa4-45b7-4f8a-9e6f-234039712676" path="/var/lib/kubelet/pods/2ee7dfa4-45b7-4f8a-9e6f-234039712676/volumes" Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.687762 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:00 crc kubenswrapper[4786]: W0313 16:42:00.701279 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9118d024_0154_4f26_9d7d_fc8516b90904.slice/crio-c529973bae7915630b07aa4a010f531fe78f146f6720db006b17a2c7765fdeb3 WatchSource:0}: Error finding container c529973bae7915630b07aa4a010f531fe78f146f6720db006b17a2c7765fdeb3: Status 404 returned error can't find the container with id c529973bae7915630b07aa4a010f531fe78f146f6720db006b17a2c7765fdeb3 Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.787750 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557002-jfkcc"] Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 
16:42:00.971233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" event={"ID":"ec517d8e-ab19-4446-b4a4-17bd55010656","Type":"ContainerStarted","Data":"978de514eb432de39165f2052201a4d16e814a8575dfcca534c6b8cbc5e1e00a"} Mar 13 16:42:00 crc kubenswrapper[4786]: I0313 16:42:00.973221 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9118d024-0154-4f26-9d7d-fc8516b90904","Type":"ContainerStarted","Data":"c529973bae7915630b07aa4a010f531fe78f146f6720db006b17a2c7765fdeb3"} Mar 13 16:42:01 crc kubenswrapper[4786]: I0313 16:42:01.585944 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 16:42:01 crc kubenswrapper[4786]: I0313 16:42:01.588047 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api-log" containerID="cri-o://82776681351c06889584fd98d2ec2d4ad966afed42a03732e224d8bb933c9ab6" gracePeriod=30 Mar 13 16:42:01 crc kubenswrapper[4786]: I0313 16:42:01.588125 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api" containerID="cri-o://5f9aab923a6dd1bb3f533889c6c8834a9fc5abd864325752cd6d68296e96d4bd" gracePeriod=30 Mar 13 16:42:01 crc kubenswrapper[4786]: I0313 16:42:01.985138 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9118d024-0154-4f26-9d7d-fc8516b90904","Type":"ContainerStarted","Data":"50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1"} Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.004586 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"9118d024-0154-4f26-9d7d-fc8516b90904","Type":"ContainerStarted","Data":"3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8"} Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.012301 4786 generic.go:334] "Generic (PLEG): container finished" podID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerID="82776681351c06889584fd98d2ec2d4ad966afed42a03732e224d8bb933c9ab6" exitCode=143 Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.012396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"864727bc-9673-4dbd-aef3-4b063a33c3d5","Type":"ContainerDied","Data":"82776681351c06889584fd98d2ec2d4ad966afed42a03732e224d8bb933c9ab6"} Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.016277 4786 generic.go:334] "Generic (PLEG): container finished" podID="ec517d8e-ab19-4446-b4a4-17bd55010656" containerID="bcae50cc8be497e6f2da73ae7b1ec0602d18a604b764ff4ca71f0e0a5666ca3e" exitCode=0 Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.016349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" event={"ID":"ec517d8e-ab19-4446-b4a4-17bd55010656","Type":"ContainerDied","Data":"bcae50cc8be497e6f2da73ae7b1ec0602d18a604b764ff4ca71f0e0a5666ca3e"} Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.057572 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.057550938 podStartE2EDuration="3.057550938s" podCreationTimestamp="2026-03-13 16:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:03.036392966 +0000 UTC m=+5953.199604817" watchObservedRunningTime="2026-03-13 16:42:03.057550938 +0000 UTC m=+5953.220762759" Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.403583 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-4fzfm" Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.403648 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4fzfm" Mar 13 16:42:03 crc kubenswrapper[4786]: I0313 16:42:03.490235 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4fzfm" Mar 13 16:42:04 crc kubenswrapper[4786]: I0313 16:42:04.092965 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4fzfm" Mar 13 16:42:04 crc kubenswrapper[4786]: I0313 16:42:04.166340 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fzfm"] Mar 13 16:42:04 crc kubenswrapper[4786]: I0313 16:42:04.403896 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:04 crc kubenswrapper[4786]: I0313 16:42:04.464539 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2drk7\" (UniqueName: \"kubernetes.io/projected/ec517d8e-ab19-4446-b4a4-17bd55010656-kube-api-access-2drk7\") pod \"ec517d8e-ab19-4446-b4a4-17bd55010656\" (UID: \"ec517d8e-ab19-4446-b4a4-17bd55010656\") " Mar 13 16:42:04 crc kubenswrapper[4786]: I0313 16:42:04.474003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec517d8e-ab19-4446-b4a4-17bd55010656-kube-api-access-2drk7" (OuterVolumeSpecName: "kube-api-access-2drk7") pod "ec517d8e-ab19-4446-b4a4-17bd55010656" (UID: "ec517d8e-ab19-4446-b4a4-17bd55010656"). InnerVolumeSpecName "kube-api-access-2drk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:42:04 crc kubenswrapper[4786]: I0313 16:42:04.566166 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2drk7\" (UniqueName: \"kubernetes.io/projected/ec517d8e-ab19-4446-b4a4-17bd55010656-kube-api-access-2drk7\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.071082 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.071145 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557002-jfkcc" event={"ID":"ec517d8e-ab19-4446-b4a4-17bd55010656","Type":"ContainerDied","Data":"978de514eb432de39165f2052201a4d16e814a8575dfcca534c6b8cbc5e1e00a"} Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.071982 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978de514eb432de39165f2052201a4d16e814a8575dfcca534c6b8cbc5e1e00a" Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.078770 4786 generic.go:334] "Generic (PLEG): container finished" podID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerID="5f9aab923a6dd1bb3f533889c6c8834a9fc5abd864325752cd6d68296e96d4bd" exitCode=0 Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.078827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"864727bc-9673-4dbd-aef3-4b063a33c3d5","Type":"ContainerDied","Data":"5f9aab923a6dd1bb3f533889c6c8834a9fc5abd864325752cd6d68296e96d4bd"} Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.244549 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383091 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864727bc-9673-4dbd-aef3-4b063a33c3d5-etc-machine-id\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383196 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-scripts\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383226 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383247 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-internal-tls-certs\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383257 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/864727bc-9673-4dbd-aef3-4b063a33c3d5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-combined-ca-bundle\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383379 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4r74\" (UniqueName: \"kubernetes.io/projected/864727bc-9673-4dbd-aef3-4b063a33c3d5-kube-api-access-c4r74\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383434 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-public-tls-certs\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383453 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864727bc-9673-4dbd-aef3-4b063a33c3d5-logs\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383470 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data-custom\") pod \"864727bc-9673-4dbd-aef3-4b063a33c3d5\" (UID: \"864727bc-9673-4dbd-aef3-4b063a33c3d5\") "
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.383810 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/864727bc-9673-4dbd-aef3-4b063a33c3d5-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.384596 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/864727bc-9673-4dbd-aef3-4b063a33c3d5-logs" (OuterVolumeSpecName: "logs") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.392717 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.398055 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-scripts" (OuterVolumeSpecName: "scripts") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.398897 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864727bc-9673-4dbd-aef3-4b063a33c3d5-kube-api-access-c4r74" (OuterVolumeSpecName: "kube-api-access-c4r74") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "kube-api-access-c4r74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.425365 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.435914 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.447932 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.479400 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.480602 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data" (OuterVolumeSpecName: "config-data") pod "864727bc-9673-4dbd-aef3-4b063a33c3d5" (UID: "864727bc-9673-4dbd-aef3-4b063a33c3d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485503 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4r74\" (UniqueName: \"kubernetes.io/projected/864727bc-9673-4dbd-aef3-4b063a33c3d5-kube-api-access-c4r74\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485539 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485553 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864727bc-9673-4dbd-aef3-4b063a33c3d5-logs\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485566 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485578 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485590 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485602 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.485614 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864727bc-9673-4dbd-aef3-4b063a33c3d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.487131 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556996-78wh6"]
Mar 13 16:42:05 crc kubenswrapper[4786]: I0313 16:42:05.494826 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556996-78wh6"]
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.093163 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.093161 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"864727bc-9673-4dbd-aef3-4b063a33c3d5","Type":"ContainerDied","Data":"c1af774f031101d988a607a14167955d61f61897164e2a1964629a5e1d27a2e7"}
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.093311 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4fzfm" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="registry-server" containerID="cri-o://5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34" gracePeriod=2
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.093617 4786 scope.go:117] "RemoveContainer" containerID="5f9aab923a6dd1bb3f533889c6c8834a9fc5abd864325752cd6d68296e96d4bd"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.144202 4786 scope.go:117] "RemoveContainer" containerID="82776681351c06889584fd98d2ec2d4ad966afed42a03732e224d8bb933c9ab6"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.145706 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.176419 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.187782 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:42:06 crc kubenswrapper[4786]: E0313 16:42:06.188292 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api-log"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.188315 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api-log"
Mar 13 16:42:06 crc kubenswrapper[4786]: E0313 16:42:06.188337 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec517d8e-ab19-4446-b4a4-17bd55010656" containerName="oc"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.188345 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec517d8e-ab19-4446-b4a4-17bd55010656" containerName="oc"
Mar 13 16:42:06 crc kubenswrapper[4786]: E0313 16:42:06.188363 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.188369 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.188577 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec517d8e-ab19-4446-b4a4-17bd55010656" containerName="oc"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.188595 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api-log"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.188623 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" containerName="cinder-api"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.189552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.192577 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.192896 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.193177 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.205261 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.302612 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-logs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.302926 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-scripts\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.303183 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.303243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.303279 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-config-data-custom\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.303610 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-config-data\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.303715 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.303920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.304250 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48tq\" (UniqueName: \"kubernetes.io/projected/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-kube-api-access-g48tq\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.406926 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48tq\" (UniqueName: \"kubernetes.io/projected/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-kube-api-access-g48tq\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-logs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-scripts\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407162 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407188 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407213 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-config-data-custom\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407243 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-config-data\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407285 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.407310 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.408125 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.408694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-logs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.411932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.412717 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.413295 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-config-data-custom\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.415308 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.420396 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-scripts\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.428971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-config-data\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.432364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48tq\" (UniqueName: \"kubernetes.io/projected/9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc-kube-api-access-g48tq\") pod \"cinder-api-0\" (UID: \"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc\") " pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.517978 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.526008 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.590251 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864727bc-9673-4dbd-aef3-4b063a33c3d5" path="/var/lib/kubelet/pods/864727bc-9673-4dbd-aef3-4b063a33c3d5/volumes"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.590830 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ecb255-40fb-45aa-8bdc-3080d956dec4" path="/var/lib/kubelet/pods/c6ecb255-40fb-45aa-8bdc-3080d956dec4/volumes"
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.611030 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmksf\" (UniqueName: \"kubernetes.io/projected/8ae944f8-6adc-4a54-8a13-7d971c2d503a-kube-api-access-wmksf\") pod \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") "
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.611082 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-utilities\") pod \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") "
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.611160 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-catalog-content\") pod \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\" (UID: \"8ae944f8-6adc-4a54-8a13-7d971c2d503a\") "
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.616047 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae944f8-6adc-4a54-8a13-7d971c2d503a-kube-api-access-wmksf" (OuterVolumeSpecName: "kube-api-access-wmksf") pod "8ae944f8-6adc-4a54-8a13-7d971c2d503a" (UID: "8ae944f8-6adc-4a54-8a13-7d971c2d503a"). InnerVolumeSpecName "kube-api-access-wmksf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.619477 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-utilities" (OuterVolumeSpecName: "utilities") pod "8ae944f8-6adc-4a54-8a13-7d971c2d503a" (UID: "8ae944f8-6adc-4a54-8a13-7d971c2d503a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.639140 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ae944f8-6adc-4a54-8a13-7d971c2d503a" (UID: "8ae944f8-6adc-4a54-8a13-7d971c2d503a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.713580 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmksf\" (UniqueName: \"kubernetes.io/projected/8ae944f8-6adc-4a54-8a13-7d971c2d503a-kube-api-access-wmksf\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.713650 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:06 crc kubenswrapper[4786]: I0313 16:42:06.713665 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ae944f8-6adc-4a54-8a13-7d971c2d503a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.109640 4786 generic.go:334] "Generic (PLEG): container finished" podID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerID="5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34" exitCode=0
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.109783 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4fzfm"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.109764 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerDied","Data":"5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34"}
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.109985 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4fzfm" event={"ID":"8ae944f8-6adc-4a54-8a13-7d971c2d503a","Type":"ContainerDied","Data":"5ac0c236398acc4cdee66456aa3c82ab790ff953b91332fc776dc6409304f469"}
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.110033 4786 scope.go:117] "RemoveContainer" containerID="5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.155699 4786 scope.go:117] "RemoveContainer" containerID="f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.178370 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fzfm"]
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.178536 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4fzfm"]
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.697152 4786 scope.go:117] "RemoveContainer" containerID="0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.722428 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.841422 4786 scope.go:117] "RemoveContainer" containerID="5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34"
Mar 13 16:42:07 crc kubenswrapper[4786]: E0313 16:42:07.841911 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34\": container with ID starting with 5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34 not found: ID does not exist" containerID="5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.841950 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34"} err="failed to get container status \"5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34\": rpc error: code = NotFound desc = could not find container \"5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34\": container with ID starting with 5a394828e4a23fc01f3a574fe80790adab58d5cb488fefb5a370e50a5d15fe34 not found: ID does not exist"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.841974 4786 scope.go:117] "RemoveContainer" containerID="f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e"
Mar 13 16:42:07 crc kubenswrapper[4786]: E0313 16:42:07.842208 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e\": container with ID starting with f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e not found: ID does not exist" containerID="f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.842241 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e"} err="failed to get container status \"f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e\": rpc error: code = NotFound desc = could not find container \"f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e\": container with ID starting with f90bcfdba11257ba6667b0cd50713f26b590202e5a433e3c2b89f7f9b970ab8e not found: ID does not exist"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.842258 4786 scope.go:117] "RemoveContainer" containerID="0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15"
Mar 13 16:42:07 crc kubenswrapper[4786]: E0313 16:42:07.842504 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15\": container with ID starting with 0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15 not found: ID does not exist" containerID="0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15"
Mar 13 16:42:07 crc kubenswrapper[4786]: I0313 16:42:07.842527 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15"} err="failed to get container status \"0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15\": rpc error: code = NotFound desc = could not find container \"0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15\": container with ID starting with 0478548618f580157885f10925365a45b19998a0e00f1576222cedac44ebfb15 not found: ID does not exist"
Mar 13 16:42:08 crc kubenswrapper[4786]: I0313 16:42:08.128212 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc","Type":"ContainerStarted","Data":"7ae9137329d46e8a14b7c691fceb51dcedf18e78cc1a38f690952f477f2cf79e"}
Mar 13 16:42:08 crc kubenswrapper[4786]: I0313 16:42:08.566230 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" path="/var/lib/kubelet/pods/8ae944f8-6adc-4a54-8a13-7d971c2d503a/volumes"
Mar 13 16:42:09 crc kubenswrapper[4786]: I0313 16:42:09.136548 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc","Type":"ContainerStarted","Data":"70cc1e82801391c7009568e6fb7bb3baaaeb0de3a2371d9cd2de43b2a015143a"}
Mar 13 16:42:09 crc kubenswrapper[4786]: I0313 16:42:09.136967 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 13 16:42:09 crc kubenswrapper[4786]: I0313 16:42:09.136979 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc","Type":"ContainerStarted","Data":"7e18d0e3df702d077b1bf463ff25083e4b0ed6d711a1377e0c597b6162ac6aac"}
Mar 13 16:42:09 crc kubenswrapper[4786]: I0313 16:42:09.162808 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.162782489 podStartE2EDuration="3.162782489s" podCreationTimestamp="2026-03-13 16:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:09.161480916 +0000 UTC m=+5959.324692727" watchObservedRunningTime="2026-03-13 16:42:09.162782489 +0000 UTC m=+5959.325994310"
Mar 13 16:42:10 crc kubenswrapper[4786]: I0313 16:42:10.636022 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 13 16:42:10 crc kubenswrapper[4786]: I0313 16:42:10.719025 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 16:42:11 crc kubenswrapper[4786]: I0313 16:42:11.156344 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="cinder-scheduler" containerID="cri-o://50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1" gracePeriod=30
Mar 13 16:42:11 crc kubenswrapper[4786]: I0313 16:42:11.156381 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="probe" containerID="cri-o://3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8" gracePeriod=30
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.179749 4786 generic.go:334] "Generic (PLEG): container finished" podID="9118d024-0154-4f26-9d7d-fc8516b90904" containerID="3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8" exitCode=0
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.180129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9118d024-0154-4f26-9d7d-fc8516b90904","Type":"ContainerDied","Data":"3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8"}
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.771387 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.810337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-combined-ca-bundle\") pod \"9118d024-0154-4f26-9d7d-fc8516b90904\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") "
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.818332 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-scripts\") pod \"9118d024-0154-4f26-9d7d-fc8516b90904\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") "
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.818708 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data-custom\") pod \"9118d024-0154-4f26-9d7d-fc8516b90904\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") "
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.818751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9118d024-0154-4f26-9d7d-fc8516b90904-etc-machine-id\") pod \"9118d024-0154-4f26-9d7d-fc8516b90904\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") "
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.818951 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data\") pod \"9118d024-0154-4f26-9d7d-fc8516b90904\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") "
Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.819131 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvk62\" (UniqueName: 
\"kubernetes.io/projected/9118d024-0154-4f26-9d7d-fc8516b90904-kube-api-access-jvk62\") pod \"9118d024-0154-4f26-9d7d-fc8516b90904\" (UID: \"9118d024-0154-4f26-9d7d-fc8516b90904\") " Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.820388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9118d024-0154-4f26-9d7d-fc8516b90904-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9118d024-0154-4f26-9d7d-fc8516b90904" (UID: "9118d024-0154-4f26-9d7d-fc8516b90904"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.821270 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9118d024-0154-4f26-9d7d-fc8516b90904-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.824475 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-scripts" (OuterVolumeSpecName: "scripts") pod "9118d024-0154-4f26-9d7d-fc8516b90904" (UID: "9118d024-0154-4f26-9d7d-fc8516b90904"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.824474 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9118d024-0154-4f26-9d7d-fc8516b90904-kube-api-access-jvk62" (OuterVolumeSpecName: "kube-api-access-jvk62") pod "9118d024-0154-4f26-9d7d-fc8516b90904" (UID: "9118d024-0154-4f26-9d7d-fc8516b90904"). InnerVolumeSpecName "kube-api-access-jvk62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.826013 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9118d024-0154-4f26-9d7d-fc8516b90904" (UID: "9118d024-0154-4f26-9d7d-fc8516b90904"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.866074 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9118d024-0154-4f26-9d7d-fc8516b90904" (UID: "9118d024-0154-4f26-9d7d-fc8516b90904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.911695 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data" (OuterVolumeSpecName: "config-data") pod "9118d024-0154-4f26-9d7d-fc8516b90904" (UID: "9118d024-0154-4f26-9d7d-fc8516b90904"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.926027 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.926348 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.926408 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.926480 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118d024-0154-4f26-9d7d-fc8516b90904-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:12 crc kubenswrapper[4786]: I0313 16:42:12.926540 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvk62\" (UniqueName: \"kubernetes.io/projected/9118d024-0154-4f26-9d7d-fc8516b90904-kube-api-access-jvk62\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.195161 4786 generic.go:334] "Generic (PLEG): container finished" podID="9118d024-0154-4f26-9d7d-fc8516b90904" containerID="50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1" exitCode=0 Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.195215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9118d024-0154-4f26-9d7d-fc8516b90904","Type":"ContainerDied","Data":"50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1"} Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 
16:42:13.195271 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9118d024-0154-4f26-9d7d-fc8516b90904","Type":"ContainerDied","Data":"c529973bae7915630b07aa4a010f531fe78f146f6720db006b17a2c7765fdeb3"} Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.195299 4786 scope.go:117] "RemoveContainer" containerID="3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.195290 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.235376 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.242829 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.244596 4786 scope.go:117] "RemoveContainer" containerID="50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.251877 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.252395 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="cinder-scheduler" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252415 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="cinder-scheduler" Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.252447 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="extract-content" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252453 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="extract-content" Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.252467 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="extract-utilities" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252474 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="extract-utilities" Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.252487 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="registry-server" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252493 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="registry-server" Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.252505 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="probe" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252512 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="probe" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252650 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="probe" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252663 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae944f8-6adc-4a54-8a13-7d971c2d503a" containerName="registry-server" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.252679 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" containerName="cinder-scheduler" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.253818 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.255753 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.267930 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.302092 4786 scope.go:117] "RemoveContainer" containerID="3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8" Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.302639 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8\": container with ID starting with 3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8 not found: ID does not exist" containerID="3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.302671 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8"} err="failed to get container status \"3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8\": rpc error: code = NotFound desc = could not find container \"3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8\": container with ID starting with 3ca143ac27e11f5523abf210cd030f679881b1c33d980caa7e2b0d411e1cd6b8 not found: ID does not exist" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.302696 4786 scope.go:117] "RemoveContainer" containerID="50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1" Mar 13 16:42:13 crc kubenswrapper[4786]: E0313 16:42:13.303180 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1\": container with ID starting with 50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1 not found: ID does not exist" containerID="50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.303208 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1"} err="failed to get container status \"50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1\": rpc error: code = NotFound desc = could not find container \"50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1\": container with ID starting with 50dc46bd30490dc3d9b60ea8e6a9b95d0a493858ed2772e63f2f116edd8c8ec1 not found: ID does not exist" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.456493 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.456759 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-scripts\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.456930 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c0dfd41-bd16-4216-b44c-41ebbd25af63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " 
pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.457062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.457153 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-config-data\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.457260 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc8d\" (UniqueName: \"kubernetes.io/projected/6c0dfd41-bd16-4216-b44c-41ebbd25af63-kube-api-access-6xc8d\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.558000 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc8d\" (UniqueName: \"kubernetes.io/projected/6c0dfd41-bd16-4216-b44c-41ebbd25af63-kube-api-access-6xc8d\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.558300 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc 
kubenswrapper[4786]: I0313 16:42:13.558403 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-scripts\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.558576 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c0dfd41-bd16-4216-b44c-41ebbd25af63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.558690 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.558790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-config-data\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.558696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c0dfd41-bd16-4216-b44c-41ebbd25af63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.563700 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-scripts\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.564506 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.571443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.572033 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0dfd41-bd16-4216-b44c-41ebbd25af63-config-data\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.585006 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc8d\" (UniqueName: \"kubernetes.io/projected/6c0dfd41-bd16-4216-b44c-41ebbd25af63-kube-api-access-6xc8d\") pod \"cinder-scheduler-0\" (UID: \"6c0dfd41-bd16-4216-b44c-41ebbd25af63\") " pod="openstack/cinder-scheduler-0" Mar 13 16:42:13 crc kubenswrapper[4786]: I0313 16:42:13.594968 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 16:42:14 crc kubenswrapper[4786]: I0313 16:42:14.078654 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 16:42:14 crc kubenswrapper[4786]: I0313 16:42:14.210783 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6c0dfd41-bd16-4216-b44c-41ebbd25af63","Type":"ContainerStarted","Data":"14c0c37cbff0eb25dafd651f3d16eefc36cb3e0ece8cd05002a3c3e92c41f0cc"} Mar 13 16:42:14 crc kubenswrapper[4786]: I0313 16:42:14.572607 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9118d024-0154-4f26-9d7d-fc8516b90904" path="/var/lib/kubelet/pods/9118d024-0154-4f26-9d7d-fc8516b90904/volumes" Mar 13 16:42:14 crc kubenswrapper[4786]: I0313 16:42:14.957171 4786 scope.go:117] "RemoveContainer" containerID="29b1da2d72be99bd636d74bf45826a04a282ffcf5457c18cc297564bc69fe9a2" Mar 13 16:42:15 crc kubenswrapper[4786]: I0313 16:42:15.032376 4786 scope.go:117] "RemoveContainer" containerID="d54087474411a1bfce3cf9f89513f5a54ef5502e83a32753d3ca6fb09e86914a" Mar 13 16:42:15 crc kubenswrapper[4786]: I0313 16:42:15.230329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6c0dfd41-bd16-4216-b44c-41ebbd25af63","Type":"ContainerStarted","Data":"848245a89a043b39de91616ba9c5980be686506c69fc7c44e542cd766ca7dfb0"} Mar 13 16:42:16 crc kubenswrapper[4786]: I0313 16:42:16.243378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6c0dfd41-bd16-4216-b44c-41ebbd25af63","Type":"ContainerStarted","Data":"ac4221df0aaf95451d2df9c108d51392a1d34015f9a8eb2716acb6ad11b775dc"} Mar 13 16:42:16 crc kubenswrapper[4786]: I0313 16:42:16.270610 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.270592979 podStartE2EDuration="3.270592979s" 
podCreationTimestamp="2026-03-13 16:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:16.265417788 +0000 UTC m=+5966.428629629" watchObservedRunningTime="2026-03-13 16:42:16.270592979 +0000 UTC m=+5966.433804790" Mar 13 16:42:18 crc kubenswrapper[4786]: I0313 16:42:18.276829 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 16:42:18 crc kubenswrapper[4786]: I0313 16:42:18.596105 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 16:42:23 crc kubenswrapper[4786]: I0313 16:42:23.854494 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.349124 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qsk42"] Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.350717 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.362756 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qsk42"] Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.447176 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9589-account-create-update-tfxvl"] Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.448837 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.450986 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.460943 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9589-account-create-update-tfxvl"] Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.538310 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a98a232-c926-4aba-85c2-a628b9c6d16d-operator-scripts\") pod \"glance-db-create-qsk42\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") " pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.538458 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlw22\" (UniqueName: \"kubernetes.io/projected/8a98a232-c926-4aba-85c2-a628b9c6d16d-kube-api-access-tlw22\") pod \"glance-db-create-qsk42\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") " pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.639433 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47v8\" (UniqueName: \"kubernetes.io/projected/0856c17b-ac5d-4f95-b0d1-d0b725c11454-kube-api-access-g47v8\") pod \"glance-9589-account-create-update-tfxvl\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") " pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.639698 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a98a232-c926-4aba-85c2-a628b9c6d16d-operator-scripts\") pod \"glance-db-create-qsk42\" (UID: 
\"8a98a232-c926-4aba-85c2-a628b9c6d16d\") " pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.640080 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0856c17b-ac5d-4f95-b0d1-d0b725c11454-operator-scripts\") pod \"glance-9589-account-create-update-tfxvl\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") " pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.640168 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlw22\" (UniqueName: \"kubernetes.io/projected/8a98a232-c926-4aba-85c2-a628b9c6d16d-kube-api-access-tlw22\") pod \"glance-db-create-qsk42\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") " pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.640796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a98a232-c926-4aba-85c2-a628b9c6d16d-operator-scripts\") pod \"glance-db-create-qsk42\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") " pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.658746 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlw22\" (UniqueName: \"kubernetes.io/projected/8a98a232-c926-4aba-85c2-a628b9c6d16d-kube-api-access-tlw22\") pod \"glance-db-create-qsk42\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") " pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.675541 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qsk42" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.742243 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0856c17b-ac5d-4f95-b0d1-d0b725c11454-operator-scripts\") pod \"glance-9589-account-create-update-tfxvl\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") " pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.742484 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47v8\" (UniqueName: \"kubernetes.io/projected/0856c17b-ac5d-4f95-b0d1-d0b725c11454-kube-api-access-g47v8\") pod \"glance-9589-account-create-update-tfxvl\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") " pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.743422 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0856c17b-ac5d-4f95-b0d1-d0b725c11454-operator-scripts\") pod \"glance-9589-account-create-update-tfxvl\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") " pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.765165 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47v8\" (UniqueName: \"kubernetes.io/projected/0856c17b-ac5d-4f95-b0d1-d0b725c11454-kube-api-access-g47v8\") pod \"glance-9589-account-create-update-tfxvl\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") " pod="openstack/glance-9589-account-create-update-tfxvl" Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.774263 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9589-account-create-update-tfxvl"
Mar 13 16:42:26 crc kubenswrapper[4786]: I0313 16:42:26.938309 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qsk42"]
Mar 13 16:42:27 crc kubenswrapper[4786]: I0313 16:42:27.253389 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9589-account-create-update-tfxvl"]
Mar 13 16:42:27 crc kubenswrapper[4786]: W0313 16:42:27.255941 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0856c17b_ac5d_4f95_b0d1_d0b725c11454.slice/crio-1b34af36217ff4c5b14ddade4b2c82fe3f931dcaad5689e34eb85f5fed59fa2b WatchSource:0}: Error finding container 1b34af36217ff4c5b14ddade4b2c82fe3f931dcaad5689e34eb85f5fed59fa2b: Status 404 returned error can't find the container with id 1b34af36217ff4c5b14ddade4b2c82fe3f931dcaad5689e34eb85f5fed59fa2b
Mar 13 16:42:27 crc kubenswrapper[4786]: I0313 16:42:27.356198 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9589-account-create-update-tfxvl" event={"ID":"0856c17b-ac5d-4f95-b0d1-d0b725c11454","Type":"ContainerStarted","Data":"1b34af36217ff4c5b14ddade4b2c82fe3f931dcaad5689e34eb85f5fed59fa2b"}
Mar 13 16:42:27 crc kubenswrapper[4786]: I0313 16:42:27.358327 4786 generic.go:334] "Generic (PLEG): container finished" podID="8a98a232-c926-4aba-85c2-a628b9c6d16d" containerID="7ebe02a3bf38c5b519df664d7024ae116da02c953617fef97b7557774ea9aa42" exitCode=0
Mar 13 16:42:27 crc kubenswrapper[4786]: I0313 16:42:27.358378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qsk42" event={"ID":"8a98a232-c926-4aba-85c2-a628b9c6d16d","Type":"ContainerDied","Data":"7ebe02a3bf38c5b519df664d7024ae116da02c953617fef97b7557774ea9aa42"}
Mar 13 16:42:27 crc kubenswrapper[4786]: I0313 16:42:27.358396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qsk42" event={"ID":"8a98a232-c926-4aba-85c2-a628b9c6d16d","Type":"ContainerStarted","Data":"f9bf0018aa85c75166b7bf4ef4da259c9af64448567d40c86254506bd382de83"}
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.372100 4786 generic.go:334] "Generic (PLEG): container finished" podID="0856c17b-ac5d-4f95-b0d1-d0b725c11454" containerID="3f0355fa5a3e553fa139e964751be8306a12464b6081f5e860114efbb812d6cf" exitCode=0
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.372762 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9589-account-create-update-tfxvl" event={"ID":"0856c17b-ac5d-4f95-b0d1-d0b725c11454","Type":"ContainerDied","Data":"3f0355fa5a3e553fa139e964751be8306a12464b6081f5e860114efbb812d6cf"}
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.786209 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qsk42"
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.894426 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlw22\" (UniqueName: \"kubernetes.io/projected/8a98a232-c926-4aba-85c2-a628b9c6d16d-kube-api-access-tlw22\") pod \"8a98a232-c926-4aba-85c2-a628b9c6d16d\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") "
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.894575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a98a232-c926-4aba-85c2-a628b9c6d16d-operator-scripts\") pod \"8a98a232-c926-4aba-85c2-a628b9c6d16d\" (UID: \"8a98a232-c926-4aba-85c2-a628b9c6d16d\") "
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.896146 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a98a232-c926-4aba-85c2-a628b9c6d16d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a98a232-c926-4aba-85c2-a628b9c6d16d" (UID: "8a98a232-c926-4aba-85c2-a628b9c6d16d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.903782 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a98a232-c926-4aba-85c2-a628b9c6d16d-kube-api-access-tlw22" (OuterVolumeSpecName: "kube-api-access-tlw22") pod "8a98a232-c926-4aba-85c2-a628b9c6d16d" (UID: "8a98a232-c926-4aba-85c2-a628b9c6d16d"). InnerVolumeSpecName "kube-api-access-tlw22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.996978 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlw22\" (UniqueName: \"kubernetes.io/projected/8a98a232-c926-4aba-85c2-a628b9c6d16d-kube-api-access-tlw22\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:28 crc kubenswrapper[4786]: I0313 16:42:28.997114 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a98a232-c926-4aba-85c2-a628b9c6d16d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:29 crc kubenswrapper[4786]: I0313 16:42:29.386812 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qsk42"
Mar 13 16:42:29 crc kubenswrapper[4786]: I0313 16:42:29.386830 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qsk42" event={"ID":"8a98a232-c926-4aba-85c2-a628b9c6d16d","Type":"ContainerDied","Data":"f9bf0018aa85c75166b7bf4ef4da259c9af64448567d40c86254506bd382de83"}
Mar 13 16:42:29 crc kubenswrapper[4786]: I0313 16:42:29.387275 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9bf0018aa85c75166b7bf4ef4da259c9af64448567d40c86254506bd382de83"
Mar 13 16:42:29 crc kubenswrapper[4786]: I0313 16:42:29.881749 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9589-account-create-update-tfxvl"
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.018263 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0856c17b-ac5d-4f95-b0d1-d0b725c11454-operator-scripts\") pod \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") "
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.018472 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47v8\" (UniqueName: \"kubernetes.io/projected/0856c17b-ac5d-4f95-b0d1-d0b725c11454-kube-api-access-g47v8\") pod \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\" (UID: \"0856c17b-ac5d-4f95-b0d1-d0b725c11454\") "
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.019228 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0856c17b-ac5d-4f95-b0d1-d0b725c11454-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0856c17b-ac5d-4f95-b0d1-d0b725c11454" (UID: "0856c17b-ac5d-4f95-b0d1-d0b725c11454"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.032114 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0856c17b-ac5d-4f95-b0d1-d0b725c11454-kube-api-access-g47v8" (OuterVolumeSpecName: "kube-api-access-g47v8") pod "0856c17b-ac5d-4f95-b0d1-d0b725c11454" (UID: "0856c17b-ac5d-4f95-b0d1-d0b725c11454"). InnerVolumeSpecName "kube-api-access-g47v8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.121035 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47v8\" (UniqueName: \"kubernetes.io/projected/0856c17b-ac5d-4f95-b0d1-d0b725c11454-kube-api-access-g47v8\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.121146 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0856c17b-ac5d-4f95-b0d1-d0b725c11454-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.400738 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9589-account-create-update-tfxvl" event={"ID":"0856c17b-ac5d-4f95-b0d1-d0b725c11454","Type":"ContainerDied","Data":"1b34af36217ff4c5b14ddade4b2c82fe3f931dcaad5689e34eb85f5fed59fa2b"}
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.400796 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b34af36217ff4c5b14ddade4b2c82fe3f931dcaad5689e34eb85f5fed59fa2b"
Mar 13 16:42:30 crc kubenswrapper[4786]: I0313 16:42:30.400832 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9589-account-create-update-tfxvl"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.723823 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4c84r"]
Mar 13 16:42:31 crc kubenswrapper[4786]: E0313 16:42:31.724518 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0856c17b-ac5d-4f95-b0d1-d0b725c11454" containerName="mariadb-account-create-update"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.724535 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0856c17b-ac5d-4f95-b0d1-d0b725c11454" containerName="mariadb-account-create-update"
Mar 13 16:42:31 crc kubenswrapper[4786]: E0313 16:42:31.724558 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a98a232-c926-4aba-85c2-a628b9c6d16d" containerName="mariadb-database-create"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.724565 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a98a232-c926-4aba-85c2-a628b9c6d16d" containerName="mariadb-database-create"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.724728 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a98a232-c926-4aba-85c2-a628b9c6d16d" containerName="mariadb-database-create"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.724744 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0856c17b-ac5d-4f95-b0d1-d0b725c11454" containerName="mariadb-account-create-update"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.725481 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.731198 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wh5b4"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.731351 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.737599 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4c84r"]
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.858429 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-combined-ca-bundle\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.858523 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-db-sync-config-data\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.858609 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-config-data\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.858738 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrjf\" (UniqueName: \"kubernetes.io/projected/aa484794-3f56-40f8-b139-b1c8ed536c2c-kube-api-access-csrjf\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.960118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-db-sync-config-data\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.960203 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-config-data\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.960282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrjf\" (UniqueName: \"kubernetes.io/projected/aa484794-3f56-40f8-b139-b1c8ed536c2c-kube-api-access-csrjf\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.960466 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-combined-ca-bundle\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.966249 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-config-data\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.967259 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-db-sync-config-data\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.967588 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-combined-ca-bundle\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:31 crc kubenswrapper[4786]: I0313 16:42:31.988326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrjf\" (UniqueName: \"kubernetes.io/projected/aa484794-3f56-40f8-b139-b1c8ed536c2c-kube-api-access-csrjf\") pod \"glance-db-sync-4c84r\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") " pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:32 crc kubenswrapper[4786]: I0313 16:42:32.067575 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:32 crc kubenswrapper[4786]: I0313 16:42:32.606793 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4c84r"]
Mar 13 16:42:33 crc kubenswrapper[4786]: I0313 16:42:33.452642 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4c84r" event={"ID":"aa484794-3f56-40f8-b139-b1c8ed536c2c","Type":"ContainerStarted","Data":"7adf84386f2c7e8fc5d33877be0b4bcc40c815b4c4847700145610e4b83a278f"}
Mar 13 16:42:33 crc kubenswrapper[4786]: I0313 16:42:33.452923 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4c84r" event={"ID":"aa484794-3f56-40f8-b139-b1c8ed536c2c","Type":"ContainerStarted","Data":"730eeb783da821189991ad0f43e3cdfca7dc108acb12aed428b98560574a8253"}
Mar 13 16:42:33 crc kubenswrapper[4786]: I0313 16:42:33.489740 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4c84r" podStartSLOduration=2.489719172 podStartE2EDuration="2.489719172s" podCreationTimestamp="2026-03-13 16:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:33.478141041 +0000 UTC m=+5983.641352892" watchObservedRunningTime="2026-03-13 16:42:33.489719172 +0000 UTC m=+5983.652930983"
Mar 13 16:42:36 crc kubenswrapper[4786]: I0313 16:42:36.491698 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa484794-3f56-40f8-b139-b1c8ed536c2c" containerID="7adf84386f2c7e8fc5d33877be0b4bcc40c815b4c4847700145610e4b83a278f" exitCode=0
Mar 13 16:42:36 crc kubenswrapper[4786]: I0313 16:42:36.491806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4c84r" event={"ID":"aa484794-3f56-40f8-b139-b1c8ed536c2c","Type":"ContainerDied","Data":"7adf84386f2c7e8fc5d33877be0b4bcc40c815b4c4847700145610e4b83a278f"}
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.029670 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.178967 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-config-data\") pod \"aa484794-3f56-40f8-b139-b1c8ed536c2c\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") "
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.179080 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-combined-ca-bundle\") pod \"aa484794-3f56-40f8-b139-b1c8ed536c2c\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") "
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.179108 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-db-sync-config-data\") pod \"aa484794-3f56-40f8-b139-b1c8ed536c2c\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") "
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.179184 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csrjf\" (UniqueName: \"kubernetes.io/projected/aa484794-3f56-40f8-b139-b1c8ed536c2c-kube-api-access-csrjf\") pod \"aa484794-3f56-40f8-b139-b1c8ed536c2c\" (UID: \"aa484794-3f56-40f8-b139-b1c8ed536c2c\") "
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.186142 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aa484794-3f56-40f8-b139-b1c8ed536c2c" (UID: "aa484794-3f56-40f8-b139-b1c8ed536c2c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.193201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa484794-3f56-40f8-b139-b1c8ed536c2c-kube-api-access-csrjf" (OuterVolumeSpecName: "kube-api-access-csrjf") pod "aa484794-3f56-40f8-b139-b1c8ed536c2c" (UID: "aa484794-3f56-40f8-b139-b1c8ed536c2c"). InnerVolumeSpecName "kube-api-access-csrjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.212260 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa484794-3f56-40f8-b139-b1c8ed536c2c" (UID: "aa484794-3f56-40f8-b139-b1c8ed536c2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.230546 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-config-data" (OuterVolumeSpecName: "config-data") pod "aa484794-3f56-40f8-b139-b1c8ed536c2c" (UID: "aa484794-3f56-40f8-b139-b1c8ed536c2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.281381 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrjf\" (UniqueName: \"kubernetes.io/projected/aa484794-3f56-40f8-b139-b1c8ed536c2c-kube-api-access-csrjf\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.281476 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.281498 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.281516 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa484794-3f56-40f8-b139-b1c8ed536c2c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.521496 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4c84r" event={"ID":"aa484794-3f56-40f8-b139-b1c8ed536c2c","Type":"ContainerDied","Data":"730eeb783da821189991ad0f43e3cdfca7dc108acb12aed428b98560574a8253"}
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.521553 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730eeb783da821189991ad0f43e3cdfca7dc108acb12aed428b98560574a8253"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.521688 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4c84r"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.950329 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77444f4759-c7kpj"]
Mar 13 16:42:38 crc kubenswrapper[4786]: E0313 16:42:38.950970 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa484794-3f56-40f8-b139-b1c8ed536c2c" containerName="glance-db-sync"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.950986 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa484794-3f56-40f8-b139-b1c8ed536c2c" containerName="glance-db-sync"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.951149 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa484794-3f56-40f8-b139-b1c8ed536c2c" containerName="glance-db-sync"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.952185 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.960931 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77444f4759-c7kpj"]
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.977741 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.979528 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.981462 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.981635 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wh5b4"
Mar 13 16:42:38 crc kubenswrapper[4786]: I0313 16:42:38.982189 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.003153 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.050669 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.052282 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.054758 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.059340 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101264 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-sb\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101300 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101327 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhq9k\" (UniqueName: \"kubernetes.io/projected/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-kube-api-access-vhq9k\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101385 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-dns-svc\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101442 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq9cc\" (UniqueName: \"kubernetes.io/projected/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-kube-api-access-rq9cc\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101469 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101638 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-config\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101699 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101799 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-nb\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.101842 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-logs\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.203603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.203754 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-sb\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.203783 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.203810 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.203943 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhq9k\" (UniqueName: \"kubernetes.io/projected/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-kube-api-access-vhq9k\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204462 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204511 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-dns-svc\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204661 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204701 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204780 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq9cc\" (UniqueName: \"kubernetes.io/projected/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-kube-api-access-rq9cc\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204874 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204948 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-config\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.204976 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5zk\" (UniqueName: \"kubernetes.io/projected/08c78a3e-054e-45fd-8093-ca8b5233a0e9-kube-api-access-4t5zk\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.205000 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.205073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-nb\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.205102 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-logs\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.205504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-sb\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.205532 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-logs\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.206119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-config\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.207210 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-nb\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.207231 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-dns-svc\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj"
Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.210454 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.210742 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.211776 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.222019 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhq9k\" (UniqueName: \"kubernetes.io/projected/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-kube-api-access-vhq9k\") pod \"dnsmasq-dns-77444f4759-c7kpj\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " pod="openstack/dnsmasq-dns-77444f4759-c7kpj" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.232268 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq9cc\" (UniqueName: \"kubernetes.io/projected/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-kube-api-access-rq9cc\") pod \"glance-default-external-api-0\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.268377 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.302616 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.306563 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.306621 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.306691 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.306747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.306833 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.306926 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5zk\" (UniqueName: \"kubernetes.io/projected/08c78a3e-054e-45fd-8093-ca8b5233a0e9-kube-api-access-4t5zk\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.308178 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.308471 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.313179 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.316320 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.317614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.329021 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5zk\" (UniqueName: \"kubernetes.io/projected/08c78a3e-054e-45fd-8093-ca8b5233a0e9-kube-api-access-4t5zk\") pod \"glance-default-internal-api-0\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.378503 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.766667 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:39 crc kubenswrapper[4786]: I0313 16:42:39.846087 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77444f4759-c7kpj"] Mar 13 16:42:39 crc kubenswrapper[4786]: W0313 16:42:39.849415 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe2b1fe2_d5a2_4e13_b192_f7ea9f658bc6.slice/crio-7c20f1fff60404cbf21c675a51b21d8e55c662afcecfb0792808c048d1d36212 WatchSource:0}: Error finding container 7c20f1fff60404cbf21c675a51b21d8e55c662afcecfb0792808c048d1d36212: Status 404 returned error can't find the container with id 7c20f1fff60404cbf21c675a51b21d8e55c662afcecfb0792808c048d1d36212 Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.009694 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.112815 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:40 crc kubenswrapper[4786]: W0313 16:42:40.123399 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c78a3e_054e_45fd_8093_ca8b5233a0e9.slice/crio-6f474a3e2cbe720e92fe64965a2a1b30a7720b1a5069bec6737312a1d642552a WatchSource:0}: Error finding container 6f474a3e2cbe720e92fe64965a2a1b30a7720b1a5069bec6737312a1d642552a: Status 404 returned error can't find the container with id 6f474a3e2cbe720e92fe64965a2a1b30a7720b1a5069bec6737312a1d642552a Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.540643 4786 generic.go:334] "Generic (PLEG): container finished" podID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" 
containerID="4c5e30097bc2ea6d86d373d53039e366883f7792bf86dbe3e36c7eb59bea3d76" exitCode=0 Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.540894 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" event={"ID":"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6","Type":"ContainerDied","Data":"4c5e30097bc2ea6d86d373d53039e366883f7792bf86dbe3e36c7eb59bea3d76"} Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.540945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" event={"ID":"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6","Type":"ContainerStarted","Data":"7c20f1fff60404cbf21c675a51b21d8e55c662afcecfb0792808c048d1d36212"} Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.570105 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ed594b5-d5fd-46c0-b1cc-dbd06b705694","Type":"ContainerStarted","Data":"64fc7bcf7a961d2598d0d0f578a6d368f14950c8e30fe8ea8eb5652bd1a03dcc"} Mar 13 16:42:40 crc kubenswrapper[4786]: I0313 16:42:40.570152 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08c78a3e-054e-45fd-8093-ca8b5233a0e9","Type":"ContainerStarted","Data":"6f474a3e2cbe720e92fe64965a2a1b30a7720b1a5069bec6737312a1d642552a"} Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.129637 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.576741 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ed594b5-d5fd-46c0-b1cc-dbd06b705694","Type":"ContainerStarted","Data":"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c"} Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.576799 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5ed594b5-d5fd-46c0-b1cc-dbd06b705694","Type":"ContainerStarted","Data":"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1"} Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.576850 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-log" containerID="cri-o://9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1" gracePeriod=30 Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.576927 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-httpd" containerID="cri-o://002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c" gracePeriod=30 Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.583686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08c78a3e-054e-45fd-8093-ca8b5233a0e9","Type":"ContainerStarted","Data":"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539"} Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.584084 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08c78a3e-054e-45fd-8093-ca8b5233a0e9","Type":"ContainerStarted","Data":"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be"} Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.584218 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-log" containerID="cri-o://37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be" gracePeriod=30 Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.584321 4786 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-httpd" containerID="cri-o://e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539" gracePeriod=30 Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.594522 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" event={"ID":"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6","Type":"ContainerStarted","Data":"180e87d8c98c309fcc6202a062d93dbcd89f2ff2aba573890234f0b5905f3a3c"} Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.594707 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.626407 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.626386035 podStartE2EDuration="3.626386035s" podCreationTimestamp="2026-03-13 16:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:41.617882361 +0000 UTC m=+5991.781094172" watchObservedRunningTime="2026-03-13 16:42:41.626386035 +0000 UTC m=+5991.789597846" Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.668034 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.668012193 podStartE2EDuration="2.668012193s" podCreationTimestamp="2026-03-13 16:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:41.663374957 +0000 UTC m=+5991.826586768" watchObservedRunningTime="2026-03-13 16:42:41.668012193 +0000 UTC m=+5991.831224004" Mar 13 16:42:41 crc kubenswrapper[4786]: I0313 16:42:41.700146 4786 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" podStartSLOduration=3.700132182 podStartE2EDuration="3.700132182s" podCreationTimestamp="2026-03-13 16:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:41.699072765 +0000 UTC m=+5991.862284576" watchObservedRunningTime="2026-03-13 16:42:41.700132182 +0000 UTC m=+5991.863343993" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.271922 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.368315 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5zk\" (UniqueName: \"kubernetes.io/projected/08c78a3e-054e-45fd-8093-ca8b5233a0e9-kube-api-access-4t5zk\") pod \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.368412 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-scripts\") pod \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.368444 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-config-data\") pod \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.368639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-httpd-run\") pod \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\" (UID: 
\"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.368677 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-combined-ca-bundle\") pod \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.368724 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-logs\") pod \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\" (UID: \"08c78a3e-054e-45fd-8093-ca8b5233a0e9\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.369345 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-logs" (OuterVolumeSpecName: "logs") pod "08c78a3e-054e-45fd-8093-ca8b5233a0e9" (UID: "08c78a3e-054e-45fd-8093-ca8b5233a0e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.369545 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "08c78a3e-054e-45fd-8093-ca8b5233a0e9" (UID: "08c78a3e-054e-45fd-8093-ca8b5233a0e9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.375565 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-scripts" (OuterVolumeSpecName: "scripts") pod "08c78a3e-054e-45fd-8093-ca8b5233a0e9" (UID: "08c78a3e-054e-45fd-8093-ca8b5233a0e9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.376750 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c78a3e-054e-45fd-8093-ca8b5233a0e9-kube-api-access-4t5zk" (OuterVolumeSpecName: "kube-api-access-4t5zk") pod "08c78a3e-054e-45fd-8093-ca8b5233a0e9" (UID: "08c78a3e-054e-45fd-8093-ca8b5233a0e9"). InnerVolumeSpecName "kube-api-access-4t5zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.405796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c78a3e-054e-45fd-8093-ca8b5233a0e9" (UID: "08c78a3e-054e-45fd-8093-ca8b5233a0e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.444406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-config-data" (OuterVolumeSpecName: "config-data") pod "08c78a3e-054e-45fd-8093-ca8b5233a0e9" (UID: "08c78a3e-054e-45fd-8093-ca8b5233a0e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.470370 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.470421 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.470434 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c78a3e-054e-45fd-8093-ca8b5233a0e9-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.470444 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5zk\" (UniqueName: \"kubernetes.io/projected/08c78a3e-054e-45fd-8093-ca8b5233a0e9-kube-api-access-4t5zk\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.470454 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.470462 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c78a3e-054e-45fd-8093-ca8b5233a0e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.546641 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606752 4786 generic.go:334] "Generic (PLEG): container finished" podID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerID="002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c" exitCode=0 Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606781 4786 generic.go:334] "Generic (PLEG): container finished" podID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerID="9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1" exitCode=143 Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606811 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606830 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ed594b5-d5fd-46c0-b1cc-dbd06b705694","Type":"ContainerDied","Data":"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c"} Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606888 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ed594b5-d5fd-46c0-b1cc-dbd06b705694","Type":"ContainerDied","Data":"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1"} Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ed594b5-d5fd-46c0-b1cc-dbd06b705694","Type":"ContainerDied","Data":"64fc7bcf7a961d2598d0d0f578a6d368f14950c8e30fe8ea8eb5652bd1a03dcc"} Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.606915 4786 scope.go:117] "RemoveContainer" containerID="002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.608252 4786 generic.go:334] "Generic (PLEG): 
container finished" podID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerID="e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539" exitCode=143 Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.608278 4786 generic.go:334] "Generic (PLEG): container finished" podID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerID="37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be" exitCode=143 Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.608292 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.608374 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08c78a3e-054e-45fd-8093-ca8b5233a0e9","Type":"ContainerDied","Data":"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539"} Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.608401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08c78a3e-054e-45fd-8093-ca8b5233a0e9","Type":"ContainerDied","Data":"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be"} Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.608411 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08c78a3e-054e-45fd-8093-ca8b5233a0e9","Type":"ContainerDied","Data":"6f474a3e2cbe720e92fe64965a2a1b30a7720b1a5069bec6737312a1d642552a"} Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.633790 4786 scope.go:117] "RemoveContainer" containerID="9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.639457 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.650967 4786 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.656715 4786 scope.go:117] "RemoveContainer" containerID="002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.658592 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.658772 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c\": container with ID starting with 002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c not found: ID does not exist" containerID="002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.658832 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c"} err="failed to get container status \"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c\": rpc error: code = NotFound desc = could not find container \"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c\": container with ID starting with 002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.658898 4786 scope.go:117] "RemoveContainer" containerID="9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1" Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.658994 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-log" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659007 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" 
containerName="glance-log" Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.659022 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-httpd" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659029 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-httpd" Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.659044 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-httpd" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659050 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-httpd" Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.659085 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-log" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659092 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-log" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659252 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-httpd" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659267 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-httpd" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659287 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" containerName="glance-log" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659300 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" containerName="glance-log" Mar 13 
16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.659557 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1\": container with ID starting with 9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1 not found: ID does not exist" containerID="9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659590 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1"} err="failed to get container status \"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1\": rpc error: code = NotFound desc = could not find container \"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1\": container with ID starting with 9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1 not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659616 4786 scope.go:117] "RemoveContainer" containerID="002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659920 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c"} err="failed to get container status \"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c\": rpc error: code = NotFound desc = could not find container \"002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c\": container with ID starting with 002da8bd63fc5da7676bfb919ca911879a3ec4f55d720cce19f00a36bce9de3c not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.659951 4786 scope.go:117] "RemoveContainer" 
containerID="9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.660138 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.660357 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1"} err="failed to get container status \"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1\": rpc error: code = NotFound desc = could not find container \"9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1\": container with ID starting with 9f279af38e6e64219fb391bfa4cea0378a8a8d7e65b7a9c54a4e455d3aa07de1 not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.660378 4786 scope.go:117] "RemoveContainer" containerID="e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.662957 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.663051 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.672869 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-scripts\") pod \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.672946 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-logs\") pod 
\"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.673024 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-config-data\") pod \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.673130 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-httpd-run\") pod \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.673157 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-combined-ca-bundle\") pod \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.673216 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq9cc\" (UniqueName: \"kubernetes.io/projected/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-kube-api-access-rq9cc\") pod \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\" (UID: \"5ed594b5-d5fd-46c0-b1cc-dbd06b705694\") " Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.673384 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-logs" (OuterVolumeSpecName: "logs") pod "5ed594b5-d5fd-46c0-b1cc-dbd06b705694" (UID: "5ed594b5-d5fd-46c0-b1cc-dbd06b705694"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.673956 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.674033 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ed594b5-d5fd-46c0-b1cc-dbd06b705694" (UID: "5ed594b5-d5fd-46c0-b1cc-dbd06b705694"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.676464 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.676488 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-kube-api-access-rq9cc" (OuterVolumeSpecName: "kube-api-access-rq9cc") pod "5ed594b5-d5fd-46c0-b1cc-dbd06b705694" (UID: "5ed594b5-d5fd-46c0-b1cc-dbd06b705694"). InnerVolumeSpecName "kube-api-access-rq9cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.682728 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-scripts" (OuterVolumeSpecName: "scripts") pod "5ed594b5-d5fd-46c0-b1cc-dbd06b705694" (UID: "5ed594b5-d5fd-46c0-b1cc-dbd06b705694"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.683913 4786 scope.go:117] "RemoveContainer" containerID="37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.705894 4786 scope.go:117] "RemoveContainer" containerID="e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539" Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.706665 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539\": container with ID starting with e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539 not found: ID does not exist" containerID="e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.706702 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539"} err="failed to get container status \"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539\": rpc error: code = NotFound desc = could not find container \"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539\": container with ID starting with e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539 not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.706731 4786 scope.go:117] "RemoveContainer" containerID="37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be" Mar 13 16:42:42 crc kubenswrapper[4786]: E0313 16:42:42.707137 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be\": container with ID starting with 
37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be not found: ID does not exist" containerID="37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.707163 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be"} err="failed to get container status \"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be\": rpc error: code = NotFound desc = could not find container \"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be\": container with ID starting with 37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.707179 4786 scope.go:117] "RemoveContainer" containerID="e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.707543 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539"} err="failed to get container status \"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539\": rpc error: code = NotFound desc = could not find container \"e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539\": container with ID starting with e8aa0d0ed4c9ce1977a1077a0ff950822d78df2713b70205484bde276878f539 not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.707597 4786 scope.go:117] "RemoveContainer" containerID="37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.707975 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be"} err="failed to get container status 
\"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be\": rpc error: code = NotFound desc = could not find container \"37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be\": container with ID starting with 37a5544c3ae685220ef35c74c7dc83c65ac2f76f1d707a556c41533dd68010be not found: ID does not exist" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.723050 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ed594b5-d5fd-46c0-b1cc-dbd06b705694" (UID: "5ed594b5-d5fd-46c0-b1cc-dbd06b705694"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.736365 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-config-data" (OuterVolumeSpecName: "config-data") pod "5ed594b5-d5fd-46c0-b1cc-dbd06b705694" (UID: "5ed594b5-d5fd-46c0-b1cc-dbd06b705694"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.775915 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.775976 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pwgt\" (UniqueName: \"kubernetes.io/projected/295aee2e-3c33-4ab5-a840-a92aa2fea90a-kube-api-access-7pwgt\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776107 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776129 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776151 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-logs\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776284 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq9cc\" (UniqueName: \"kubernetes.io/projected/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-kube-api-access-rq9cc\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776295 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776305 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.776315 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc 
kubenswrapper[4786]: I0313 16:42:42.776326 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed594b5-d5fd-46c0-b1cc-dbd06b705694-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877461 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877509 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pwgt\" (UniqueName: \"kubernetes.io/projected/295aee2e-3c33-4ab5-a840-a92aa2fea90a-kube-api-access-7pwgt\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877604 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877621 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-logs\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.877666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.878583 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-logs\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.878602 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.882990 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-internal-tls-certs\") pod \"glance-default-internal-api-0\" 
(UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.883062 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.883529 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.885318 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.893218 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pwgt\" (UniqueName: \"kubernetes.io/projected/295aee2e-3c33-4ab5-a840-a92aa2fea90a-kube-api-access-7pwgt\") pod \"glance-default-internal-api-0\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.942440 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.951319 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 
16:42:42.976980 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.985521 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.987648 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 16:42:42 crc kubenswrapper[4786]: I0313 16:42:42.992695 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.004783 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.004970 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.080668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.080737 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27v95\" (UniqueName: \"kubernetes.io/projected/a9df4880-8f92-48ed-b070-b4bb67f8743a-kube-api-access-27v95\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.080761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.080781 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-logs\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.080823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.082216 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.082307 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.183925 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.183977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27v95\" (UniqueName: \"kubernetes.io/projected/a9df4880-8f92-48ed-b070-b4bb67f8743a-kube-api-access-27v95\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.183998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.184022 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-logs\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.184068 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.184103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.184131 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.184937 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-logs\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.185215 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.189205 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.189378 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-public-tls-certs\") pod \"glance-default-external-api-0\" 
(UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.190125 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-config-data\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.193454 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-scripts\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.199268 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27v95\" (UniqueName: \"kubernetes.io/projected/a9df4880-8f92-48ed-b070-b4bb67f8743a-kube-api-access-27v95\") pod \"glance-default-external-api-0\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.316090 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.377041 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.625450 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"295aee2e-3c33-4ab5-a840-a92aa2fea90a","Type":"ContainerStarted","Data":"1f177cfe0684c2369568eca6dcef3e9cc375721de746e51ab8b8bcc38756cada"} Mar 13 16:42:43 crc kubenswrapper[4786]: I0313 16:42:43.837767 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:42:43 crc kubenswrapper[4786]: W0313 16:42:43.846462 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9df4880_8f92_48ed_b070_b4bb67f8743a.slice/crio-1c4ee6ea70e0ce40acceba7b3759439662cdf4050014d73b9cab3edc71a497c9 WatchSource:0}: Error finding container 1c4ee6ea70e0ce40acceba7b3759439662cdf4050014d73b9cab3edc71a497c9: Status 404 returned error can't find the container with id 1c4ee6ea70e0ce40acceba7b3759439662cdf4050014d73b9cab3edc71a497c9 Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.569971 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c78a3e-054e-45fd-8093-ca8b5233a0e9" path="/var/lib/kubelet/pods/08c78a3e-054e-45fd-8093-ca8b5233a0e9/volumes" Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.571297 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed594b5-d5fd-46c0-b1cc-dbd06b705694" path="/var/lib/kubelet/pods/5ed594b5-d5fd-46c0-b1cc-dbd06b705694/volumes" Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.645398 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a9df4880-8f92-48ed-b070-b4bb67f8743a","Type":"ContainerStarted","Data":"47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4"} Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.645461 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9df4880-8f92-48ed-b070-b4bb67f8743a","Type":"ContainerStarted","Data":"1c4ee6ea70e0ce40acceba7b3759439662cdf4050014d73b9cab3edc71a497c9"} Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.647536 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"295aee2e-3c33-4ab5-a840-a92aa2fea90a","Type":"ContainerStarted","Data":"f7a70fcbb6a61b06b3a6852bfd8d57d0a920f0c9db536cf4231324f60c45c7ec"} Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.647561 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"295aee2e-3c33-4ab5-a840-a92aa2fea90a","Type":"ContainerStarted","Data":"8c4fa795b62836e81bef5de48e5fd6b377d23dcdb4cce31d06972c7019b0768c"} Mar 13 16:42:44 crc kubenswrapper[4786]: I0313 16:42:44.673383 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.673360153 podStartE2EDuration="2.673360153s" podCreationTimestamp="2026-03-13 16:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:44.670900841 +0000 UTC m=+5994.834112682" watchObservedRunningTime="2026-03-13 16:42:44.673360153 +0000 UTC m=+5994.836571974" Mar 13 16:42:45 crc kubenswrapper[4786]: I0313 16:42:45.661744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9df4880-8f92-48ed-b070-b4bb67f8743a","Type":"ContainerStarted","Data":"adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3"} Mar 13 
16:42:45 crc kubenswrapper[4786]: I0313 16:42:45.692805 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.6927784470000002 podStartE2EDuration="3.692778447s" podCreationTimestamp="2026-03-13 16:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:42:45.685069703 +0000 UTC m=+5995.848281544" watchObservedRunningTime="2026-03-13 16:42:45.692778447 +0000 UTC m=+5995.855990268" Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.271094 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.353466 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bd7ddb875-t88qs"] Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.353762 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerName="dnsmasq-dns" containerID="cri-o://c152a135d84b04d752e27b57f8302098ae6505e6a0b268c565e5208190e92915" gracePeriod=10 Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.705906 4786 generic.go:334] "Generic (PLEG): container finished" podID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerID="c152a135d84b04d752e27b57f8302098ae6505e6a0b268c565e5208190e92915" exitCode=0 Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.706300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" event={"ID":"37633ce3-d166-4549-a66b-d4696d0cb76d","Type":"ContainerDied","Data":"c152a135d84b04d752e27b57f8302098ae6505e6a0b268c565e5208190e92915"} Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.829381 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.940603 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-nb\") pod \"37633ce3-d166-4549-a66b-d4696d0cb76d\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.940715 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-dns-svc\") pod \"37633ce3-d166-4549-a66b-d4696d0cb76d\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.940918 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2m7x\" (UniqueName: \"kubernetes.io/projected/37633ce3-d166-4549-a66b-d4696d0cb76d-kube-api-access-j2m7x\") pod \"37633ce3-d166-4549-a66b-d4696d0cb76d\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.941602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-config\") pod \"37633ce3-d166-4549-a66b-d4696d0cb76d\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.941659 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-sb\") pod \"37633ce3-d166-4549-a66b-d4696d0cb76d\" (UID: \"37633ce3-d166-4549-a66b-d4696d0cb76d\") " Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.946329 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/37633ce3-d166-4549-a66b-d4696d0cb76d-kube-api-access-j2m7x" (OuterVolumeSpecName: "kube-api-access-j2m7x") pod "37633ce3-d166-4549-a66b-d4696d0cb76d" (UID: "37633ce3-d166-4549-a66b-d4696d0cb76d"). InnerVolumeSpecName "kube-api-access-j2m7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:42:49 crc kubenswrapper[4786]: I0313 16:42:49.994620 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37633ce3-d166-4549-a66b-d4696d0cb76d" (UID: "37633ce3-d166-4549-a66b-d4696d0cb76d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.012058 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-config" (OuterVolumeSpecName: "config") pod "37633ce3-d166-4549-a66b-d4696d0cb76d" (UID: "37633ce3-d166-4549-a66b-d4696d0cb76d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.015167 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37633ce3-d166-4549-a66b-d4696d0cb76d" (UID: "37633ce3-d166-4549-a66b-d4696d0cb76d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.016176 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37633ce3-d166-4549-a66b-d4696d0cb76d" (UID: "37633ce3-d166-4549-a66b-d4696d0cb76d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.043939 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.043970 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.043984 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2m7x\" (UniqueName: \"kubernetes.io/projected/37633ce3-d166-4549-a66b-d4696d0cb76d-kube-api-access-j2m7x\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.043995 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.044009 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37633ce3-d166-4549-a66b-d4696d0cb76d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.723730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" event={"ID":"37633ce3-d166-4549-a66b-d4696d0cb76d","Type":"ContainerDied","Data":"10a8796d1b5f107cd29ed0f0b1bd174b2ba0a19f80e1e0a2e2e8e8c1b5d8bbf4"} Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.723824 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bd7ddb875-t88qs" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.724405 4786 scope.go:117] "RemoveContainer" containerID="c152a135d84b04d752e27b57f8302098ae6505e6a0b268c565e5208190e92915" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.773603 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bd7ddb875-t88qs"] Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.783790 4786 scope.go:117] "RemoveContainer" containerID="202ae79cb82f581e59ce6d2465ec8ae010cdc3207faec5fa00b1692a7d2815c8" Mar 13 16:42:50 crc kubenswrapper[4786]: I0313 16:42:50.786037 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bd7ddb875-t88qs"] Mar 13 16:42:52 crc kubenswrapper[4786]: I0313 16:42:52.569254 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" path="/var/lib/kubelet/pods/37633ce3-d166-4549-a66b-d4696d0cb76d/volumes" Mar 13 16:42:52 crc kubenswrapper[4786]: I0313 16:42:52.977218 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:52 crc kubenswrapper[4786]: I0313 16:42:52.977287 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.020460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.034684 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.316968 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.317715 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.366163 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.373036 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.776366 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.776436 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.776459 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 16:42:53 crc kubenswrapper[4786]: I0313 16:42:53.776481 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 16:42:55 crc kubenswrapper[4786]: I0313 16:42:55.547055 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 16:42:55 crc kubenswrapper[4786]: I0313 16:42:55.566799 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 16:42:55 crc kubenswrapper[4786]: I0313 16:42:55.607533 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:55 crc kubenswrapper[4786]: I0313 16:42:55.858759 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 16:42:55 crc kubenswrapper[4786]: I0313 16:42:55.904450 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 16:42:57 crc kubenswrapper[4786]: E0313 16:42:57.126395 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice\": RecentStats: unable to find data in memory cache]" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.002943 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2chdt"] Mar 13 16:43:04 crc kubenswrapper[4786]: E0313 16:43:04.003729 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerName="dnsmasq-dns" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.003748 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerName="dnsmasq-dns" Mar 13 16:43:04 crc kubenswrapper[4786]: E0313 16:43:04.003796 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerName="init" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.003808 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerName="init" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.008553 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="37633ce3-d166-4549-a66b-d4696d0cb76d" containerName="dnsmasq-dns" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.009285 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.029731 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwcbq\" (UniqueName: \"kubernetes.io/projected/8222e878-a64d-4afb-910c-7772b7226a4a-kube-api-access-pwcbq\") pod \"placement-db-create-2chdt\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.030027 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8222e878-a64d-4afb-910c-7772b7226a4a-operator-scripts\") pod \"placement-db-create-2chdt\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.038904 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2chdt"] Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.117450 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7e2d-account-create-update-sxwss"] Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.118548 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.120780 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.131897 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwcbq\" (UniqueName: \"kubernetes.io/projected/8222e878-a64d-4afb-910c-7772b7226a4a-kube-api-access-pwcbq\") pod \"placement-db-create-2chdt\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.131986 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8222e878-a64d-4afb-910c-7772b7226a4a-operator-scripts\") pod \"placement-db-create-2chdt\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.132950 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8222e878-a64d-4afb-910c-7772b7226a4a-operator-scripts\") pod \"placement-db-create-2chdt\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.136593 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7e2d-account-create-update-sxwss"] Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.154494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwcbq\" (UniqueName: \"kubernetes.io/projected/8222e878-a64d-4afb-910c-7772b7226a4a-kube-api-access-pwcbq\") pod \"placement-db-create-2chdt\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc 
kubenswrapper[4786]: I0313 16:43:04.233915 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32ac72a8-e2ed-4402-b828-e930f5a90177-operator-scripts\") pod \"placement-7e2d-account-create-update-sxwss\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.234341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wft7j\" (UniqueName: \"kubernetes.io/projected/32ac72a8-e2ed-4402-b828-e930f5a90177-kube-api-access-wft7j\") pod \"placement-7e2d-account-create-update-sxwss\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.335984 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2chdt" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.336976 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wft7j\" (UniqueName: \"kubernetes.io/projected/32ac72a8-e2ed-4402-b828-e930f5a90177-kube-api-access-wft7j\") pod \"placement-7e2d-account-create-update-sxwss\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.337323 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32ac72a8-e2ed-4402-b828-e930f5a90177-operator-scripts\") pod \"placement-7e2d-account-create-update-sxwss\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.338525 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32ac72a8-e2ed-4402-b828-e930f5a90177-operator-scripts\") pod \"placement-7e2d-account-create-update-sxwss\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.359774 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wft7j\" (UniqueName: \"kubernetes.io/projected/32ac72a8-e2ed-4402-b828-e930f5a90177-kube-api-access-wft7j\") pod \"placement-7e2d-account-create-update-sxwss\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.445084 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.834981 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2chdt"] Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.918697 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7e2d-account-create-update-sxwss"] Mar 13 16:43:04 crc kubenswrapper[4786]: W0313 16:43:04.933671 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ac72a8_e2ed_4402_b828_e930f5a90177.slice/crio-cd6be1f116a35d4fe00761ff1b4162d77eff66e8cbb88c2ec117e81c2a31392d WatchSource:0}: Error finding container cd6be1f116a35d4fe00761ff1b4162d77eff66e8cbb88c2ec117e81c2a31392d: Status 404 returned error can't find the container with id cd6be1f116a35d4fe00761ff1b4162d77eff66e8cbb88c2ec117e81c2a31392d Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.944333 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2chdt" 
event={"ID":"8222e878-a64d-4afb-910c-7772b7226a4a","Type":"ContainerStarted","Data":"fbda9af9a32cdaead63ebe96a54c00b1b5b2629948779e2c49d6ed55d4e6d322"} Mar 13 16:43:04 crc kubenswrapper[4786]: I0313 16:43:04.945541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7e2d-account-create-update-sxwss" event={"ID":"32ac72a8-e2ed-4402-b828-e930f5a90177","Type":"ContainerStarted","Data":"cd6be1f116a35d4fe00761ff1b4162d77eff66e8cbb88c2ec117e81c2a31392d"} Mar 13 16:43:05 crc kubenswrapper[4786]: I0313 16:43:05.967530 4786 generic.go:334] "Generic (PLEG): container finished" podID="8222e878-a64d-4afb-910c-7772b7226a4a" containerID="f970d382746d79cc405a4fe8036dfc3ac12f1d9ffb9e5f79325ac5ed4256ccd3" exitCode=0 Mar 13 16:43:05 crc kubenswrapper[4786]: I0313 16:43:05.967619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2chdt" event={"ID":"8222e878-a64d-4afb-910c-7772b7226a4a","Type":"ContainerDied","Data":"f970d382746d79cc405a4fe8036dfc3ac12f1d9ffb9e5f79325ac5ed4256ccd3"} Mar 13 16:43:05 crc kubenswrapper[4786]: I0313 16:43:05.974165 4786 generic.go:334] "Generic (PLEG): container finished" podID="32ac72a8-e2ed-4402-b828-e930f5a90177" containerID="9a787eba2620d73508e6833afa61d022be18ec4a4216112f7254e8381d174ead" exitCode=0 Mar 13 16:43:05 crc kubenswrapper[4786]: I0313 16:43:05.974225 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7e2d-account-create-update-sxwss" event={"ID":"32ac72a8-e2ed-4402-b828-e930f5a90177","Type":"ContainerDied","Data":"9a787eba2620d73508e6833afa61d022be18ec4a4216112f7254e8381d174ead"} Mar 13 16:43:07 crc kubenswrapper[4786]: E0313 16:43:07.365000 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice\": RecentStats: unable to find data in memory cache]" Mar 13 16:43:07 crc 
kubenswrapper[4786]: I0313 16:43:07.476988 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2chdt" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.483924 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.502887 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8222e878-a64d-4afb-910c-7772b7226a4a-operator-scripts\") pod \"8222e878-a64d-4afb-910c-7772b7226a4a\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.502933 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwcbq\" (UniqueName: \"kubernetes.io/projected/8222e878-a64d-4afb-910c-7772b7226a4a-kube-api-access-pwcbq\") pod \"8222e878-a64d-4afb-910c-7772b7226a4a\" (UID: \"8222e878-a64d-4afb-910c-7772b7226a4a\") " Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.503020 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wft7j\" (UniqueName: \"kubernetes.io/projected/32ac72a8-e2ed-4402-b828-e930f5a90177-kube-api-access-wft7j\") pod \"32ac72a8-e2ed-4402-b828-e930f5a90177\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.503108 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32ac72a8-e2ed-4402-b828-e930f5a90177-operator-scripts\") pod \"32ac72a8-e2ed-4402-b828-e930f5a90177\" (UID: \"32ac72a8-e2ed-4402-b828-e930f5a90177\") " Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.504005 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/32ac72a8-e2ed-4402-b828-e930f5a90177-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32ac72a8-e2ed-4402-b828-e930f5a90177" (UID: "32ac72a8-e2ed-4402-b828-e930f5a90177"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.504114 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8222e878-a64d-4afb-910c-7772b7226a4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8222e878-a64d-4afb-910c-7772b7226a4a" (UID: "8222e878-a64d-4afb-910c-7772b7226a4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.511153 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ac72a8-e2ed-4402-b828-e930f5a90177-kube-api-access-wft7j" (OuterVolumeSpecName: "kube-api-access-wft7j") pod "32ac72a8-e2ed-4402-b828-e930f5a90177" (UID: "32ac72a8-e2ed-4402-b828-e930f5a90177"). InnerVolumeSpecName "kube-api-access-wft7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.511292 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8222e878-a64d-4afb-910c-7772b7226a4a-kube-api-access-pwcbq" (OuterVolumeSpecName: "kube-api-access-pwcbq") pod "8222e878-a64d-4afb-910c-7772b7226a4a" (UID: "8222e878-a64d-4afb-910c-7772b7226a4a"). InnerVolumeSpecName "kube-api-access-pwcbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.605064 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wft7j\" (UniqueName: \"kubernetes.io/projected/32ac72a8-e2ed-4402-b828-e930f5a90177-kube-api-access-wft7j\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.605098 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32ac72a8-e2ed-4402-b828-e930f5a90177-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.605111 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8222e878-a64d-4afb-910c-7772b7226a4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.605124 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwcbq\" (UniqueName: \"kubernetes.io/projected/8222e878-a64d-4afb-910c-7772b7226a4a-kube-api-access-pwcbq\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.869376 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.869813 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:43:07 crc kubenswrapper[4786]: I0313 16:43:07.998853 4786 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2chdt" Mar 13 16:43:08 crc kubenswrapper[4786]: I0313 16:43:08.000939 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2chdt" event={"ID":"8222e878-a64d-4afb-910c-7772b7226a4a","Type":"ContainerDied","Data":"fbda9af9a32cdaead63ebe96a54c00b1b5b2629948779e2c49d6ed55d4e6d322"} Mar 13 16:43:08 crc kubenswrapper[4786]: I0313 16:43:08.001007 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbda9af9a32cdaead63ebe96a54c00b1b5b2629948779e2c49d6ed55d4e6d322" Mar 13 16:43:08 crc kubenswrapper[4786]: I0313 16:43:08.003330 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7e2d-account-create-update-sxwss" event={"ID":"32ac72a8-e2ed-4402-b828-e930f5a90177","Type":"ContainerDied","Data":"cd6be1f116a35d4fe00761ff1b4162d77eff66e8cbb88c2ec117e81c2a31392d"} Mar 13 16:43:08 crc kubenswrapper[4786]: I0313 16:43:08.003370 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd6be1f116a35d4fe00761ff1b4162d77eff66e8cbb88c2ec117e81c2a31392d" Mar 13 16:43:08 crc kubenswrapper[4786]: I0313 16:43:08.003457 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7e2d-account-create-update-sxwss" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.374606 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gv2vc"] Mar 13 16:43:09 crc kubenswrapper[4786]: E0313 16:43:09.375021 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8222e878-a64d-4afb-910c-7772b7226a4a" containerName="mariadb-database-create" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.375037 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8222e878-a64d-4afb-910c-7772b7226a4a" containerName="mariadb-database-create" Mar 13 16:43:09 crc kubenswrapper[4786]: E0313 16:43:09.375048 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ac72a8-e2ed-4402-b828-e930f5a90177" containerName="mariadb-account-create-update" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.375056 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ac72a8-e2ed-4402-b828-e930f5a90177" containerName="mariadb-account-create-update" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.375240 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8222e878-a64d-4afb-910c-7772b7226a4a" containerName="mariadb-database-create" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.375263 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ac72a8-e2ed-4402-b828-e930f5a90177" containerName="mariadb-account-create-update" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.375935 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.381633 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.381739 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.381838 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-knkdr" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.393330 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gv2vc"] Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.430778 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-597c7dfdb7-2vd4x"] Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.432582 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.441070 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-597c7dfdb7-2vd4x"] Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.457562 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzntn\" (UniqueName: \"kubernetes.io/projected/42796ed0-ed45-4786-b1e9-544ac1a13d7d-kube-api-access-fzntn\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.457618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-config-data\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.457649 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42796ed0-ed45-4786-b1e9-544ac1a13d7d-logs\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.457790 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-combined-ca-bundle\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.457983 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-scripts\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559574 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzntn\" (UniqueName: \"kubernetes.io/projected/42796ed0-ed45-4786-b1e9-544ac1a13d7d-kube-api-access-fzntn\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559632 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pdgp\" (UniqueName: \"kubernetes.io/projected/2502a5af-0301-419d-8d0d-0847b30ec60d-kube-api-access-5pdgp\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559656 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-config\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559681 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-config-data\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559713 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-sb\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42796ed0-ed45-4786-b1e9-544ac1a13d7d-logs\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-combined-ca-bundle\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559811 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-scripts\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-nb\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.559904 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-dns-svc\") pod 
\"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.560577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42796ed0-ed45-4786-b1e9-544ac1a13d7d-logs\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.566504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-scripts\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.566764 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-config-data\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.578191 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzntn\" (UniqueName: \"kubernetes.io/projected/42796ed0-ed45-4786-b1e9-544ac1a13d7d-kube-api-access-fzntn\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.582562 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-combined-ca-bundle\") pod \"placement-db-sync-gv2vc\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: 
I0313 16:43:09.661782 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pdgp\" (UniqueName: \"kubernetes.io/projected/2502a5af-0301-419d-8d0d-0847b30ec60d-kube-api-access-5pdgp\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.661841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-config\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.661946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-sb\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.662041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-nb\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.662098 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-dns-svc\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.662893 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-config\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.663064 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-dns-svc\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.663103 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-sb\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.663482 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-nb\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.686221 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pdgp\" (UniqueName: \"kubernetes.io/projected/2502a5af-0301-419d-8d0d-0847b30ec60d-kube-api-access-5pdgp\") pod \"dnsmasq-dns-597c7dfdb7-2vd4x\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.700220 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:09 crc kubenswrapper[4786]: I0313 16:43:09.760207 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:10 crc kubenswrapper[4786]: I0313 16:43:10.266095 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gv2vc"] Mar 13 16:43:10 crc kubenswrapper[4786]: I0313 16:43:10.294359 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-597c7dfdb7-2vd4x"] Mar 13 16:43:10 crc kubenswrapper[4786]: W0313 16:43:10.295244 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2502a5af_0301_419d_8d0d_0847b30ec60d.slice/crio-fc8bed9422ea239f2d30ac2c8f5c16ebfdec9ca13fb7316b086bbce2a640067b WatchSource:0}: Error finding container fc8bed9422ea239f2d30ac2c8f5c16ebfdec9ca13fb7316b086bbce2a640067b: Status 404 returned error can't find the container with id fc8bed9422ea239f2d30ac2c8f5c16ebfdec9ca13fb7316b086bbce2a640067b Mar 13 16:43:11 crc kubenswrapper[4786]: I0313 16:43:11.056121 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gv2vc" event={"ID":"42796ed0-ed45-4786-b1e9-544ac1a13d7d","Type":"ContainerStarted","Data":"022abf23a4cbcb59bf29df003e8cfe42028f0066832674e4ac8300b89c2c2d4a"} Mar 13 16:43:11 crc kubenswrapper[4786]: I0313 16:43:11.056426 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gv2vc" event={"ID":"42796ed0-ed45-4786-b1e9-544ac1a13d7d","Type":"ContainerStarted","Data":"2419657bc4c8a6728594db6ee20fc0729842feab465eef296b8d95dc0c9d4580"} Mar 13 16:43:11 crc kubenswrapper[4786]: I0313 16:43:11.064584 4786 generic.go:334] "Generic (PLEG): container finished" podID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerID="1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65" exitCode=0 Mar 13 
16:43:11 crc kubenswrapper[4786]: I0313 16:43:11.064626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" event={"ID":"2502a5af-0301-419d-8d0d-0847b30ec60d","Type":"ContainerDied","Data":"1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65"} Mar 13 16:43:11 crc kubenswrapper[4786]: I0313 16:43:11.064653 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" event={"ID":"2502a5af-0301-419d-8d0d-0847b30ec60d","Type":"ContainerStarted","Data":"fc8bed9422ea239f2d30ac2c8f5c16ebfdec9ca13fb7316b086bbce2a640067b"} Mar 13 16:43:11 crc kubenswrapper[4786]: I0313 16:43:11.100063 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gv2vc" podStartSLOduration=2.100043441 podStartE2EDuration="2.100043441s" podCreationTimestamp="2026-03-13 16:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:43:11.089408183 +0000 UTC m=+6021.252619994" watchObservedRunningTime="2026-03-13 16:43:11.100043441 +0000 UTC m=+6021.263255252" Mar 13 16:43:12 crc kubenswrapper[4786]: I0313 16:43:12.077471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" event={"ID":"2502a5af-0301-419d-8d0d-0847b30ec60d","Type":"ContainerStarted","Data":"0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b"} Mar 13 16:43:12 crc kubenswrapper[4786]: I0313 16:43:12.078171 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:12 crc kubenswrapper[4786]: I0313 16:43:12.080517 4786 generic.go:334] "Generic (PLEG): container finished" podID="42796ed0-ed45-4786-b1e9-544ac1a13d7d" containerID="022abf23a4cbcb59bf29df003e8cfe42028f0066832674e4ac8300b89c2c2d4a" exitCode=0 Mar 13 16:43:12 crc kubenswrapper[4786]: I0313 16:43:12.080560 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gv2vc" event={"ID":"42796ed0-ed45-4786-b1e9-544ac1a13d7d","Type":"ContainerDied","Data":"022abf23a4cbcb59bf29df003e8cfe42028f0066832674e4ac8300b89c2c2d4a"} Mar 13 16:43:12 crc kubenswrapper[4786]: I0313 16:43:12.110805 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" podStartSLOduration=3.110785794 podStartE2EDuration="3.110785794s" podCreationTimestamp="2026-03-13 16:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:43:12.104340382 +0000 UTC m=+6022.267552193" watchObservedRunningTime="2026-03-13 16:43:12.110785794 +0000 UTC m=+6022.273997615" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.477267 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.647920 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42796ed0-ed45-4786-b1e9-544ac1a13d7d-logs\") pod \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.647976 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzntn\" (UniqueName: \"kubernetes.io/projected/42796ed0-ed45-4786-b1e9-544ac1a13d7d-kube-api-access-fzntn\") pod \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.648040 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-combined-ca-bundle\") pod 
\"42796ed0-ed45-4786-b1e9-544ac1a13d7d\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.648115 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-config-data\") pod \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.648129 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-scripts\") pod \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\" (UID: \"42796ed0-ed45-4786-b1e9-544ac1a13d7d\") " Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.649210 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42796ed0-ed45-4786-b1e9-544ac1a13d7d-logs" (OuterVolumeSpecName: "logs") pod "42796ed0-ed45-4786-b1e9-544ac1a13d7d" (UID: "42796ed0-ed45-4786-b1e9-544ac1a13d7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.656312 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-scripts" (OuterVolumeSpecName: "scripts") pod "42796ed0-ed45-4786-b1e9-544ac1a13d7d" (UID: "42796ed0-ed45-4786-b1e9-544ac1a13d7d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.664725 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58f4f77c48-brshm"] Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.665109 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42796ed0-ed45-4786-b1e9-544ac1a13d7d-kube-api-access-fzntn" (OuterVolumeSpecName: "kube-api-access-fzntn") pod "42796ed0-ed45-4786-b1e9-544ac1a13d7d" (UID: "42796ed0-ed45-4786-b1e9-544ac1a13d7d"). InnerVolumeSpecName "kube-api-access-fzntn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:43:13 crc kubenswrapper[4786]: E0313 16:43:13.665345 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42796ed0-ed45-4786-b1e9-544ac1a13d7d" containerName="placement-db-sync" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.665368 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="42796ed0-ed45-4786-b1e9-544ac1a13d7d" containerName="placement-db-sync" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.665585 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="42796ed0-ed45-4786-b1e9-544ac1a13d7d" containerName="placement-db-sync" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.667345 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.672676 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.672848 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.678909 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58f4f77c48-brshm"] Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.685010 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42796ed0-ed45-4786-b1e9-544ac1a13d7d" (UID: "42796ed0-ed45-4786-b1e9-544ac1a13d7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.694086 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-config-data" (OuterVolumeSpecName: "config-data") pod "42796ed0-ed45-4786-b1e9-544ac1a13d7d" (UID: "42796ed0-ed45-4786-b1e9-544ac1a13d7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d89k\" (UniqueName: \"kubernetes.io/projected/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-kube-api-access-2d89k\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750405 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-combined-ca-bundle\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750486 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-public-tls-certs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750513 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-scripts\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750564 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-logs\") pod \"placement-58f4f77c48-brshm\" (UID: 
\"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750664 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-internal-tls-certs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750691 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-config-data\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750900 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzntn\" (UniqueName: \"kubernetes.io/projected/42796ed0-ed45-4786-b1e9-544ac1a13d7d-kube-api-access-fzntn\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750921 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750936 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.750946 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42796ed0-ed45-4786-b1e9-544ac1a13d7d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:13 crc kubenswrapper[4786]: 
I0313 16:43:13.750957 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42796ed0-ed45-4786-b1e9-544ac1a13d7d-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.852187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d89k\" (UniqueName: \"kubernetes.io/projected/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-kube-api-access-2d89k\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.852246 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-combined-ca-bundle\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.852321 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-public-tls-certs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.852338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-scripts\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.852959 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-logs\") pod 
\"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.852999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-internal-tls-certs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.853017 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-config-data\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.853493 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-logs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.855610 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-public-tls-certs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.856244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-combined-ca-bundle\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " 
pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.856672 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-scripts\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.857494 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-internal-tls-certs\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.858789 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-config-data\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:13 crc kubenswrapper[4786]: I0313 16:43:13.870440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d89k\" (UniqueName: \"kubernetes.io/projected/8538ba8c-4af3-4757-8a9a-ebcf54a6c253-kube-api-access-2d89k\") pod \"placement-58f4f77c48-brshm\" (UID: \"8538ba8c-4af3-4757-8a9a-ebcf54a6c253\") " pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:14 crc kubenswrapper[4786]: I0313 16:43:14.044324 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:14 crc kubenswrapper[4786]: I0313 16:43:14.105802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gv2vc" event={"ID":"42796ed0-ed45-4786-b1e9-544ac1a13d7d","Type":"ContainerDied","Data":"2419657bc4c8a6728594db6ee20fc0729842feab465eef296b8d95dc0c9d4580"} Mar 13 16:43:14 crc kubenswrapper[4786]: I0313 16:43:14.105883 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2419657bc4c8a6728594db6ee20fc0729842feab465eef296b8d95dc0c9d4580" Mar 13 16:43:14 crc kubenswrapper[4786]: I0313 16:43:14.105957 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gv2vc" Mar 13 16:43:14 crc kubenswrapper[4786]: I0313 16:43:14.526515 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58f4f77c48-brshm"] Mar 13 16:43:15 crc kubenswrapper[4786]: I0313 16:43:15.116361 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58f4f77c48-brshm" event={"ID":"8538ba8c-4af3-4757-8a9a-ebcf54a6c253","Type":"ContainerStarted","Data":"30dd91ac1b5e5517bac07f5329c3bb2c54f9f0db342d9fe8f4d499b7a139d7e1"} Mar 13 16:43:15 crc kubenswrapper[4786]: I0313 16:43:15.116842 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:15 crc kubenswrapper[4786]: I0313 16:43:15.116881 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58f4f77c48-brshm" event={"ID":"8538ba8c-4af3-4757-8a9a-ebcf54a6c253","Type":"ContainerStarted","Data":"ed2e04ac08d7e9fff36763266568591834109cf9381e947437a71b62e934baaa"} Mar 13 16:43:15 crc kubenswrapper[4786]: I0313 16:43:15.116901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58f4f77c48-brshm" 
event={"ID":"8538ba8c-4af3-4757-8a9a-ebcf54a6c253","Type":"ContainerStarted","Data":"6cd024da002de7420d3d354bd6ce131a8752e18359fada03e46a15204ed83eb8"} Mar 13 16:43:15 crc kubenswrapper[4786]: I0313 16:43:15.142006 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58f4f77c48-brshm" podStartSLOduration=2.141985748 podStartE2EDuration="2.141985748s" podCreationTimestamp="2026-03-13 16:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:43:15.136434508 +0000 UTC m=+6025.299646329" watchObservedRunningTime="2026-03-13 16:43:15.141985748 +0000 UTC m=+6025.305197559" Mar 13 16:43:16 crc kubenswrapper[4786]: I0313 16:43:16.124918 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:17 crc kubenswrapper[4786]: E0313 16:43:17.626107 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice\": RecentStats: unable to find data in memory cache]" Mar 13 16:43:19 crc kubenswrapper[4786]: I0313 16:43:19.762119 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:43:19 crc kubenswrapper[4786]: I0313 16:43:19.837198 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77444f4759-c7kpj"] Mar 13 16:43:19 crc kubenswrapper[4786]: I0313 16:43:19.837639 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerName="dnsmasq-dns" containerID="cri-o://180e87d8c98c309fcc6202a062d93dbcd89f2ff2aba573890234f0b5905f3a3c" gracePeriod=10 Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 
16:43:20.165455 4786 generic.go:334] "Generic (PLEG): container finished" podID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerID="180e87d8c98c309fcc6202a062d93dbcd89f2ff2aba573890234f0b5905f3a3c" exitCode=0 Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.165501 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" event={"ID":"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6","Type":"ContainerDied","Data":"180e87d8c98c309fcc6202a062d93dbcd89f2ff2aba573890234f0b5905f3a3c"} Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.392365 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.474228 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhq9k\" (UniqueName: \"kubernetes.io/projected/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-kube-api-access-vhq9k\") pod \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.474348 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-config\") pod \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.474488 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-sb\") pod \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.474536 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-nb\") pod \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.474556 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-dns-svc\") pod \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\" (UID: \"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6\") " Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.492199 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-kube-api-access-vhq9k" (OuterVolumeSpecName: "kube-api-access-vhq9k") pod "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" (UID: "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6"). InnerVolumeSpecName "kube-api-access-vhq9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.520198 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" (UID: "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.522673 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" (UID: "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.524365 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" (UID: "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.556315 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-config" (OuterVolumeSpecName: "config") pod "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" (UID: "fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.575981 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.576022 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.576034 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.576044 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhq9k\" (UniqueName: \"kubernetes.io/projected/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-kube-api-access-vhq9k\") on node \"crc\" DevicePath \"\"" Mar 13 
16:43:20 crc kubenswrapper[4786]: I0313 16:43:20.576054 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:43:21 crc kubenswrapper[4786]: I0313 16:43:21.194686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" event={"ID":"fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6","Type":"ContainerDied","Data":"7c20f1fff60404cbf21c675a51b21d8e55c662afcecfb0792808c048d1d36212"} Mar 13 16:43:21 crc kubenswrapper[4786]: I0313 16:43:21.194746 4786 scope.go:117] "RemoveContainer" containerID="180e87d8c98c309fcc6202a062d93dbcd89f2ff2aba573890234f0b5905f3a3c" Mar 13 16:43:21 crc kubenswrapper[4786]: I0313 16:43:21.194745 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77444f4759-c7kpj" Mar 13 16:43:21 crc kubenswrapper[4786]: I0313 16:43:21.216713 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77444f4759-c7kpj"] Mar 13 16:43:21 crc kubenswrapper[4786]: I0313 16:43:21.222242 4786 scope.go:117] "RemoveContainer" containerID="4c5e30097bc2ea6d86d373d53039e366883f7792bf86dbe3e36c7eb59bea3d76" Mar 13 16:43:21 crc kubenswrapper[4786]: I0313 16:43:21.227475 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77444f4759-c7kpj"] Mar 13 16:43:22 crc kubenswrapper[4786]: I0313 16:43:22.568706 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" path="/var/lib/kubelet/pods/fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6/volumes" Mar 13 16:43:27 crc kubenswrapper[4786]: E0313 16:43:27.897134 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice\": RecentStats: unable to find 
data in memory cache]" Mar 13 16:43:37 crc kubenswrapper[4786]: I0313 16:43:37.868618 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:43:37 crc kubenswrapper[4786]: I0313 16:43:37.869464 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:43:38 crc kubenswrapper[4786]: E0313 16:43:38.091561 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice\": RecentStats: unable to find data in memory cache]" Mar 13 16:43:45 crc kubenswrapper[4786]: I0313 16:43:45.087623 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:45 crc kubenswrapper[4786]: I0313 16:43:45.091482 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58f4f77c48-brshm" Mar 13 16:43:48 crc kubenswrapper[4786]: E0313 16:43:48.275020 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37633ce3_d166_4549_a66b_d4696d0cb76d.slice\": RecentStats: unable to find data in memory cache]" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.160412 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557004-7wrf2"] Mar 13 16:44:00 crc 
kubenswrapper[4786]: E0313 16:44:00.161593 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerName="dnsmasq-dns" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.161623 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerName="dnsmasq-dns" Mar 13 16:44:00 crc kubenswrapper[4786]: E0313 16:44:00.161641 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerName="init" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.161652 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerName="init" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.162070 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2b1fe2-d5a2-4e13-b192-f7ea9f658bc6" containerName="dnsmasq-dns" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.163036 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.166015 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.166592 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.171432 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.177067 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557004-7wrf2"] Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.294027 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n87nk\" (UniqueName: \"kubernetes.io/projected/6dd47d33-4d46-482b-b900-84f35aba9716-kube-api-access-n87nk\") pod \"auto-csr-approver-29557004-7wrf2\" (UID: \"6dd47d33-4d46-482b-b900-84f35aba9716\") " pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.395905 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n87nk\" (UniqueName: \"kubernetes.io/projected/6dd47d33-4d46-482b-b900-84f35aba9716-kube-api-access-n87nk\") pod \"auto-csr-approver-29557004-7wrf2\" (UID: \"6dd47d33-4d46-482b-b900-84f35aba9716\") " pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.429266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n87nk\" (UniqueName: \"kubernetes.io/projected/6dd47d33-4d46-482b-b900-84f35aba9716-kube-api-access-n87nk\") pod \"auto-csr-approver-29557004-7wrf2\" (UID: \"6dd47d33-4d46-482b-b900-84f35aba9716\") " 
pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.484243 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:00 crc kubenswrapper[4786]: I0313 16:44:00.957888 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557004-7wrf2"] Mar 13 16:44:01 crc kubenswrapper[4786]: I0313 16:44:01.631537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" event={"ID":"6dd47d33-4d46-482b-b900-84f35aba9716","Type":"ContainerStarted","Data":"4e8d2b83e9264396efc8fcb5f70d09aa57d5c0d9b63e9bfdedf1d8f06a78d30e"} Mar 13 16:44:02 crc kubenswrapper[4786]: I0313 16:44:02.643530 4786 generic.go:334] "Generic (PLEG): container finished" podID="6dd47d33-4d46-482b-b900-84f35aba9716" containerID="6d97179f85c12ce1d7e844a49536e50883e86d0c91f87272482b92aa3ee37afa" exitCode=0 Mar 13 16:44:02 crc kubenswrapper[4786]: I0313 16:44:02.643589 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" event={"ID":"6dd47d33-4d46-482b-b900-84f35aba9716","Type":"ContainerDied","Data":"6d97179f85c12ce1d7e844a49536e50883e86d0c91f87272482b92aa3ee37afa"} Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.030487 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.167538 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n87nk\" (UniqueName: \"kubernetes.io/projected/6dd47d33-4d46-482b-b900-84f35aba9716-kube-api-access-n87nk\") pod \"6dd47d33-4d46-482b-b900-84f35aba9716\" (UID: \"6dd47d33-4d46-482b-b900-84f35aba9716\") " Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.175572 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd47d33-4d46-482b-b900-84f35aba9716-kube-api-access-n87nk" (OuterVolumeSpecName: "kube-api-access-n87nk") pod "6dd47d33-4d46-482b-b900-84f35aba9716" (UID: "6dd47d33-4d46-482b-b900-84f35aba9716"). InnerVolumeSpecName "kube-api-access-n87nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.270394 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n87nk\" (UniqueName: \"kubernetes.io/projected/6dd47d33-4d46-482b-b900-84f35aba9716-kube-api-access-n87nk\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.676114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" event={"ID":"6dd47d33-4d46-482b-b900-84f35aba9716","Type":"ContainerDied","Data":"4e8d2b83e9264396efc8fcb5f70d09aa57d5c0d9b63e9bfdedf1d8f06a78d30e"} Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.676153 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e8d2b83e9264396efc8fcb5f70d09aa57d5c0d9b63e9bfdedf1d8f06a78d30e" Mar 13 16:44:04 crc kubenswrapper[4786]: I0313 16:44:04.676672 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557004-7wrf2" Mar 13 16:44:05 crc kubenswrapper[4786]: I0313 16:44:05.103950 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556998-r25pl"] Mar 13 16:44:05 crc kubenswrapper[4786]: I0313 16:44:05.112202 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556998-r25pl"] Mar 13 16:44:06 crc kubenswrapper[4786]: I0313 16:44:06.568945 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dae8f7-1cd2-4d03-873e-76a806f00915" path="/var/lib/kubelet/pods/75dae8f7-1cd2-4d03-873e-76a806f00915/volumes" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.188566 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-m99vx"] Mar 13 16:44:07 crc kubenswrapper[4786]: E0313 16:44:07.189188 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd47d33-4d46-482b-b900-84f35aba9716" containerName="oc" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.189217 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd47d33-4d46-482b-b900-84f35aba9716" containerName="oc" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.189511 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd47d33-4d46-482b-b900-84f35aba9716" containerName="oc" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.190528 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.201589 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m99vx"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.279030 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-s6ctj"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.280378 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.319207 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s6ctj"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.328495 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-operator-scripts\") pod \"nova-api-db-create-m99vx\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.328623 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s74g7\" (UniqueName: \"kubernetes.io/projected/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-kube-api-access-s74g7\") pod \"nova-api-db-create-m99vx\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.352220 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f116-account-create-update-f2fc9"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.353882 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.359548 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.382761 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f116-account-create-update-f2fc9"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.410819 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5kd2h"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.411933 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.420126 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5kd2h"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.430026 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/683c9de6-8e5a-4ea1-a17e-af0743fee223-operator-scripts\") pod \"nova-cell0-db-create-s6ctj\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.430071 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk89k\" (UniqueName: \"kubernetes.io/projected/683c9de6-8e5a-4ea1-a17e-af0743fee223-kube-api-access-pk89k\") pod \"nova-cell0-db-create-s6ctj\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.430123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-operator-scripts\") pod \"nova-api-db-create-m99vx\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.430156 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s74g7\" (UniqueName: \"kubernetes.io/projected/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-kube-api-access-s74g7\") pod \"nova-api-db-create-m99vx\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.435179 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-operator-scripts\") pod \"nova-api-db-create-m99vx\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.449241 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s74g7\" (UniqueName: \"kubernetes.io/projected/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-kube-api-access-s74g7\") pod \"nova-api-db-create-m99vx\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.490767 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fda7-account-create-update-fxslq"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.492074 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.494191 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.502030 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fda7-account-create-update-fxslq"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.514283 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.532336 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/683c9de6-8e5a-4ea1-a17e-af0743fee223-operator-scripts\") pod \"nova-cell0-db-create-s6ctj\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.532400 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk89k\" (UniqueName: \"kubernetes.io/projected/683c9de6-8e5a-4ea1-a17e-af0743fee223-kube-api-access-pk89k\") pod \"nova-cell0-db-create-s6ctj\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.532443 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-872sd\" (UniqueName: \"kubernetes.io/projected/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-kube-api-access-872sd\") pod \"nova-cell1-db-create-5kd2h\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.532480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-68kp2\" (UniqueName: \"kubernetes.io/projected/be4ba9a6-435c-4672-83a6-c46c2ce855e7-kube-api-access-68kp2\") pod \"nova-api-f116-account-create-update-f2fc9\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.532554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4ba9a6-435c-4672-83a6-c46c2ce855e7-operator-scripts\") pod \"nova-api-f116-account-create-update-f2fc9\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.532583 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-operator-scripts\") pod \"nova-cell1-db-create-5kd2h\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.533271 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/683c9de6-8e5a-4ea1-a17e-af0743fee223-operator-scripts\") pod \"nova-cell0-db-create-s6ctj\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.552777 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk89k\" (UniqueName: \"kubernetes.io/projected/683c9de6-8e5a-4ea1-a17e-af0743fee223-kube-api-access-pk89k\") pod \"nova-cell0-db-create-s6ctj\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.596017 4786 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.633872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-872sd\" (UniqueName: \"kubernetes.io/projected/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-kube-api-access-872sd\") pod \"nova-cell1-db-create-5kd2h\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.633921 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kp2\" (UniqueName: \"kubernetes.io/projected/be4ba9a6-435c-4672-83a6-c46c2ce855e7-kube-api-access-68kp2\") pod \"nova-api-f116-account-create-update-f2fc9\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.633983 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-operator-scripts\") pod \"nova-cell0-fda7-account-create-update-fxslq\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.634004 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4ba9a6-435c-4672-83a6-c46c2ce855e7-operator-scripts\") pod \"nova-api-f116-account-create-update-f2fc9\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.634025 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-operator-scripts\") pod \"nova-cell1-db-create-5kd2h\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.634093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnw97\" (UniqueName: \"kubernetes.io/projected/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-kube-api-access-rnw97\") pod \"nova-cell0-fda7-account-create-update-fxslq\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.635471 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4ba9a6-435c-4672-83a6-c46c2ce855e7-operator-scripts\") pod \"nova-api-f116-account-create-update-f2fc9\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.635947 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-operator-scripts\") pod \"nova-cell1-db-create-5kd2h\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.668299 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kp2\" (UniqueName: \"kubernetes.io/projected/be4ba9a6-435c-4672-83a6-c46c2ce855e7-kube-api-access-68kp2\") pod \"nova-api-f116-account-create-update-f2fc9\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.675384 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-872sd\" (UniqueName: \"kubernetes.io/projected/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-kube-api-access-872sd\") pod \"nova-cell1-db-create-5kd2h\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.685739 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.720636 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0f54-account-create-update-l8l9w"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.721875 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.724747 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.727754 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.735640 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnw97\" (UniqueName: \"kubernetes.io/projected/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-kube-api-access-rnw97\") pod \"nova-cell0-fda7-account-create-update-fxslq\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.735774 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-operator-scripts\") pod \"nova-cell0-fda7-account-create-update-fxslq\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.736486 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-operator-scripts\") pod \"nova-cell0-fda7-account-create-update-fxslq\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.761379 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0f54-account-create-update-l8l9w"] Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.782419 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnw97\" (UniqueName: \"kubernetes.io/projected/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-kube-api-access-rnw97\") pod \"nova-cell0-fda7-account-create-update-fxslq\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 
16:44:07.820943 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.837591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec79171-dff4-4672-bb9c-d5f71a44dd70-operator-scripts\") pod \"nova-cell1-0f54-account-create-update-l8l9w\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.837684 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrcj\" (UniqueName: \"kubernetes.io/projected/1ec79171-dff4-4672-bb9c-d5f71a44dd70-kube-api-access-4qrcj\") pod \"nova-cell1-0f54-account-create-update-l8l9w\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.868457 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.868503 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.868553 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.869188 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.869236 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" gracePeriod=600 Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.939205 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec79171-dff4-4672-bb9c-d5f71a44dd70-operator-scripts\") pod \"nova-cell1-0f54-account-create-update-l8l9w\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.939850 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec79171-dff4-4672-bb9c-d5f71a44dd70-operator-scripts\") pod \"nova-cell1-0f54-account-create-update-l8l9w\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.939997 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrcj\" (UniqueName: 
\"kubernetes.io/projected/1ec79171-dff4-4672-bb9c-d5f71a44dd70-kube-api-access-4qrcj\") pod \"nova-cell1-0f54-account-create-update-l8l9w\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:07 crc kubenswrapper[4786]: I0313 16:44:07.958791 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrcj\" (UniqueName: \"kubernetes.io/projected/1ec79171-dff4-4672-bb9c-d5f71a44dd70-kube-api-access-4qrcj\") pod \"nova-cell1-0f54-account-create-update-l8l9w\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:08 crc kubenswrapper[4786]: E0313 16:44:08.031371 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.084984 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m99vx"] Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.093582 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:08 crc kubenswrapper[4786]: W0313 16:44:08.094324 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dadc28e_92a8_4760_b6c1_d4bfc87535c2.slice/crio-29963d7c65c5c8e725620fc2c21284de7d9878eae40a43435d9e1e527062e5d8 WatchSource:0}: Error finding container 29963d7c65c5c8e725620fc2c21284de7d9878eae40a43435d9e1e527062e5d8: Status 404 returned error can't find the container with id 29963d7c65c5c8e725620fc2c21284de7d9878eae40a43435d9e1e527062e5d8 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.234022 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s6ctj"] Mar 13 16:44:08 crc kubenswrapper[4786]: W0313 16:44:08.243936 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod683c9de6_8e5a_4ea1_a17e_af0743fee223.slice/crio-89b92d07cf1317343036bef1f242c3e36d9f8110b5040fa011b3e098686a4c66 WatchSource:0}: Error finding container 89b92d07cf1317343036bef1f242c3e36d9f8110b5040fa011b3e098686a4c66: Status 404 returned error can't find the container with id 89b92d07cf1317343036bef1f242c3e36d9f8110b5040fa011b3e098686a4c66 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.322147 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f116-account-create-update-f2fc9"] Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.406559 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fda7-account-create-update-fxslq"] Mar 13 16:44:08 crc kubenswrapper[4786]: W0313 16:44:08.412524 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fbe78b0_93cc_4afe_839c_c0d3e26873c2.slice/crio-eee9691f2a9bd500eccdea3e0de6b907e1066ef9b31ae57a75301c21ac3ffe00 
WatchSource:0}: Error finding container eee9691f2a9bd500eccdea3e0de6b907e1066ef9b31ae57a75301c21ac3ffe00: Status 404 returned error can't find the container with id eee9691f2a9bd500eccdea3e0de6b907e1066ef9b31ae57a75301c21ac3ffe00 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.414588 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5kd2h"] Mar 13 16:44:08 crc kubenswrapper[4786]: W0313 16:44:08.414999 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ee8d0e_612f_4ef9_a135_d1bd89fc4136.slice/crio-62cb10afce20821ded1df9485b236a225ea8447215179cbcea69d2c96ad23edf WatchSource:0}: Error finding container 62cb10afce20821ded1df9485b236a225ea8447215179cbcea69d2c96ad23edf: Status 404 returned error can't find the container with id 62cb10afce20821ded1df9485b236a225ea8447215179cbcea69d2c96ad23edf Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.547475 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0f54-account-create-update-l8l9w"] Mar 13 16:44:08 crc kubenswrapper[4786]: E0313 16:44:08.726409 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod683c9de6_8e5a_4ea1_a17e_af0743fee223.slice/crio-6002c2e0f1c7471fe1e760bc188a4df7fb96e4fee58e36ca8f36bf268890e4d6.scope\": RecentStats: unable to find data in memory cache]" Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.775539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5kd2h" event={"ID":"81ee8d0e-612f-4ef9-a135-d1bd89fc4136","Type":"ContainerStarted","Data":"62cb10afce20821ded1df9485b236a225ea8447215179cbcea69d2c96ad23edf"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.777165 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" event={"ID":"1ec79171-dff4-4672-bb9c-d5f71a44dd70","Type":"ContainerStarted","Data":"52c8fcab11b787cce771fa6973524e1989d4846ca960eb1745cd5adeccb2e1a4"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.781053 4786 generic.go:334] "Generic (PLEG): container finished" podID="683c9de6-8e5a-4ea1-a17e-af0743fee223" containerID="6002c2e0f1c7471fe1e760bc188a4df7fb96e4fee58e36ca8f36bf268890e4d6" exitCode=0 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.781168 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s6ctj" event={"ID":"683c9de6-8e5a-4ea1-a17e-af0743fee223","Type":"ContainerDied","Data":"6002c2e0f1c7471fe1e760bc188a4df7fb96e4fee58e36ca8f36bf268890e4d6"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.781202 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s6ctj" event={"ID":"683c9de6-8e5a-4ea1-a17e-af0743fee223","Type":"ContainerStarted","Data":"89b92d07cf1317343036bef1f242c3e36d9f8110b5040fa011b3e098686a4c66"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.783072 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" event={"ID":"0fbe78b0-93cc-4afe-839c-c0d3e26873c2","Type":"ContainerStarted","Data":"eee9691f2a9bd500eccdea3e0de6b907e1066ef9b31ae57a75301c21ac3ffe00"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.788108 4786 generic.go:334] "Generic (PLEG): container finished" podID="be4ba9a6-435c-4672-83a6-c46c2ce855e7" containerID="99240a98222f5ac13b06a15c98ab00d1b2e4f7de0df1174fc7cab2346947576b" exitCode=0 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.788134 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f116-account-create-update-f2fc9" 
event={"ID":"be4ba9a6-435c-4672-83a6-c46c2ce855e7","Type":"ContainerDied","Data":"99240a98222f5ac13b06a15c98ab00d1b2e4f7de0df1174fc7cab2346947576b"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.788168 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f116-account-create-update-f2fc9" event={"ID":"be4ba9a6-435c-4672-83a6-c46c2ce855e7","Type":"ContainerStarted","Data":"17830f152b559b6dc4482fe8fb1b8ef3787faa9294ed2bd1a68bbade07910325"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.790902 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" exitCode=0 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.790950 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.791021 4786 scope.go:117] "RemoveContainer" containerID="06f63c372c8088421f902d282e4d63c4055021fcedd22c0ec607e784bab36af6" Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.791787 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:44:08 crc kubenswrapper[4786]: E0313 16:44:08.792174 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.792927 4786 
generic.go:334] "Generic (PLEG): container finished" podID="6dadc28e-92a8-4760-b6c1-d4bfc87535c2" containerID="f0cd0a5970b00748bfcdbcd54f7ae854e808959da022f38a572bffb5c54e4220" exitCode=0 Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.792969 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99vx" event={"ID":"6dadc28e-92a8-4760-b6c1-d4bfc87535c2","Type":"ContainerDied","Data":"f0cd0a5970b00748bfcdbcd54f7ae854e808959da022f38a572bffb5c54e4220"} Mar 13 16:44:08 crc kubenswrapper[4786]: I0313 16:44:08.792991 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99vx" event={"ID":"6dadc28e-92a8-4760-b6c1-d4bfc87535c2","Type":"ContainerStarted","Data":"29963d7c65c5c8e725620fc2c21284de7d9878eae40a43435d9e1e527062e5d8"} Mar 13 16:44:09 crc kubenswrapper[4786]: I0313 16:44:09.810991 4786 generic.go:334] "Generic (PLEG): container finished" podID="81ee8d0e-612f-4ef9-a135-d1bd89fc4136" containerID="a3eceef4608e58d05af49d15b9b49d79b87fcbd574226b60c89beea3a3e03fd2" exitCode=0 Mar 13 16:44:09 crc kubenswrapper[4786]: I0313 16:44:09.811238 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5kd2h" event={"ID":"81ee8d0e-612f-4ef9-a135-d1bd89fc4136","Type":"ContainerDied","Data":"a3eceef4608e58d05af49d15b9b49d79b87fcbd574226b60c89beea3a3e03fd2"} Mar 13 16:44:09 crc kubenswrapper[4786]: I0313 16:44:09.815067 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ec79171-dff4-4672-bb9c-d5f71a44dd70" containerID="846413b5b3803373eece2f112134a568159934affa9a7d7534f7a82069b7ecf3" exitCode=0 Mar 13 16:44:09 crc kubenswrapper[4786]: I0313 16:44:09.815142 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" event={"ID":"1ec79171-dff4-4672-bb9c-d5f71a44dd70","Type":"ContainerDied","Data":"846413b5b3803373eece2f112134a568159934affa9a7d7534f7a82069b7ecf3"} Mar 13 16:44:09 crc kubenswrapper[4786]: 
I0313 16:44:09.819683 4786 generic.go:334] "Generic (PLEG): container finished" podID="0fbe78b0-93cc-4afe-839c-c0d3e26873c2" containerID="7f27e8e08779d209490ffcbe0750f091e1dab517216867ffce25b02b396eb718" exitCode=0 Mar 13 16:44:09 crc kubenswrapper[4786]: I0313 16:44:09.819806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" event={"ID":"0fbe78b0-93cc-4afe-839c-c0d3e26873c2","Type":"ContainerDied","Data":"7f27e8e08779d209490ffcbe0750f091e1dab517216867ffce25b02b396eb718"} Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.301387 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.306693 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.312073 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.394193 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s74g7\" (UniqueName: \"kubernetes.io/projected/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-kube-api-access-s74g7\") pod \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.394293 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68kp2\" (UniqueName: \"kubernetes.io/projected/be4ba9a6-435c-4672-83a6-c46c2ce855e7-kube-api-access-68kp2\") pod \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.394315 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk89k\" (UniqueName: \"kubernetes.io/projected/683c9de6-8e5a-4ea1-a17e-af0743fee223-kube-api-access-pk89k\") pod \"683c9de6-8e5a-4ea1-a17e-af0743fee223\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.394377 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-operator-scripts\") pod \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\" (UID: \"6dadc28e-92a8-4760-b6c1-d4bfc87535c2\") " Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.394438 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/683c9de6-8e5a-4ea1-a17e-af0743fee223-operator-scripts\") pod \"683c9de6-8e5a-4ea1-a17e-af0743fee223\" (UID: \"683c9de6-8e5a-4ea1-a17e-af0743fee223\") " Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.394458 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4ba9a6-435c-4672-83a6-c46c2ce855e7-operator-scripts\") pod \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\" (UID: \"be4ba9a6-435c-4672-83a6-c46c2ce855e7\") " Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.395506 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6dadc28e-92a8-4760-b6c1-d4bfc87535c2" (UID: "6dadc28e-92a8-4760-b6c1-d4bfc87535c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.395520 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/683c9de6-8e5a-4ea1-a17e-af0743fee223-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "683c9de6-8e5a-4ea1-a17e-af0743fee223" (UID: "683c9de6-8e5a-4ea1-a17e-af0743fee223"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.395967 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4ba9a6-435c-4672-83a6-c46c2ce855e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be4ba9a6-435c-4672-83a6-c46c2ce855e7" (UID: "be4ba9a6-435c-4672-83a6-c46c2ce855e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.399705 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4ba9a6-435c-4672-83a6-c46c2ce855e7-kube-api-access-68kp2" (OuterVolumeSpecName: "kube-api-access-68kp2") pod "be4ba9a6-435c-4672-83a6-c46c2ce855e7" (UID: "be4ba9a6-435c-4672-83a6-c46c2ce855e7"). 
InnerVolumeSpecName "kube-api-access-68kp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.400062 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-kube-api-access-s74g7" (OuterVolumeSpecName: "kube-api-access-s74g7") pod "6dadc28e-92a8-4760-b6c1-d4bfc87535c2" (UID: "6dadc28e-92a8-4760-b6c1-d4bfc87535c2"). InnerVolumeSpecName "kube-api-access-s74g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.401507 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/683c9de6-8e5a-4ea1-a17e-af0743fee223-kube-api-access-pk89k" (OuterVolumeSpecName: "kube-api-access-pk89k") pod "683c9de6-8e5a-4ea1-a17e-af0743fee223" (UID: "683c9de6-8e5a-4ea1-a17e-af0743fee223"). InnerVolumeSpecName "kube-api-access-pk89k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.496175 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.496217 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/683c9de6-8e5a-4ea1-a17e-af0743fee223-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.496231 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4ba9a6-435c-4672-83a6-c46c2ce855e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.496249 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s74g7\" 
(UniqueName: \"kubernetes.io/projected/6dadc28e-92a8-4760-b6c1-d4bfc87535c2-kube-api-access-s74g7\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.496266 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68kp2\" (UniqueName: \"kubernetes.io/projected/be4ba9a6-435c-4672-83a6-c46c2ce855e7-kube-api-access-68kp2\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.496282 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk89k\" (UniqueName: \"kubernetes.io/projected/683c9de6-8e5a-4ea1-a17e-af0743fee223-kube-api-access-pk89k\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.834371 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m99vx" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.835272 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99vx" event={"ID":"6dadc28e-92a8-4760-b6c1-d4bfc87535c2","Type":"ContainerDied","Data":"29963d7c65c5c8e725620fc2c21284de7d9878eae40a43435d9e1e527062e5d8"} Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.835299 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29963d7c65c5c8e725620fc2c21284de7d9878eae40a43435d9e1e527062e5d8" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.837465 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s6ctj" event={"ID":"683c9de6-8e5a-4ea1-a17e-af0743fee223","Type":"ContainerDied","Data":"89b92d07cf1317343036bef1f242c3e36d9f8110b5040fa011b3e098686a4c66"} Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.837490 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b92d07cf1317343036bef1f242c3e36d9f8110b5040fa011b3e098686a4c66" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 
16:44:10.837495 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s6ctj" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.840106 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f116-account-create-update-f2fc9" event={"ID":"be4ba9a6-435c-4672-83a6-c46c2ce855e7","Type":"ContainerDied","Data":"17830f152b559b6dc4482fe8fb1b8ef3787faa9294ed2bd1a68bbade07910325"} Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.840175 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17830f152b559b6dc4482fe8fb1b8ef3787faa9294ed2bd1a68bbade07910325" Mar 13 16:44:10 crc kubenswrapper[4786]: I0313 16:44:10.840233 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f116-account-create-update-f2fc9" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.308457 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.311805 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.339996 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.416759 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-872sd\" (UniqueName: \"kubernetes.io/projected/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-kube-api-access-872sd\") pod \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.416914 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec79171-dff4-4672-bb9c-d5f71a44dd70-operator-scripts\") pod \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.416984 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qrcj\" (UniqueName: \"kubernetes.io/projected/1ec79171-dff4-4672-bb9c-d5f71a44dd70-kube-api-access-4qrcj\") pod \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\" (UID: \"1ec79171-dff4-4672-bb9c-d5f71a44dd70\") " Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.417028 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-operator-scripts\") pod \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\" (UID: \"81ee8d0e-612f-4ef9-a135-d1bd89fc4136\") " Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.417577 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81ee8d0e-612f-4ef9-a135-d1bd89fc4136" (UID: "81ee8d0e-612f-4ef9-a135-d1bd89fc4136"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.417622 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec79171-dff4-4672-bb9c-d5f71a44dd70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ec79171-dff4-4672-bb9c-d5f71a44dd70" (UID: "1ec79171-dff4-4672-bb9c-d5f71a44dd70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.422401 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-kube-api-access-872sd" (OuterVolumeSpecName: "kube-api-access-872sd") pod "81ee8d0e-612f-4ef9-a135-d1bd89fc4136" (UID: "81ee8d0e-612f-4ef9-a135-d1bd89fc4136"). InnerVolumeSpecName "kube-api-access-872sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.422524 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec79171-dff4-4672-bb9c-d5f71a44dd70-kube-api-access-4qrcj" (OuterVolumeSpecName: "kube-api-access-4qrcj") pod "1ec79171-dff4-4672-bb9c-d5f71a44dd70" (UID: "1ec79171-dff4-4672-bb9c-d5f71a44dd70"). InnerVolumeSpecName "kube-api-access-4qrcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.519364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnw97\" (UniqueName: \"kubernetes.io/projected/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-kube-api-access-rnw97\") pod \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.519520 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-operator-scripts\") pod \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\" (UID: \"0fbe78b0-93cc-4afe-839c-c0d3e26873c2\") " Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.520486 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fbe78b0-93cc-4afe-839c-c0d3e26873c2" (UID: "0fbe78b0-93cc-4afe-839c-c0d3e26873c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.520668 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.520704 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec79171-dff4-4672-bb9c-d5f71a44dd70-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.520728 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qrcj\" (UniqueName: \"kubernetes.io/projected/1ec79171-dff4-4672-bb9c-d5f71a44dd70-kube-api-access-4qrcj\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.520746 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.520764 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-872sd\" (UniqueName: \"kubernetes.io/projected/81ee8d0e-612f-4ef9-a135-d1bd89fc4136-kube-api-access-872sd\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.522508 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-kube-api-access-rnw97" (OuterVolumeSpecName: "kube-api-access-rnw97") pod "0fbe78b0-93cc-4afe-839c-c0d3e26873c2" (UID: "0fbe78b0-93cc-4afe-839c-c0d3e26873c2"). InnerVolumeSpecName "kube-api-access-rnw97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.623354 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnw97\" (UniqueName: \"kubernetes.io/projected/0fbe78b0-93cc-4afe-839c-c0d3e26873c2-kube-api-access-rnw97\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.854847 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5kd2h" event={"ID":"81ee8d0e-612f-4ef9-a135-d1bd89fc4136","Type":"ContainerDied","Data":"62cb10afce20821ded1df9485b236a225ea8447215179cbcea69d2c96ad23edf"} Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.854931 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62cb10afce20821ded1df9485b236a225ea8447215179cbcea69d2c96ad23edf" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.855009 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kd2h" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.857731 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.857731 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0f54-account-create-update-l8l9w" event={"ID":"1ec79171-dff4-4672-bb9c-d5f71a44dd70","Type":"ContainerDied","Data":"52c8fcab11b787cce771fa6973524e1989d4846ca960eb1745cd5adeccb2e1a4"} Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.857792 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c8fcab11b787cce771fa6973524e1989d4846ca960eb1745cd5adeccb2e1a4" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.865128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" event={"ID":"0fbe78b0-93cc-4afe-839c-c0d3e26873c2","Type":"ContainerDied","Data":"eee9691f2a9bd500eccdea3e0de6b907e1066ef9b31ae57a75301c21ac3ffe00"} Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.865159 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eee9691f2a9bd500eccdea3e0de6b907e1066ef9b31ae57a75301c21ac3ffe00" Mar 13 16:44:11 crc kubenswrapper[4786]: I0313 16:44:11.865233 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fda7-account-create-update-fxslq" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.660800 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6fhlc"] Mar 13 16:44:12 crc kubenswrapper[4786]: E0313 16:44:12.661439 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec79171-dff4-4672-bb9c-d5f71a44dd70" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661457 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec79171-dff4-4672-bb9c-d5f71a44dd70" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: E0313 16:44:12.661485 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ee8d0e-612f-4ef9-a135-d1bd89fc4136" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661494 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ee8d0e-612f-4ef9-a135-d1bd89fc4136" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: E0313 16:44:12.661510 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683c9de6-8e5a-4ea1-a17e-af0743fee223" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661518 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="683c9de6-8e5a-4ea1-a17e-af0743fee223" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: E0313 16:44:12.661528 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dadc28e-92a8-4760-b6c1-d4bfc87535c2" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661534 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dadc28e-92a8-4760-b6c1-d4bfc87535c2" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: E0313 16:44:12.661546 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbe78b0-93cc-4afe-839c-c0d3e26873c2" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661552 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbe78b0-93cc-4afe-839c-c0d3e26873c2" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: E0313 16:44:12.661574 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4ba9a6-435c-4672-83a6-c46c2ce855e7" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661582 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4ba9a6-435c-4672-83a6-c46c2ce855e7" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661758 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4ba9a6-435c-4672-83a6-c46c2ce855e7" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661779 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dadc28e-92a8-4760-b6c1-d4bfc87535c2" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661791 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ee8d0e-612f-4ef9-a135-d1bd89fc4136" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661807 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec79171-dff4-4672-bb9c-d5f71a44dd70" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661819 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="683c9de6-8e5a-4ea1-a17e-af0743fee223" containerName="mariadb-database-create" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.661827 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0fbe78b0-93cc-4afe-839c-c0d3e26873c2" containerName="mariadb-account-create-update" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.662469 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.664692 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kctvd" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.665123 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.667821 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.670786 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6fhlc"] Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.746673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-config-data\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.746710 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-scripts\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.746744 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.746911 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vjd8\" (UniqueName: \"kubernetes.io/projected/1fc792b6-1efc-495e-a21f-5a1d9c854918-kube-api-access-2vjd8\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.848514 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-config-data\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.848565 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-scripts\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.848593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.848625 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2vjd8\" (UniqueName: \"kubernetes.io/projected/1fc792b6-1efc-495e-a21f-5a1d9c854918-kube-api-access-2vjd8\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.853544 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.853748 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-config-data\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.854842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-scripts\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.864813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vjd8\" (UniqueName: \"kubernetes.io/projected/1fc792b6-1efc-495e-a21f-5a1d9c854918-kube-api-access-2vjd8\") pod \"nova-cell0-conductor-db-sync-6fhlc\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:12 crc kubenswrapper[4786]: I0313 16:44:12.982834 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:13 crc kubenswrapper[4786]: I0313 16:44:13.508164 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6fhlc"] Mar 13 16:44:13 crc kubenswrapper[4786]: W0313 16:44:13.508714 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc792b6_1efc_495e_a21f_5a1d9c854918.slice/crio-616f98dbe88541b967f9094ec71dbb96fab00d14a215e91377122d8678ff87ed WatchSource:0}: Error finding container 616f98dbe88541b967f9094ec71dbb96fab00d14a215e91377122d8678ff87ed: Status 404 returned error can't find the container with id 616f98dbe88541b967f9094ec71dbb96fab00d14a215e91377122d8678ff87ed Mar 13 16:44:13 crc kubenswrapper[4786]: I0313 16:44:13.886783 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" event={"ID":"1fc792b6-1efc-495e-a21f-5a1d9c854918","Type":"ContainerStarted","Data":"9118afe520cae42ed9ed04b641fd4161c1f159a572113ef1f608c799e44d85e9"} Mar 13 16:44:13 crc kubenswrapper[4786]: I0313 16:44:13.887134 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" event={"ID":"1fc792b6-1efc-495e-a21f-5a1d9c854918","Type":"ContainerStarted","Data":"616f98dbe88541b967f9094ec71dbb96fab00d14a215e91377122d8678ff87ed"} Mar 13 16:44:13 crc kubenswrapper[4786]: I0313 16:44:13.918379 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" podStartSLOduration=1.9183632099999999 podStartE2EDuration="1.91836321s" podCreationTimestamp="2026-03-13 16:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:13.912991835 +0000 UTC m=+6084.076203656" watchObservedRunningTime="2026-03-13 16:44:13.91836321 +0000 UTC 
m=+6084.081575021" Mar 13 16:44:15 crc kubenswrapper[4786]: I0313 16:44:15.323580 4786 scope.go:117] "RemoveContainer" containerID="d8fa9696b8f2e0decbc7c6db8476fd4b4c3fa2c6de75f1eb17187c3a2c4440ed" Mar 13 16:44:15 crc kubenswrapper[4786]: I0313 16:44:15.395303 4786 scope.go:117] "RemoveContainer" containerID="9b397e3b818451fe5004ef34c11806826c4e40f95728dcc8d622a36d8f55ecc8" Mar 13 16:44:18 crc kubenswrapper[4786]: I0313 16:44:18.934385 4786 generic.go:334] "Generic (PLEG): container finished" podID="1fc792b6-1efc-495e-a21f-5a1d9c854918" containerID="9118afe520cae42ed9ed04b641fd4161c1f159a572113ef1f608c799e44d85e9" exitCode=0 Mar 13 16:44:18 crc kubenswrapper[4786]: I0313 16:44:18.934626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" event={"ID":"1fc792b6-1efc-495e-a21f-5a1d9c854918","Type":"ContainerDied","Data":"9118afe520cae42ed9ed04b641fd4161c1f159a572113ef1f608c799e44d85e9"} Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.274111 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.413483 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vjd8\" (UniqueName: \"kubernetes.io/projected/1fc792b6-1efc-495e-a21f-5a1d9c854918-kube-api-access-2vjd8\") pod \"1fc792b6-1efc-495e-a21f-5a1d9c854918\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.413614 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-scripts\") pod \"1fc792b6-1efc-495e-a21f-5a1d9c854918\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.413771 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-config-data\") pod \"1fc792b6-1efc-495e-a21f-5a1d9c854918\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.413892 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-combined-ca-bundle\") pod \"1fc792b6-1efc-495e-a21f-5a1d9c854918\" (UID: \"1fc792b6-1efc-495e-a21f-5a1d9c854918\") " Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.420955 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-scripts" (OuterVolumeSpecName: "scripts") pod "1fc792b6-1efc-495e-a21f-5a1d9c854918" (UID: "1fc792b6-1efc-495e-a21f-5a1d9c854918"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.421049 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc792b6-1efc-495e-a21f-5a1d9c854918-kube-api-access-2vjd8" (OuterVolumeSpecName: "kube-api-access-2vjd8") pod "1fc792b6-1efc-495e-a21f-5a1d9c854918" (UID: "1fc792b6-1efc-495e-a21f-5a1d9c854918"). InnerVolumeSpecName "kube-api-access-2vjd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.439220 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc792b6-1efc-495e-a21f-5a1d9c854918" (UID: "1fc792b6-1efc-495e-a21f-5a1d9c854918"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.440640 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-config-data" (OuterVolumeSpecName: "config-data") pod "1fc792b6-1efc-495e-a21f-5a1d9c854918" (UID: "1fc792b6-1efc-495e-a21f-5a1d9c854918"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.517345 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.517397 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.517418 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vjd8\" (UniqueName: \"kubernetes.io/projected/1fc792b6-1efc-495e-a21f-5a1d9c854918-kube-api-access-2vjd8\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.517435 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc792b6-1efc-495e-a21f-5a1d9c854918-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.561204 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:44:20 crc kubenswrapper[4786]: E0313 16:44:20.561691 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.960656 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" 
event={"ID":"1fc792b6-1efc-495e-a21f-5a1d9c854918","Type":"ContainerDied","Data":"616f98dbe88541b967f9094ec71dbb96fab00d14a215e91377122d8678ff87ed"} Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.961101 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616f98dbe88541b967f9094ec71dbb96fab00d14a215e91377122d8678ff87ed" Mar 13 16:44:20 crc kubenswrapper[4786]: I0313 16:44:20.960843 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6fhlc" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.070395 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 16:44:21 crc kubenswrapper[4786]: E0313 16:44:21.070833 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc792b6-1efc-495e-a21f-5a1d9c854918" containerName="nova-cell0-conductor-db-sync" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.070869 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc792b6-1efc-495e-a21f-5a1d9c854918" containerName="nova-cell0-conductor-db-sync" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.071069 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc792b6-1efc-495e-a21f-5a1d9c854918" containerName="nova-cell0-conductor-db-sync" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.071808 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.080745 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.081006 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kctvd" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.098018 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.229593 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b658310c-9b6a-4a6b-babd-f5c50cce01d2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.229742 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkcm4\" (UniqueName: \"kubernetes.io/projected/b658310c-9b6a-4a6b-babd-f5c50cce01d2-kube-api-access-dkcm4\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.229946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b658310c-9b6a-4a6b-babd-f5c50cce01d2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.332521 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b658310c-9b6a-4a6b-babd-f5c50cce01d2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.332740 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b658310c-9b6a-4a6b-babd-f5c50cce01d2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.332807 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkcm4\" (UniqueName: \"kubernetes.io/projected/b658310c-9b6a-4a6b-babd-f5c50cce01d2-kube-api-access-dkcm4\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.337558 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b658310c-9b6a-4a6b-babd-f5c50cce01d2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.338726 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b658310c-9b6a-4a6b-babd-f5c50cce01d2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.354953 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkcm4\" (UniqueName: \"kubernetes.io/projected/b658310c-9b6a-4a6b-babd-f5c50cce01d2-kube-api-access-dkcm4\") pod \"nova-cell0-conductor-0\" 
(UID: \"b658310c-9b6a-4a6b-babd-f5c50cce01d2\") " pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:21 crc kubenswrapper[4786]: I0313 16:44:21.396816 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:22 crc kubenswrapper[4786]: I0313 16:44:21.922210 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 16:44:22 crc kubenswrapper[4786]: I0313 16:44:21.972627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b658310c-9b6a-4a6b-babd-f5c50cce01d2","Type":"ContainerStarted","Data":"26335fb8763b12d788dcabffda8c0087fcd9d3bd0d65830f65056b9977b5bbd0"} Mar 13 16:44:22 crc kubenswrapper[4786]: I0313 16:44:22.985320 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b658310c-9b6a-4a6b-babd-f5c50cce01d2","Type":"ContainerStarted","Data":"e4d374084b3e5b0bcb5de5b7120020d04cbdbb07d3ea503a0b80465c02eff892"} Mar 13 16:44:23 crc kubenswrapper[4786]: I0313 16:44:23.017950 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.017926705 podStartE2EDuration="2.017926705s" podCreationTimestamp="2026-03-13 16:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:23.007473772 +0000 UTC m=+6093.170685623" watchObservedRunningTime="2026-03-13 16:44:23.017926705 +0000 UTC m=+6093.181138526" Mar 13 16:44:23 crc kubenswrapper[4786]: I0313 16:44:23.994286 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:31 crc kubenswrapper[4786]: I0313 16:44:31.441709 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 16:44:31 crc kubenswrapper[4786]: 
I0313 16:44:31.552437 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:44:31 crc kubenswrapper[4786]: E0313 16:44:31.552639 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:44:31 crc kubenswrapper[4786]: I0313 16:44:31.985549 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mmn4t"] Mar 13 16:44:31 crc kubenswrapper[4786]: I0313 16:44:31.986675 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:31 crc kubenswrapper[4786]: I0313 16:44:31.988553 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 16:44:31 crc kubenswrapper[4786]: I0313 16:44:31.990610 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.005455 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmn4t"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.107094 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.113719 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.118386 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.119725 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.155839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.155949 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-scripts\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.155978 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-config-data\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.156024 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdd6\" (UniqueName: \"kubernetes.io/projected/e59cb012-12c2-4674-850a-d9638f76670d-kube-api-access-psdd6\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 
13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.170386 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.174633 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.177181 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.186602 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.203087 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.204444 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.207259 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.230965 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257222 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psdd6\" (UniqueName: \"kubernetes.io/projected/e59cb012-12c2-4674-850a-d9638f76670d-kube-api-access-psdd6\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257265 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-logs\") pod \"nova-metadata-0\" (UID: 
\"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/736c2655-9bc2-4f40-833a-995e03e75412-logs\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257346 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-config-data\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257419 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jslr\" (UniqueName: \"kubernetes.io/projected/736c2655-9bc2-4f40-833a-995e03e75412-kube-api-access-7jslr\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc 
kubenswrapper[4786]: I0313 16:44:32.257445 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-config-data\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257464 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-scripts\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-config-data\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.257540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zbkg\" (UniqueName: \"kubernetes.io/projected/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-kube-api-access-7zbkg\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.263658 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.264469 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-config-data\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.267056 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-scripts\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.274928 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.276836 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.305745 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.317790 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdd6\" (UniqueName: \"kubernetes.io/projected/e59cb012-12c2-4674-850a-d9638f76670d-kube-api-access-psdd6\") pod \"nova-cell0-cell-mapping-mmn4t\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.318424 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.347948 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.366751 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.366847 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-config-data\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.366913 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.366962 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-config-data\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.367644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zbkg\" (UniqueName: \"kubernetes.io/projected/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-kube-api-access-7zbkg\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.367805 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-logs\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.367886 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.367927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/736c2655-9bc2-4f40-833a-995e03e75412-logs\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.367997 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8kj\" (UniqueName: \"kubernetes.io/projected/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-kube-api-access-8h8kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.368407 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-config-data\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.368438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwx4\" (UniqueName: \"kubernetes.io/projected/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-kube-api-access-qlwx4\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.368448 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-logs\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.368693 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/736c2655-9bc2-4f40-833a-995e03e75412-logs\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.373493 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-config-data\") pod \"nova-api-0\" (UID: 
\"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.373552 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.378017 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.378068 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.378100 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jslr\" (UniqueName: \"kubernetes.io/projected/736c2655-9bc2-4f40-833a-995e03e75412-kube-api-access-7jslr\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.381842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-config-data\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.384092 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.386603 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zbkg\" (UniqueName: \"kubernetes.io/projected/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-kube-api-access-7zbkg\") pod \"nova-metadata-0\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.405746 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jslr\" (UniqueName: \"kubernetes.io/projected/736c2655-9bc2-4f40-833a-995e03e75412-kube-api-access-7jslr\") pod \"nova-api-0\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.408909 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bfd4f4db5-7b988"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.410395 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.418351 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfd4f4db5-7b988"] Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.449251 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480089 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-config\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480353 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480405 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480443 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8kj\" (UniqueName: \"kubernetes.io/projected/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-kube-api-access-8h8kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480462 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsnz9\" (UniqueName: \"kubernetes.io/projected/067c3518-fddf-49ed-9e5f-bc44f60d6897-kube-api-access-bsnz9\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: 
\"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480490 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwx4\" (UniqueName: \"kubernetes.io/projected/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-kube-api-access-qlwx4\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480526 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480549 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480570 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-dns-svc\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.480616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-config-data\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.484813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.485513 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-config-data\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.488330 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.488376 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.496842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qlwx4\" (UniqueName: \"kubernetes.io/projected/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-kube-api-access-qlwx4\") pod \"nova-scheduler-0\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") " pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.498452 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8kj\" (UniqueName: \"kubernetes.io/projected/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-kube-api-access-8h8kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.501251 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.519050 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.582145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsnz9\" (UniqueName: \"kubernetes.io/projected/067c3518-fddf-49ed-9e5f-bc44f60d6897-kube-api-access-bsnz9\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.582223 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.582259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-dns-svc\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.582322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-config\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.582343 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.583137 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.583652 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-dns-svc\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.584051 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.585598 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-config\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.598519 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsnz9\" (UniqueName: \"kubernetes.io/projected/067c3518-fddf-49ed-9e5f-bc44f60d6897-kube-api-access-bsnz9\") pod \"dnsmasq-dns-5bfd4f4db5-7b988\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.755590 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.773093 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.844801 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmn4t"] Mar 13 16:44:32 crc kubenswrapper[4786]: W0313 16:44:32.845178 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode59cb012_12c2_4674_850a_d9638f76670d.slice/crio-73ff562c1490c47a9ce94165a2a046802147501ef4a018f5a1cc03cf97a068a3 WatchSource:0}: Error finding container 73ff562c1490c47a9ce94165a2a046802147501ef4a018f5a1cc03cf97a068a3: Status 404 returned error can't find the container with id 73ff562c1490c47a9ce94165a2a046802147501ef4a018f5a1cc03cf97a068a3 Mar 13 16:44:32 crc kubenswrapper[4786]: I0313 16:44:32.975587 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:32 crc kubenswrapper[4786]: W0313 16:44:32.995440 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod736c2655_9bc2_4f40_833a_995e03e75412.slice/crio-edce4d527562fc8c6ebceaa87bf66d4aec085be8f2a0bc48b8616595ce22639f WatchSource:0}: Error finding container edce4d527562fc8c6ebceaa87bf66d4aec085be8f2a0bc48b8616595ce22639f: Status 404 returned error can't find the container with id edce4d527562fc8c6ebceaa87bf66d4aec085be8f2a0bc48b8616595ce22639f Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.037911 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.044854 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:33 crc kubenswrapper[4786]: W0313 16:44:33.067458 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d9e5b51_5271_40ac_aa4d_dd5c6d3ed188.slice/crio-f3ed3976332f5ae61488168408e267926e196512d5ea07f6ac37203dcbd0df41 WatchSource:0}: Error finding container f3ed3976332f5ae61488168408e267926e196512d5ea07f6ac37203dcbd0df41: Status 404 returned error can't find the container with id f3ed3976332f5ae61488168408e267926e196512d5ea07f6ac37203dcbd0df41 Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.073824 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dkm6m"] Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.075094 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.081262 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.081338 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.090565 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dkm6m"] Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.129185 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmn4t" event={"ID":"e59cb012-12c2-4674-850a-d9638f76670d","Type":"ContainerStarted","Data":"b8cd6f5b561b22034a89ae3b3d6f259bb4169e8a9c5c2988ba829fe9ebde792a"} Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.129222 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmn4t" event={"ID":"e59cb012-12c2-4674-850a-d9638f76670d","Type":"ContainerStarted","Data":"73ff562c1490c47a9ce94165a2a046802147501ef4a018f5a1cc03cf97a068a3"} Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.133745 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc","Type":"ContainerStarted","Data":"ab156a48b2eb8b0dc6ee5102ae1b9586babc10586c08ad0827ce94e54ae155f1"} Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.135319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188","Type":"ContainerStarted","Data":"f3ed3976332f5ae61488168408e267926e196512d5ea07f6ac37203dcbd0df41"} Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.136433 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"736c2655-9bc2-4f40-833a-995e03e75412","Type":"ContainerStarted","Data":"edce4d527562fc8c6ebceaa87bf66d4aec085be8f2a0bc48b8616595ce22639f"} Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.144924 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mmn4t" podStartSLOduration=2.144908593 podStartE2EDuration="2.144908593s" podCreationTimestamp="2026-03-13 16:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:33.144650077 +0000 UTC m=+6103.307861888" watchObservedRunningTime="2026-03-13 16:44:33.144908593 +0000 UTC m=+6103.308120404" Mar 13 16:44:33 crc kubenswrapper[4786]: W0313 16:44:33.197110 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f3ece23_fee4_42c3_8c8f_cafd6b5cfa7f.slice/crio-a429714ae3ec84fbf898f69a1599470383c6e9ec1cdd8b78d85a37ec5a3ba108 WatchSource:0}: Error finding container a429714ae3ec84fbf898f69a1599470383c6e9ec1cdd8b78d85a37ec5a3ba108: Status 404 returned error can't find the container with id a429714ae3ec84fbf898f69a1599470383c6e9ec1cdd8b78d85a37ec5a3ba108 Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 
16:44:33.197738 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-config-data\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.197809 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-scripts\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.197849 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.197886 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9sg\" (UniqueName: \"kubernetes.io/projected/76ef6159-b90f-418a-97d3-e256d87aefb5-kube-api-access-wn9sg\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.199053 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.299515 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-config-data\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.299587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-scripts\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.299628 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.299651 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9sg\" (UniqueName: \"kubernetes.io/projected/76ef6159-b90f-418a-97d3-e256d87aefb5-kube-api-access-wn9sg\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.303849 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-scripts\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.304885 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-config-data\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.306664 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.317154 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9sg\" (UniqueName: \"kubernetes.io/projected/76ef6159-b90f-418a-97d3-e256d87aefb5-kube-api-access-wn9sg\") pod \"nova-cell1-conductor-db-sync-dkm6m\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.326106 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfd4f4db5-7b988"] Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.401047 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:33 crc kubenswrapper[4786]: I0313 16:44:33.957532 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dkm6m"] Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.148166 4786 generic.go:334] "Generic (PLEG): container finished" podID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerID="413dd7bd2a10d71ebc2dd6ec5d8b04cd0623e6839a169533acf70d39c63844db" exitCode=0 Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.148245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" event={"ID":"067c3518-fddf-49ed-9e5f-bc44f60d6897","Type":"ContainerDied","Data":"413dd7bd2a10d71ebc2dd6ec5d8b04cd0623e6839a169533acf70d39c63844db"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.148278 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" event={"ID":"067c3518-fddf-49ed-9e5f-bc44f60d6897","Type":"ContainerStarted","Data":"ab15bab01ad5d0c5b668b0a2b727443090e8768a3bd196c0741d9f59b3f57b07"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.150073 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc","Type":"ContainerStarted","Data":"59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.151494 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" event={"ID":"76ef6159-b90f-418a-97d3-e256d87aefb5","Type":"ContainerStarted","Data":"2764304bcc8da9df6388222c8e88cec2fb0266ab2e1596b78027f30b57b21292"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.151531 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" 
event={"ID":"76ef6159-b90f-418a-97d3-e256d87aefb5","Type":"ContainerStarted","Data":"91f6715198195ca477b3574043cc9b4ee9e812a9409f187a725c175c8c732181"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.160207 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188","Type":"ContainerStarted","Data":"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.160259 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188","Type":"ContainerStarted","Data":"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.166216 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f","Type":"ContainerStarted","Data":"8cc7c9c6f5950c2df01da0736fa4f81716ad6b8874a67f7ea5f6ac5a050f7a3a"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.166261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f","Type":"ContainerStarted","Data":"a429714ae3ec84fbf898f69a1599470383c6e9ec1cdd8b78d85a37ec5a3ba108"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.169410 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"736c2655-9bc2-4f40-833a-995e03e75412","Type":"ContainerStarted","Data":"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0"} Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.169456 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"736c2655-9bc2-4f40-833a-995e03e75412","Type":"ContainerStarted","Data":"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468"} Mar 13 16:44:34 crc kubenswrapper[4786]: 
I0313 16:44:34.227939 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" podStartSLOduration=1.227921792 podStartE2EDuration="1.227921792s" podCreationTimestamp="2026-03-13 16:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:34.225687436 +0000 UTC m=+6104.388899247" watchObservedRunningTime="2026-03-13 16:44:34.227921792 +0000 UTC m=+6104.391133603" Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.291755 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.2917378250000002 podStartE2EDuration="2.291737825s" podCreationTimestamp="2026-03-13 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:34.283096568 +0000 UTC m=+6104.446308379" watchObservedRunningTime="2026-03-13 16:44:34.291737825 +0000 UTC m=+6104.454949636" Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.406607 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.406576811 podStartE2EDuration="2.406576811s" podCreationTimestamp="2026-03-13 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:34.362486723 +0000 UTC m=+6104.525698534" watchObservedRunningTime="2026-03-13 16:44:34.406576811 +0000 UTC m=+6104.569788622" Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.440891 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.440866592 podStartE2EDuration="2.440866592s" podCreationTimestamp="2026-03-13 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:34.394127998 +0000 UTC m=+6104.557339809" watchObservedRunningTime="2026-03-13 16:44:34.440866592 +0000 UTC m=+6104.604078403" Mar 13 16:44:34 crc kubenswrapper[4786]: I0313 16:44:34.449740 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.449717755 podStartE2EDuration="2.449717755s" podCreationTimestamp="2026-03-13 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:34.438465762 +0000 UTC m=+6104.601677583" watchObservedRunningTime="2026-03-13 16:44:34.449717755 +0000 UTC m=+6104.612929566" Mar 13 16:44:35 crc kubenswrapper[4786]: I0313 16:44:35.181276 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" event={"ID":"067c3518-fddf-49ed-9e5f-bc44f60d6897","Type":"ContainerStarted","Data":"ee691082331590c03e45be6cc563420fdbecd6894a6ff59f77020466a4d10b5f"} Mar 13 16:44:35 crc kubenswrapper[4786]: I0313 16:44:35.205443 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" podStartSLOduration=3.205425161 podStartE2EDuration="3.205425161s" podCreationTimestamp="2026-03-13 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:35.203933774 +0000 UTC m=+6105.367145575" watchObservedRunningTime="2026-03-13 16:44:35.205425161 +0000 UTC m=+6105.368636972" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.072619 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.101095 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:36 crc 
kubenswrapper[4786]: I0313 16:44:36.187680 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268" gracePeriod=30 Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.188311 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-log" containerID="cri-o://e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752" gracePeriod=30 Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.188395 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-metadata" containerID="cri-o://7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27" gracePeriod=30 Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.188551 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.755781 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.881410 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.903100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-config-data\") pod \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.903195 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-logs\") pod \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.903238 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-combined-ca-bundle\") pod \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.903275 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zbkg\" (UniqueName: \"kubernetes.io/projected/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-kube-api-access-7zbkg\") pod \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\" (UID: \"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188\") " Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.905808 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-logs" (OuterVolumeSpecName: "logs") pod "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" (UID: "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.911801 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-kube-api-access-7zbkg" (OuterVolumeSpecName: "kube-api-access-7zbkg") pod "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" (UID: "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188"). InnerVolumeSpecName "kube-api-access-7zbkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.938706 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-config-data" (OuterVolumeSpecName: "config-data") pod "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" (UID: "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:36 crc kubenswrapper[4786]: I0313 16:44:36.938742 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" (UID: "8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.004585 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-config-data\") pod \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.004621 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-combined-ca-bundle\") pod \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.004667 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h8kj\" (UniqueName: \"kubernetes.io/projected/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-kube-api-access-8h8kj\") pod \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\" (UID: \"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc\") " Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.005145 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.005164 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.005173 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zbkg\" (UniqueName: \"kubernetes.io/projected/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-kube-api-access-7zbkg\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.005183 4786 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.008847 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-kube-api-access-8h8kj" (OuterVolumeSpecName: "kube-api-access-8h8kj") pod "010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" (UID: "010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc"). InnerVolumeSpecName "kube-api-access-8h8kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.026307 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-config-data" (OuterVolumeSpecName: "config-data") pod "010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" (UID: "010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.028102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" (UID: "010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.106572 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.106609 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.106623 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h8kj\" (UniqueName: \"kubernetes.io/projected/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc-kube-api-access-8h8kj\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.197157 4786 generic.go:334] "Generic (PLEG): container finished" podID="76ef6159-b90f-418a-97d3-e256d87aefb5" containerID="2764304bcc8da9df6388222c8e88cec2fb0266ab2e1596b78027f30b57b21292" exitCode=0 Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.197603 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" event={"ID":"76ef6159-b90f-418a-97d3-e256d87aefb5","Type":"ContainerDied","Data":"2764304bcc8da9df6388222c8e88cec2fb0266ab2e1596b78027f30b57b21292"} Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.199270 4786 generic.go:334] "Generic (PLEG): container finished" podID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerID="7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27" exitCode=0 Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.199303 4786 generic.go:334] "Generic (PLEG): container finished" podID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerID="e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752" exitCode=143 Mar 13 16:44:37 crc kubenswrapper[4786]: 
I0313 16:44:37.199351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188","Type":"ContainerDied","Data":"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27"} Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.199384 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188","Type":"ContainerDied","Data":"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752"} Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.199402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188","Type":"ContainerDied","Data":"f3ed3976332f5ae61488168408e267926e196512d5ea07f6ac37203dcbd0df41"} Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.199424 4786 scope.go:117] "RemoveContainer" containerID="7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.199583 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.214071 4786 generic.go:334] "Generic (PLEG): container finished" podID="010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" containerID="59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268" exitCode=0 Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.214134 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc","Type":"ContainerDied","Data":"59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268"} Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.214160 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.214173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc","Type":"ContainerDied","Data":"ab156a48b2eb8b0dc6ee5102ae1b9586babc10586c08ad0827ce94e54ae155f1"} Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.253124 4786 scope.go:117] "RemoveContainer" containerID="e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.262047 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.280517 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.288316 4786 scope.go:117] "RemoveContainer" containerID="7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27" Mar 13 16:44:37 crc kubenswrapper[4786]: E0313 16:44:37.289039 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27\": container with ID starting with 7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27 not found: ID does not exist" containerID="7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.289096 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27"} err="failed to get container status \"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27\": rpc error: code = NotFound desc = could not find container \"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27\": container with ID starting with 
7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27 not found: ID does not exist" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.289131 4786 scope.go:117] "RemoveContainer" containerID="e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752" Mar 13 16:44:37 crc kubenswrapper[4786]: E0313 16:44:37.290133 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752\": container with ID starting with e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752 not found: ID does not exist" containerID="e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.290176 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752"} err="failed to get container status \"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752\": rpc error: code = NotFound desc = could not find container \"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752\": container with ID starting with e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752 not found: ID does not exist" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.290203 4786 scope.go:117] "RemoveContainer" containerID="7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.290688 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27"} err="failed to get container status \"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27\": rpc error: code = NotFound desc = could not find container \"7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27\": container with ID 
starting with 7274e9097c31f146c399eada7124ad1d10ff8535405212735c1967973be95b27 not found: ID does not exist" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.290717 4786 scope.go:117] "RemoveContainer" containerID="e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.291054 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752"} err="failed to get container status \"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752\": rpc error: code = NotFound desc = could not find container \"e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752\": container with ID starting with e670c72de74d3bf5b18539fcc92bd82188049b31995ea7e99c7c94de8c89e752 not found: ID does not exist" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.291077 4786 scope.go:117] "RemoveContainer" containerID="59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.293711 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.306002 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: E0313 16:44:37.306569 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-metadata" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.306595 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-metadata" Mar 13 16:44:37 crc kubenswrapper[4786]: E0313 16:44:37.306610 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-log" Mar 13 
16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.306620 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-log" Mar 13 16:44:37 crc kubenswrapper[4786]: E0313 16:44:37.306652 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.306661 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.306951 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-metadata" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.307011 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.307043 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" containerName="nova-metadata-log" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.308623 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.313483 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.313679 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.316920 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.341146 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.349220 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.350264 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.353267 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.353404 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.353513 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.373898 4786 scope.go:117] "RemoveContainer" containerID="59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268" Mar 13 16:44:37 crc kubenswrapper[4786]: E0313 16:44:37.376132 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268\": container with ID starting with 59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268 not found: ID does not exist" containerID="59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.376171 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268"} err="failed to get container status \"59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268\": rpc error: code = NotFound desc = could not find container \"59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268\": container with ID starting with 59806e8a70f95bcce9c9bb68e1b59747f5d0092a56e8a61aba922e73994c5268 not found: ID does not exist" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.386227 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.419938 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420031 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r5nps\" (UniqueName: \"kubernetes.io/projected/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-kube-api-access-r5nps\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420135 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420213 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2dcv\" (UniqueName: \"kubernetes.io/projected/e137d965-4671-483c-80be-206f9c897681-kube-api-access-s2dcv\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420318 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-config-data\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.420540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-logs\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523296 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-config-data\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523428 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523622 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523661 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-logs\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523801 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.523844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5nps\" (UniqueName: \"kubernetes.io/projected/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-kube-api-access-r5nps\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 
16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.524025 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.524116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dcv\" (UniqueName: \"kubernetes.io/projected/e137d965-4671-483c-80be-206f9c897681-kube-api-access-s2dcv\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.526956 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-logs\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.538283 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.538363 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.538485 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.538599 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-config-data\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.541212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5nps\" (UniqueName: \"kubernetes.io/projected/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-kube-api-access-r5nps\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.542357 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.543886 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.547761 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e137d965-4671-483c-80be-206f9c897681-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.548539 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dcv\" (UniqueName: \"kubernetes.io/projected/e137d965-4671-483c-80be-206f9c897681-kube-api-access-s2dcv\") pod \"nova-cell1-novncproxy-0\" (UID: \"e137d965-4671-483c-80be-206f9c897681\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.677414 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.690716 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:37 crc kubenswrapper[4786]: I0313 16:44:37.756700 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 16:44:38 crc kubenswrapper[4786]: W0313 16:44:38.178895 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode137d965_4671_483c_80be_206f9c897681.slice/crio-56755256d5126a402a47c5d7e93e4457dcae83e202c88d220b48e5965d6f2fe0 WatchSource:0}: Error finding container 56755256d5126a402a47c5d7e93e4457dcae83e202c88d220b48e5965d6f2fe0: Status 404 returned error can't find the container with id 56755256d5126a402a47c5d7e93e4457dcae83e202c88d220b48e5965d6f2fe0 Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.179635 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.233667 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.233925 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"e137d965-4671-483c-80be-206f9c897681","Type":"ContainerStarted","Data":"56755256d5126a402a47c5d7e93e4457dcae83e202c88d220b48e5965d6f2fe0"} Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.235476 4786 generic.go:334] "Generic (PLEG): container finished" podID="e59cb012-12c2-4674-850a-d9638f76670d" containerID="b8cd6f5b561b22034a89ae3b3d6f259bb4169e8a9c5c2988ba829fe9ebde792a" exitCode=0 Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.235538 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmn4t" event={"ID":"e59cb012-12c2-4674-850a-d9638f76670d","Type":"ContainerDied","Data":"b8cd6f5b561b22034a89ae3b3d6f259bb4169e8a9c5c2988ba829fe9ebde792a"} Mar 13 16:44:38 crc kubenswrapper[4786]: W0313 16:44:38.266525 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda06aa5c6_6ef2_4bb2_bccc_82cb87f894f6.slice/crio-b8c8082819ff67c6cecae7cbefb0af485e69122c74e3b236d5561e63fda5dcb6 WatchSource:0}: Error finding container b8c8082819ff67c6cecae7cbefb0af485e69122c74e3b236d5561e63fda5dcb6: Status 404 returned error can't find the container with id b8c8082819ff67c6cecae7cbefb0af485e69122c74e3b236d5561e63fda5dcb6 Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.572128 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc" path="/var/lib/kubelet/pods/010f73b5-c76f-4030-8ca2-bd5dd1b5d0dc/volumes" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.574370 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188" path="/var/lib/kubelet/pods/8d9e5b51-5271-40ac-aa4d-dd5c6d3ed188/volumes" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.723646 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.853937 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-scripts\") pod \"76ef6159-b90f-418a-97d3-e256d87aefb5\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.854110 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn9sg\" (UniqueName: \"kubernetes.io/projected/76ef6159-b90f-418a-97d3-e256d87aefb5-kube-api-access-wn9sg\") pod \"76ef6159-b90f-418a-97d3-e256d87aefb5\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.854411 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-combined-ca-bundle\") pod \"76ef6159-b90f-418a-97d3-e256d87aefb5\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.854554 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-config-data\") pod \"76ef6159-b90f-418a-97d3-e256d87aefb5\" (UID: \"76ef6159-b90f-418a-97d3-e256d87aefb5\") " Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.861483 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-scripts" (OuterVolumeSpecName: "scripts") pod "76ef6159-b90f-418a-97d3-e256d87aefb5" (UID: "76ef6159-b90f-418a-97d3-e256d87aefb5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.861999 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ef6159-b90f-418a-97d3-e256d87aefb5-kube-api-access-wn9sg" (OuterVolumeSpecName: "kube-api-access-wn9sg") pod "76ef6159-b90f-418a-97d3-e256d87aefb5" (UID: "76ef6159-b90f-418a-97d3-e256d87aefb5"). InnerVolumeSpecName "kube-api-access-wn9sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.896531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76ef6159-b90f-418a-97d3-e256d87aefb5" (UID: "76ef6159-b90f-418a-97d3-e256d87aefb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.904375 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-config-data" (OuterVolumeSpecName: "config-data") pod "76ef6159-b90f-418a-97d3-e256d87aefb5" (UID: "76ef6159-b90f-418a-97d3-e256d87aefb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.959800 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.959835 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.959848 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn9sg\" (UniqueName: \"kubernetes.io/projected/76ef6159-b90f-418a-97d3-e256d87aefb5-kube-api-access-wn9sg\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:38 crc kubenswrapper[4786]: I0313 16:44:38.959875 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76ef6159-b90f-418a-97d3-e256d87aefb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.259166 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" event={"ID":"76ef6159-b90f-418a-97d3-e256d87aefb5","Type":"ContainerDied","Data":"91f6715198195ca477b3574043cc9b4ee9e812a9409f187a725c175c8c732181"} Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.259552 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91f6715198195ca477b3574043cc9b4ee9e812a9409f187a725c175c8c732181" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.259639 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dkm6m" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.273319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e137d965-4671-483c-80be-206f9c897681","Type":"ContainerStarted","Data":"d66f0492d9df9585df689692684fad429899c73a33f17b22ec93924636170717"} Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.287197 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6","Type":"ContainerStarted","Data":"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18"} Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.287245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6","Type":"ContainerStarted","Data":"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5"} Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.287260 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6","Type":"ContainerStarted","Data":"b8c8082819ff67c6cecae7cbefb0af485e69122c74e3b236d5561e63fda5dcb6"} Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.354101 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 16:44:39 crc kubenswrapper[4786]: E0313 16:44:39.354532 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ef6159-b90f-418a-97d3-e256d87aefb5" containerName="nova-cell1-conductor-db-sync" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.354548 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ef6159-b90f-418a-97d3-e256d87aefb5" containerName="nova-cell1-conductor-db-sync" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.354843 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="76ef6159-b90f-418a-97d3-e256d87aefb5" containerName="nova-cell1-conductor-db-sync" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.356465 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.358342 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.360682 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.3606527760000002 podStartE2EDuration="2.360652776s" podCreationTimestamp="2026-03-13 16:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:39.338622123 +0000 UTC m=+6109.501833974" watchObservedRunningTime="2026-03-13 16:44:39.360652776 +0000 UTC m=+6109.523864587" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.395464 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.39544549 podStartE2EDuration="2.39544549s" podCreationTimestamp="2026-03-13 16:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:39.357026365 +0000 UTC m=+6109.520238176" watchObservedRunningTime="2026-03-13 16:44:39.39544549 +0000 UTC m=+6109.558657291" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.403544 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.476881 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xcm\" (UniqueName: 
\"kubernetes.io/projected/6d725d27-695e-4244-8ce7-440d167598ce-kube-api-access-r2xcm\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.476996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d725d27-695e-4244-8ce7-440d167598ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.477060 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d725d27-695e-4244-8ce7-440d167598ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: E0313 16:44:39.513711 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76ef6159_b90f_418a_97d3_e256d87aefb5.slice/crio-91f6715198195ca477b3574043cc9b4ee9e812a9409f187a725c175c8c732181\": RecentStats: unable to find data in memory cache]" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.578459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d725d27-695e-4244-8ce7-440d167598ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.578726 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6d725d27-695e-4244-8ce7-440d167598ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.578800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xcm\" (UniqueName: \"kubernetes.io/projected/6d725d27-695e-4244-8ce7-440d167598ce-kube-api-access-r2xcm\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.582842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d725d27-695e-4244-8ce7-440d167598ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.584178 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d725d27-695e-4244-8ce7-440d167598ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.595626 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xcm\" (UniqueName: \"kubernetes.io/projected/6d725d27-695e-4244-8ce7-440d167598ce-kube-api-access-r2xcm\") pod \"nova-cell1-conductor-0\" (UID: \"6d725d27-695e-4244-8ce7-440d167598ce\") " pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.671699 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.695107 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.781573 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-scripts\") pod \"e59cb012-12c2-4674-850a-d9638f76670d\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.781695 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psdd6\" (UniqueName: \"kubernetes.io/projected/e59cb012-12c2-4674-850a-d9638f76670d-kube-api-access-psdd6\") pod \"e59cb012-12c2-4674-850a-d9638f76670d\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.781760 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-combined-ca-bundle\") pod \"e59cb012-12c2-4674-850a-d9638f76670d\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.781805 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-config-data\") pod \"e59cb012-12c2-4674-850a-d9638f76670d\" (UID: \"e59cb012-12c2-4674-850a-d9638f76670d\") " Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.786586 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-scripts" (OuterVolumeSpecName: "scripts") pod "e59cb012-12c2-4674-850a-d9638f76670d" (UID: "e59cb012-12c2-4674-850a-d9638f76670d"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.795529 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59cb012-12c2-4674-850a-d9638f76670d-kube-api-access-psdd6" (OuterVolumeSpecName: "kube-api-access-psdd6") pod "e59cb012-12c2-4674-850a-d9638f76670d" (UID: "e59cb012-12c2-4674-850a-d9638f76670d"). InnerVolumeSpecName "kube-api-access-psdd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.828234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-config-data" (OuterVolumeSpecName: "config-data") pod "e59cb012-12c2-4674-850a-d9638f76670d" (UID: "e59cb012-12c2-4674-850a-d9638f76670d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.831695 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e59cb012-12c2-4674-850a-d9638f76670d" (UID: "e59cb012-12c2-4674-850a-d9638f76670d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.884019 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psdd6\" (UniqueName: \"kubernetes.io/projected/e59cb012-12c2-4674-850a-d9638f76670d-kube-api-access-psdd6\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.884048 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.884058 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:39 crc kubenswrapper[4786]: I0313 16:44:39.884066 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59cb012-12c2-4674-850a-d9638f76670d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.189556 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 16:44:40 crc kubenswrapper[4786]: W0313 16:44:40.191675 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d725d27_695e_4244_8ce7_440d167598ce.slice/crio-d7cfebb996451e3a1202c6c8a7bbb44dc267590aefc7289d98adfa4fa878b2f5 WatchSource:0}: Error finding container d7cfebb996451e3a1202c6c8a7bbb44dc267590aefc7289d98adfa4fa878b2f5: Status 404 returned error can't find the container with id d7cfebb996451e3a1202c6c8a7bbb44dc267590aefc7289d98adfa4fa878b2f5 Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.298421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"6d725d27-695e-4244-8ce7-440d167598ce","Type":"ContainerStarted","Data":"d7cfebb996451e3a1202c6c8a7bbb44dc267590aefc7289d98adfa4fa878b2f5"} Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.306999 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mmn4t" Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.308003 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mmn4t" event={"ID":"e59cb012-12c2-4674-850a-d9638f76670d","Type":"ContainerDied","Data":"73ff562c1490c47a9ce94165a2a046802147501ef4a018f5a1cc03cf97a068a3"} Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.308064 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ff562c1490c47a9ce94165a2a046802147501ef4a018f5a1cc03cf97a068a3" Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.398288 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.399009 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-log" containerID="cri-o://688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468" gracePeriod=30 Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.399106 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-api" containerID="cri-o://9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0" gracePeriod=30 Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.408658 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.408911 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" containerName="nova-scheduler-scheduler" containerID="cri-o://8cc7c9c6f5950c2df01da0736fa4f81716ad6b8874a67f7ea5f6ac5a050f7a3a" gracePeriod=30 Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.430507 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:40 crc kubenswrapper[4786]: I0313 16:44:40.972705 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.119113 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jslr\" (UniqueName: \"kubernetes.io/projected/736c2655-9bc2-4f40-833a-995e03e75412-kube-api-access-7jslr\") pod \"736c2655-9bc2-4f40-833a-995e03e75412\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.119284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-combined-ca-bundle\") pod \"736c2655-9bc2-4f40-833a-995e03e75412\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.119395 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-config-data\") pod \"736c2655-9bc2-4f40-833a-995e03e75412\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.119577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/736c2655-9bc2-4f40-833a-995e03e75412-logs\") pod \"736c2655-9bc2-4f40-833a-995e03e75412\" (UID: \"736c2655-9bc2-4f40-833a-995e03e75412\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 
16:44:41.120452 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736c2655-9bc2-4f40-833a-995e03e75412-logs" (OuterVolumeSpecName: "logs") pod "736c2655-9bc2-4f40-833a-995e03e75412" (UID: "736c2655-9bc2-4f40-833a-995e03e75412"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.124966 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736c2655-9bc2-4f40-833a-995e03e75412-kube-api-access-7jslr" (OuterVolumeSpecName: "kube-api-access-7jslr") pod "736c2655-9bc2-4f40-833a-995e03e75412" (UID: "736c2655-9bc2-4f40-833a-995e03e75412"). InnerVolumeSpecName "kube-api-access-7jslr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.151931 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "736c2655-9bc2-4f40-833a-995e03e75412" (UID: "736c2655-9bc2-4f40-833a-995e03e75412"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.159146 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-config-data" (OuterVolumeSpecName: "config-data") pod "736c2655-9bc2-4f40-833a-995e03e75412" (UID: "736c2655-9bc2-4f40-833a-995e03e75412"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.222413 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.222444 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/736c2655-9bc2-4f40-833a-995e03e75412-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.222454 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jslr\" (UniqueName: \"kubernetes.io/projected/736c2655-9bc2-4f40-833a-995e03e75412-kube-api-access-7jslr\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.222464 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736c2655-9bc2-4f40-833a-995e03e75412-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.322462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d725d27-695e-4244-8ce7-440d167598ce","Type":"ContainerStarted","Data":"71aaceceaa851a6793143e792a7235c84a4edb30ba231642556bf35379fb0a51"} Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.322738 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326001 4786 generic.go:334] "Generic (PLEG): container finished" podID="736c2655-9bc2-4f40-833a-995e03e75412" containerID="9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0" exitCode=0 Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326031 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="736c2655-9bc2-4f40-833a-995e03e75412" containerID="688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468" exitCode=143 Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326147 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326226 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"736c2655-9bc2-4f40-833a-995e03e75412","Type":"ContainerDied","Data":"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0"} Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326234 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-log" containerID="cri-o://64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5" gracePeriod=30 Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326258 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"736c2655-9bc2-4f40-833a-995e03e75412","Type":"ContainerDied","Data":"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468"} Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326272 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"736c2655-9bc2-4f40-833a-995e03e75412","Type":"ContainerDied","Data":"edce4d527562fc8c6ebceaa87bf66d4aec085be8f2a0bc48b8616595ce22639f"} Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326290 4786 scope.go:117] "RemoveContainer" containerID="9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.326350 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-metadata" 
containerID="cri-o://40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18" gracePeriod=30 Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.348752 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.348732494 podStartE2EDuration="2.348732494s" podCreationTimestamp="2026-03-13 16:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:41.347351599 +0000 UTC m=+6111.510563430" watchObservedRunningTime="2026-03-13 16:44:41.348732494 +0000 UTC m=+6111.511944305" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.368510 4786 scope.go:117] "RemoveContainer" containerID="688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.382324 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.396127 4786 scope.go:117] "RemoveContainer" containerID="9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.396426 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:41 crc kubenswrapper[4786]: E0313 16:44:41.397912 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0\": container with ID starting with 9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0 not found: ID does not exist" containerID="9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.397962 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0"} err="failed to get container status \"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0\": rpc error: code = NotFound desc = could not find container \"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0\": container with ID starting with 9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0 not found: ID does not exist" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.397991 4786 scope.go:117] "RemoveContainer" containerID="688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468" Mar 13 16:44:41 crc kubenswrapper[4786]: E0313 16:44:41.398596 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468\": container with ID starting with 688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468 not found: ID does not exist" containerID="688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.398619 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468"} err="failed to get container status \"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468\": rpc error: code = NotFound desc = could not find container \"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468\": container with ID starting with 688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468 not found: ID does not exist" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.398658 4786 scope.go:117] "RemoveContainer" containerID="9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.399117 4786 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0"} err="failed to get container status \"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0\": rpc error: code = NotFound desc = could not find container \"9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0\": container with ID starting with 9c5498c87568492861babe9bbbc8b170efd49e3c779dfff744c5994aecf358c0 not found: ID does not exist" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.399138 4786 scope.go:117] "RemoveContainer" containerID="688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.399314 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468"} err="failed to get container status \"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468\": rpc error: code = NotFound desc = could not find container \"688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468\": container with ID starting with 688e7b33e14319e4c8f2f179600a26dec1da3c57e7125a926aa9aea25f49f468 not found: ID does not exist" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.417220 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:41 crc kubenswrapper[4786]: E0313 16:44:41.417914 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-log" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.418002 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-log" Mar 13 16:44:41 crc kubenswrapper[4786]: E0313 16:44:41.418092 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-api" Mar 13 
16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.418169 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-api" Mar 13 16:44:41 crc kubenswrapper[4786]: E0313 16:44:41.418238 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59cb012-12c2-4674-850a-d9638f76670d" containerName="nova-manage" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.418289 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59cb012-12c2-4674-850a-d9638f76670d" containerName="nova-manage" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.418496 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-log" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.418730 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="736c2655-9bc2-4f40-833a-995e03e75412" containerName="nova-api-api" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.418788 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59cb012-12c2-4674-850a-d9638f76670d" containerName="nova-manage" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.419867 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.422564 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.442303 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.527012 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.527081 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-config-data\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.527124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmr9\" (UniqueName: \"kubernetes.io/projected/37c31938-f62c-4b24-8e63-d62be4ffd979-kube-api-access-klmr9\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.527177 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c31938-f62c-4b24-8e63-d62be4ffd979-logs\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.628985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.629081 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-config-data\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.629137 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmr9\" (UniqueName: \"kubernetes.io/projected/37c31938-f62c-4b24-8e63-d62be4ffd979-kube-api-access-klmr9\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.629248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c31938-f62c-4b24-8e63-d62be4ffd979-logs\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.630830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c31938-f62c-4b24-8e63-d62be4ffd979-logs\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.636439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.645379 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmr9\" (UniqueName: \"kubernetes.io/projected/37c31938-f62c-4b24-8e63-d62be4ffd979-kube-api-access-klmr9\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.647223 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-config-data\") pod \"nova-api-0\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.751131 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.781876 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.832159 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-combined-ca-bundle\") pod \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.832256 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5nps\" (UniqueName: \"kubernetes.io/projected/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-kube-api-access-r5nps\") pod \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.832367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-config-data\") pod \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\" 
(UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.832463 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-logs\") pod \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.832499 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-nova-metadata-tls-certs\") pod \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\" (UID: \"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6\") " Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.834266 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-logs" (OuterVolumeSpecName: "logs") pod "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" (UID: "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.838448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-kube-api-access-r5nps" (OuterVolumeSpecName: "kube-api-access-r5nps") pod "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" (UID: "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6"). InnerVolumeSpecName "kube-api-access-r5nps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.858235 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" (UID: "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.871449 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-config-data" (OuterVolumeSpecName: "config-data") pod "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" (UID: "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.882449 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" (UID: "a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.934607 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5nps\" (UniqueName: \"kubernetes.io/projected/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-kube-api-access-r5nps\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.934645 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.934656 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.934664 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:41 crc kubenswrapper[4786]: I0313 16:44:41.934674 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.188555 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:44:42 crc kubenswrapper[4786]: W0313 16:44:42.200169 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37c31938_f62c_4b24_8e63_d62be4ffd979.slice/crio-35efdadba78b2b4d07e94ed99f59446ee21590420e0fb84ece9e18ed92c9f460 WatchSource:0}: Error finding container 35efdadba78b2b4d07e94ed99f59446ee21590420e0fb84ece9e18ed92c9f460: Status 404 returned error can't find the container with id 35efdadba78b2b4d07e94ed99f59446ee21590420e0fb84ece9e18ed92c9f460 Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.338762 4786 generic.go:334] "Generic (PLEG): container finished" podID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerID="40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18" exitCode=0 Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.339125 4786 generic.go:334] "Generic (PLEG): container finished" podID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerID="64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5" exitCode=143 Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.338908 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6","Type":"ContainerDied","Data":"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18"} Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.338939 4786 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.339224 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6","Type":"ContainerDied","Data":"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5"} Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.339249 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6","Type":"ContainerDied","Data":"b8c8082819ff67c6cecae7cbefb0af485e69122c74e3b236d5561e63fda5dcb6"} Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.339277 4786 scope.go:117] "RemoveContainer" containerID="40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.342391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37c31938-f62c-4b24-8e63-d62be4ffd979","Type":"ContainerStarted","Data":"35efdadba78b2b4d07e94ed99f59446ee21590420e0fb84ece9e18ed92c9f460"} Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.426245 4786 scope.go:117] "RemoveContainer" containerID="64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.451766 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.462144 4786 scope.go:117] "RemoveContainer" containerID="40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18" Mar 13 16:44:42 crc kubenswrapper[4786]: E0313 16:44:42.467509 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18\": container with ID starting with 
40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18 not found: ID does not exist" containerID="40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.467590 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18"} err="failed to get container status \"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18\": rpc error: code = NotFound desc = could not find container \"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18\": container with ID starting with 40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18 not found: ID does not exist" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.467626 4786 scope.go:117] "RemoveContainer" containerID="64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5" Mar 13 16:44:42 crc kubenswrapper[4786]: E0313 16:44:42.468073 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5\": container with ID starting with 64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5 not found: ID does not exist" containerID="64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.468153 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5"} err="failed to get container status \"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5\": rpc error: code = NotFound desc = could not find container \"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5\": container with ID starting with 64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5 not found: ID does not 
exist" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.468189 4786 scope.go:117] "RemoveContainer" containerID="40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.468758 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18"} err="failed to get container status \"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18\": rpc error: code = NotFound desc = could not find container \"40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18\": container with ID starting with 40eccc6c099a7fc96dfdef51b5aac261c07fc48393d7d1d2c4ae32d4ec2f2a18 not found: ID does not exist" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.468787 4786 scope.go:117] "RemoveContainer" containerID="64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.469675 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5"} err="failed to get container status \"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5\": rpc error: code = NotFound desc = could not find container \"64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5\": container with ID starting with 64cf7cb8ed5c387266c9c2e32085f96ae9504b846e4ff41644c55b8026ed10f5 not found: ID does not exist" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.477165 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.502835 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:42 crc kubenswrapper[4786]: E0313 16:44:42.505080 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-metadata" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.505137 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-metadata" Mar 13 16:44:42 crc kubenswrapper[4786]: E0313 16:44:42.505168 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-log" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.505177 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-log" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.505514 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-log" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.505546 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" containerName="nova-metadata-metadata" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.506895 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.509345 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.509430 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.519573 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.571465 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736c2655-9bc2-4f40-833a-995e03e75412" path="/var/lib/kubelet/pods/736c2655-9bc2-4f40-833a-995e03e75412/volumes" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.572086 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6" path="/var/lib/kubelet/pods/a06aa5c6-6ef2-4bb2-bccc-82cb87f894f6/volumes" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.649416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.649811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jtc\" (UniqueName: \"kubernetes.io/projected/38351364-0d7c-4fc8-929c-667a43a0d1d5-kube-api-access-62jtc\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.649896 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.649923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-config-data\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.649970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38351364-0d7c-4fc8-929c-667a43a0d1d5-logs\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.690947 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.752248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jtc\" (UniqueName: \"kubernetes.io/projected/38351364-0d7c-4fc8-929c-667a43a0d1d5-kube-api-access-62jtc\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.752467 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.752554 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-config-data\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.752635 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38351364-0d7c-4fc8-929c-667a43a0d1d5-logs\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.752736 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.753187 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38351364-0d7c-4fc8-929c-667a43a0d1d5-logs\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.757832 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-config-data\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.758151 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " 
pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.774207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.774414 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.785112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jtc\" (UniqueName: \"kubernetes.io/projected/38351364-0d7c-4fc8-929c-667a43a0d1d5-kube-api-access-62jtc\") pod \"nova-metadata-0\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " pod="openstack/nova-metadata-0" Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.857532 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-597c7dfdb7-2vd4x"] Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.857743 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerName="dnsmasq-dns" containerID="cri-o://0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b" gracePeriod=10 Mar 13 16:44:42 crc kubenswrapper[4786]: I0313 16:44:42.965223 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.288125 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.354231 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37c31938-f62c-4b24-8e63-d62be4ffd979","Type":"ContainerStarted","Data":"18c1725004adb2f6ac319320e2173050db4d350615b923df4e8262535b235ada"} Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.354277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37c31938-f62c-4b24-8e63-d62be4ffd979","Type":"ContainerStarted","Data":"649ea058453d1956c1535745c50ac7c2030b4348780a37c90204a531b9eb9db2"} Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.356365 4786 generic.go:334] "Generic (PLEG): container finished" podID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerID="0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b" exitCode=0 Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.356413 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" event={"ID":"2502a5af-0301-419d-8d0d-0847b30ec60d","Type":"ContainerDied","Data":"0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b"} Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.356452 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" event={"ID":"2502a5af-0301-419d-8d0d-0847b30ec60d","Type":"ContainerDied","Data":"fc8bed9422ea239f2d30ac2c8f5c16ebfdec9ca13fb7316b086bbce2a640067b"} Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.356478 4786 scope.go:117] "RemoveContainer" containerID="0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.356633 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597c7dfdb7-2vd4x" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.362050 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-sb\") pod \"2502a5af-0301-419d-8d0d-0847b30ec60d\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.362101 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-config\") pod \"2502a5af-0301-419d-8d0d-0847b30ec60d\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.362245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pdgp\" (UniqueName: \"kubernetes.io/projected/2502a5af-0301-419d-8d0d-0847b30ec60d-kube-api-access-5pdgp\") pod \"2502a5af-0301-419d-8d0d-0847b30ec60d\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.362295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-nb\") pod \"2502a5af-0301-419d-8d0d-0847b30ec60d\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.362329 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-dns-svc\") pod \"2502a5af-0301-419d-8d0d-0847b30ec60d\" (UID: \"2502a5af-0301-419d-8d0d-0847b30ec60d\") " Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.382796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2502a5af-0301-419d-8d0d-0847b30ec60d-kube-api-access-5pdgp" (OuterVolumeSpecName: "kube-api-access-5pdgp") pod "2502a5af-0301-419d-8d0d-0847b30ec60d" (UID: "2502a5af-0301-419d-8d0d-0847b30ec60d"). InnerVolumeSpecName "kube-api-access-5pdgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.388005 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.387987168 podStartE2EDuration="2.387987168s" podCreationTimestamp="2026-03-13 16:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:43.368707844 +0000 UTC m=+6113.531919695" watchObservedRunningTime="2026-03-13 16:44:43.387987168 +0000 UTC m=+6113.551198979" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.408695 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2502a5af-0301-419d-8d0d-0847b30ec60d" (UID: "2502a5af-0301-419d-8d0d-0847b30ec60d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.413571 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2502a5af-0301-419d-8d0d-0847b30ec60d" (UID: "2502a5af-0301-419d-8d0d-0847b30ec60d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.414391 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-config" (OuterVolumeSpecName: "config") pod "2502a5af-0301-419d-8d0d-0847b30ec60d" (UID: "2502a5af-0301-419d-8d0d-0847b30ec60d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.418311 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2502a5af-0301-419d-8d0d-0847b30ec60d" (UID: "2502a5af-0301-419d-8d0d-0847b30ec60d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.464278 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.464529 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.464540 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pdgp\" (UniqueName: \"kubernetes.io/projected/2502a5af-0301-419d-8d0d-0847b30ec60d-kube-api-access-5pdgp\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.464550 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:43 crc 
kubenswrapper[4786]: I0313 16:44:43.464559 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2502a5af-0301-419d-8d0d-0847b30ec60d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.498389 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.543632 4786 scope.go:117] "RemoveContainer" containerID="1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.572492 4786 scope.go:117] "RemoveContainer" containerID="0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b" Mar 13 16:44:43 crc kubenswrapper[4786]: E0313 16:44:43.573002 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b\": container with ID starting with 0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b not found: ID does not exist" containerID="0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.573044 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b"} err="failed to get container status \"0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b\": rpc error: code = NotFound desc = could not find container \"0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b\": container with ID starting with 0d434e5abb9764441c2155cd62cfb21b00ba88bb07695cbe4c1851b9c090328b not found: ID does not exist" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.573073 4786 scope.go:117] "RemoveContainer" containerID="1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65" Mar 13 16:44:43 crc 
kubenswrapper[4786]: E0313 16:44:43.573374 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65\": container with ID starting with 1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65 not found: ID does not exist" containerID="1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.573414 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65"} err="failed to get container status \"1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65\": rpc error: code = NotFound desc = could not find container \"1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65\": container with ID starting with 1486f83f20345c7c77381c84e8b0ad7db64936f813c9bf9691e1add278d84b65 not found: ID does not exist" Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.693578 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-597c7dfdb7-2vd4x"] Mar 13 16:44:43 crc kubenswrapper[4786]: I0313 16:44:43.703713 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-597c7dfdb7-2vd4x"] Mar 13 16:44:44 crc kubenswrapper[4786]: I0313 16:44:44.367118 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"38351364-0d7c-4fc8-929c-667a43a0d1d5","Type":"ContainerStarted","Data":"3371ce4c2d8f58b8edf50baeb77cccffdff4f5577f83cacfcb5f55a37af2f18e"} Mar 13 16:44:44 crc kubenswrapper[4786]: I0313 16:44:44.367462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"38351364-0d7c-4fc8-929c-667a43a0d1d5","Type":"ContainerStarted","Data":"153cabb8b59e451b933432c1d7c9ae6a49a0ab7e657d106e1e152723a69a126e"} Mar 13 16:44:44 crc 
kubenswrapper[4786]: I0313 16:44:44.367488 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"38351364-0d7c-4fc8-929c-667a43a0d1d5","Type":"ContainerStarted","Data":"e3549ae4ae4f40803c0c9d59c400f57637a44a8f2cd13b769f69d5ed8bd4f790"}
Mar 13 16:44:44 crc kubenswrapper[4786]: I0313 16:44:44.391324 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.391299285 podStartE2EDuration="2.391299285s" podCreationTimestamp="2026-03-13 16:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:44.383784877 +0000 UTC m=+6114.546996728" watchObservedRunningTime="2026-03-13 16:44:44.391299285 +0000 UTC m=+6114.554511106"
Mar 13 16:44:44 crc kubenswrapper[4786]: I0313 16:44:44.571050 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" path="/var/lib/kubelet/pods/2502a5af-0301-419d-8d0d-0847b30ec60d/volumes"
Mar 13 16:44:46 crc kubenswrapper[4786]: I0313 16:44:46.552257 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:44:46 crc kubenswrapper[4786]: E0313 16:44:46.552531 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:44:47 crc kubenswrapper[4786]: I0313 16:44:47.692071 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 16:44:47 crc kubenswrapper[4786]: I0313 16:44:47.721425 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 16:44:48 crc kubenswrapper[4786]: I0313 16:44:48.457455 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 16:44:49 crc kubenswrapper[4786]: I0313 16:44:49.731479 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.267682 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xrdtt"]
Mar 13 16:44:50 crc kubenswrapper[4786]: E0313 16:44:50.268133 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerName="dnsmasq-dns"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.268155 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerName="dnsmasq-dns"
Mar 13 16:44:50 crc kubenswrapper[4786]: E0313 16:44:50.268174 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerName="init"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.268182 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerName="init"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.268396 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2502a5af-0301-419d-8d0d-0847b30ec60d" containerName="dnsmasq-dns"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.269172 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.270696 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.272155 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.283197 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xrdtt"]
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.420976 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-scripts\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.421404 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/68020e92-0f8a-4061-8308-a3a20f80b4f3-kube-api-access-8g98w\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.421546 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-config-data\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.421635 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.523435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.523578 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-scripts\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.523606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/68020e92-0f8a-4061-8308-a3a20f80b4f3-kube-api-access-8g98w\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.523652 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-config-data\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.526895 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.527081 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.529973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.538485 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-config-data\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.542147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-scripts\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.545702 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/68020e92-0f8a-4061-8308-a3a20f80b4f3-kube-api-access-8g98w\") pod \"nova-cell1-cell-mapping-xrdtt\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") " pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:50 crc kubenswrapper[4786]: I0313 16:44:50.596918 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:51 crc kubenswrapper[4786]: I0313 16:44:51.058599 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xrdtt"]
Mar 13 16:44:51 crc kubenswrapper[4786]: I0313 16:44:51.453286 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xrdtt" event={"ID":"68020e92-0f8a-4061-8308-a3a20f80b4f3","Type":"ContainerStarted","Data":"94bbeea2fbe22940c761ac71078fe115aa8dc96a579d0c317c11767385a4334a"}
Mar 13 16:44:51 crc kubenswrapper[4786]: I0313 16:44:51.453535 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xrdtt" event={"ID":"68020e92-0f8a-4061-8308-a3a20f80b4f3","Type":"ContainerStarted","Data":"2dae901e9ea7eff98ec5761c17a1053959a2f05eaea9da9978c45872c3b9a32b"}
Mar 13 16:44:51 crc kubenswrapper[4786]: I0313 16:44:51.466479 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xrdtt" podStartSLOduration=1.4664581700000001 podStartE2EDuration="1.46645817s" podCreationTimestamp="2026-03-13 16:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:44:51.46569297 +0000 UTC m=+6121.628904781" watchObservedRunningTime="2026-03-13 16:44:51.46645817 +0000 UTC m=+6121.629669981"
Mar 13 16:44:51 crc kubenswrapper[4786]: I0313 16:44:51.751878 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 16:44:51 crc kubenswrapper[4786]: I0313 16:44:51.752189 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 16:44:52 crc kubenswrapper[4786]: I0313 16:44:52.793085 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.127:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 16:44:52 crc kubenswrapper[4786]: I0313 16:44:52.834087 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.127:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 16:44:52 crc kubenswrapper[4786]: I0313 16:44:52.967923 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 16:44:52 crc kubenswrapper[4786]: I0313 16:44:52.968405 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 16:44:53 crc kubenswrapper[4786]: I0313 16:44:53.980082 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.128:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 16:44:53 crc kubenswrapper[4786]: I0313 16:44:53.980126 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.128:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 16:44:56 crc kubenswrapper[4786]: I0313 16:44:56.507434 4786 generic.go:334] "Generic (PLEG): container finished" podID="68020e92-0f8a-4061-8308-a3a20f80b4f3" containerID="94bbeea2fbe22940c761ac71078fe115aa8dc96a579d0c317c11767385a4334a" exitCode=0
Mar 13 16:44:56 crc kubenswrapper[4786]: I0313 16:44:56.507522 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xrdtt" event={"ID":"68020e92-0f8a-4061-8308-a3a20f80b4f3","Type":"ContainerDied","Data":"94bbeea2fbe22940c761ac71078fe115aa8dc96a579d0c317c11767385a4334a"}
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.885077 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.986375 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-combined-ca-bundle\") pod \"68020e92-0f8a-4061-8308-a3a20f80b4f3\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") "
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.986511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-scripts\") pod \"68020e92-0f8a-4061-8308-a3a20f80b4f3\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") "
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.986598 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-config-data\") pod \"68020e92-0f8a-4061-8308-a3a20f80b4f3\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") "
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.986661 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/68020e92-0f8a-4061-8308-a3a20f80b4f3-kube-api-access-8g98w\") pod \"68020e92-0f8a-4061-8308-a3a20f80b4f3\" (UID: \"68020e92-0f8a-4061-8308-a3a20f80b4f3\") "
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.992141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68020e92-0f8a-4061-8308-a3a20f80b4f3-kube-api-access-8g98w" (OuterVolumeSpecName: "kube-api-access-8g98w") pod "68020e92-0f8a-4061-8308-a3a20f80b4f3" (UID: "68020e92-0f8a-4061-8308-a3a20f80b4f3"). InnerVolumeSpecName "kube-api-access-8g98w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:44:57 crc kubenswrapper[4786]: I0313 16:44:57.992455 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-scripts" (OuterVolumeSpecName: "scripts") pod "68020e92-0f8a-4061-8308-a3a20f80b4f3" (UID: "68020e92-0f8a-4061-8308-a3a20f80b4f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.031624 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68020e92-0f8a-4061-8308-a3a20f80b4f3" (UID: "68020e92-0f8a-4061-8308-a3a20f80b4f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.037246 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-config-data" (OuterVolumeSpecName: "config-data") pod "68020e92-0f8a-4061-8308-a3a20f80b4f3" (UID: "68020e92-0f8a-4061-8308-a3a20f80b4f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.088522 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.088560 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g98w\" (UniqueName: \"kubernetes.io/projected/68020e92-0f8a-4061-8308-a3a20f80b4f3-kube-api-access-8g98w\") on node \"crc\" DevicePath \"\""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.088570 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.088580 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68020e92-0f8a-4061-8308-a3a20f80b4f3-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.540214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xrdtt" event={"ID":"68020e92-0f8a-4061-8308-a3a20f80b4f3","Type":"ContainerDied","Data":"2dae901e9ea7eff98ec5761c17a1053959a2f05eaea9da9978c45872c3b9a32b"}
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.540667 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dae901e9ea7eff98ec5761c17a1053959a2f05eaea9da9978c45872c3b9a32b"
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.540679 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xrdtt"
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.727980 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.728191 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-log" containerID="cri-o://649ea058453d1956c1535745c50ac7c2030b4348780a37c90204a531b9eb9db2" gracePeriod=30
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.728582 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-api" containerID="cri-o://18c1725004adb2f6ac319320e2173050db4d350615b923df4e8262535b235ada" gracePeriod=30
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.773607 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.774201 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-log" containerID="cri-o://153cabb8b59e451b933432c1d7c9ae6a49a0ab7e657d106e1e152723a69a126e" gracePeriod=30
Mar 13 16:44:58 crc kubenswrapper[4786]: I0313 16:44:58.774333 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-metadata" containerID="cri-o://3371ce4c2d8f58b8edf50baeb77cccffdff4f5577f83cacfcb5f55a37af2f18e" gracePeriod=30
Mar 13 16:44:59 crc kubenswrapper[4786]: I0313 16:44:59.551320 4786 generic.go:334] "Generic (PLEG): container finished" podID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerID="153cabb8b59e451b933432c1d7c9ae6a49a0ab7e657d106e1e152723a69a126e" exitCode=143
Mar 13 16:44:59 crc kubenswrapper[4786]: I0313 16:44:59.551664 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"38351364-0d7c-4fc8-929c-667a43a0d1d5","Type":"ContainerDied","Data":"153cabb8b59e451b933432c1d7c9ae6a49a0ab7e657d106e1e152723a69a126e"}
Mar 13 16:44:59 crc kubenswrapper[4786]: I0313 16:44:59.554037 4786 generic.go:334] "Generic (PLEG): container finished" podID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerID="649ea058453d1956c1535745c50ac7c2030b4348780a37c90204a531b9eb9db2" exitCode=143
Mar 13 16:44:59 crc kubenswrapper[4786]: I0313 16:44:59.554067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37c31938-f62c-4b24-8e63-d62be4ffd979","Type":"ContainerDied","Data":"649ea058453d1956c1535745c50ac7c2030b4348780a37c90204a531b9eb9db2"}
Mar 13 16:44:59 crc kubenswrapper[4786]: I0313 16:44:59.751356 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 16:44:59 crc kubenswrapper[4786]: I0313 16:44:59.751413 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.162747 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"]
Mar 13 16:45:00 crc kubenswrapper[4786]: E0313 16:45:00.163481 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68020e92-0f8a-4061-8308-a3a20f80b4f3" containerName="nova-manage"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.163516 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="68020e92-0f8a-4061-8308-a3a20f80b4f3" containerName="nova-manage"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.163920 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="68020e92-0f8a-4061-8308-a3a20f80b4f3" containerName="nova-manage"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.164957 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.167629 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.167884 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.178189 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"]
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.232258 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d726e9d-7bef-48d1-96cf-14b0622c6b65-config-volume\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.232461 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d726e9d-7bef-48d1-96cf-14b0622c6b65-secret-volume\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.232555 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djblj\" (UniqueName: \"kubernetes.io/projected/1d726e9d-7bef-48d1-96cf-14b0622c6b65-kube-api-access-djblj\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.335086 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d726e9d-7bef-48d1-96cf-14b0622c6b65-config-volume\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.335269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d726e9d-7bef-48d1-96cf-14b0622c6b65-secret-volume\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.335330 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djblj\" (UniqueName: \"kubernetes.io/projected/1d726e9d-7bef-48d1-96cf-14b0622c6b65-kube-api-access-djblj\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.336790 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d726e9d-7bef-48d1-96cf-14b0622c6b65-config-volume\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.342364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d726e9d-7bef-48d1-96cf-14b0622c6b65-secret-volume\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.361130 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djblj\" (UniqueName: \"kubernetes.io/projected/1d726e9d-7bef-48d1-96cf-14b0622c6b65-kube-api-access-djblj\") pod \"collect-profiles-29557005-8t7qz\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.502624 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.561904 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:45:00 crc kubenswrapper[4786]: E0313 16:45:00.562502 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.829085 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"]
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.965793 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 16:45:00 crc kubenswrapper[4786]: I0313 16:45:00.966372 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 16:45:01 crc kubenswrapper[4786]: I0313 16:45:01.580699 4786 generic.go:334] "Generic (PLEG): container finished" podID="1d726e9d-7bef-48d1-96cf-14b0622c6b65" containerID="f866b38ab4b57269ef19915f62a3ef8593abe1f3d8d0a10bec98606f2ea48105" exitCode=0
Mar 13 16:45:01 crc kubenswrapper[4786]: I0313 16:45:01.580802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz" event={"ID":"1d726e9d-7bef-48d1-96cf-14b0622c6b65","Type":"ContainerDied","Data":"f866b38ab4b57269ef19915f62a3ef8593abe1f3d8d0a10bec98606f2ea48105"}
Mar 13 16:45:01 crc kubenswrapper[4786]: I0313 16:45:01.581208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz" event={"ID":"1d726e9d-7bef-48d1-96cf-14b0622c6b65","Type":"ContainerStarted","Data":"12d99011d410f3d7ce93a2606fca8988ba6f97c24408bc432b08ba7fd636f983"}
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.946589 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.988431 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djblj\" (UniqueName: \"kubernetes.io/projected/1d726e9d-7bef-48d1-96cf-14b0622c6b65-kube-api-access-djblj\") pod \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") "
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.988522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d726e9d-7bef-48d1-96cf-14b0622c6b65-config-volume\") pod \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") "
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.988583 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d726e9d-7bef-48d1-96cf-14b0622c6b65-secret-volume\") pod \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\" (UID: \"1d726e9d-7bef-48d1-96cf-14b0622c6b65\") "
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.991334 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d726e9d-7bef-48d1-96cf-14b0622c6b65-config-volume" (OuterVolumeSpecName: "config-volume") pod "1d726e9d-7bef-48d1-96cf-14b0622c6b65" (UID: "1d726e9d-7bef-48d1-96cf-14b0622c6b65"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.997042 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d726e9d-7bef-48d1-96cf-14b0622c6b65-kube-api-access-djblj" (OuterVolumeSpecName: "kube-api-access-djblj") pod "1d726e9d-7bef-48d1-96cf-14b0622c6b65" (UID: "1d726e9d-7bef-48d1-96cf-14b0622c6b65"). InnerVolumeSpecName "kube-api-access-djblj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:45:02 crc kubenswrapper[4786]: I0313 16:45:02.998164 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d726e9d-7bef-48d1-96cf-14b0622c6b65-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1d726e9d-7bef-48d1-96cf-14b0622c6b65" (UID: "1d726e9d-7bef-48d1-96cf-14b0622c6b65"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:45:03 crc kubenswrapper[4786]: I0313 16:45:03.090341 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d726e9d-7bef-48d1-96cf-14b0622c6b65-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 16:45:03 crc kubenswrapper[4786]: I0313 16:45:03.090382 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djblj\" (UniqueName: \"kubernetes.io/projected/1d726e9d-7bef-48d1-96cf-14b0622c6b65-kube-api-access-djblj\") on node \"crc\" DevicePath \"\""
Mar 13 16:45:03 crc kubenswrapper[4786]: I0313 16:45:03.090393 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d726e9d-7bef-48d1-96cf-14b0622c6b65-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 16:45:03 crc kubenswrapper[4786]: I0313 16:45:03.613689 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz" event={"ID":"1d726e9d-7bef-48d1-96cf-14b0622c6b65","Type":"ContainerDied","Data":"12d99011d410f3d7ce93a2606fca8988ba6f97c24408bc432b08ba7fd636f983"}
Mar 13 16:45:03 crc kubenswrapper[4786]: I0313 16:45:03.614224 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d99011d410f3d7ce93a2606fca8988ba6f97c24408bc432b08ba7fd636f983"
Mar 13 16:45:03 crc kubenswrapper[4786]: I0313 16:45:03.613791 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557005-8t7qz"
Mar 13 16:45:04 crc kubenswrapper[4786]: I0313 16:45:04.021334 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr"]
Mar 13 16:45:04 crc kubenswrapper[4786]: I0313 16:45:04.029168 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556960-zsvtr"]
Mar 13 16:45:04 crc kubenswrapper[4786]: I0313 16:45:04.570225 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142a1d24-73ca-476f-92c9-0983ac69687b" path="/var/lib/kubelet/pods/142a1d24-73ca-476f-92c9-0983ac69687b/volumes"
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.709517 4786 generic.go:334] "Generic (PLEG): container finished" podID="4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" containerID="8cc7c9c6f5950c2df01da0736fa4f81716ad6b8874a67f7ea5f6ac5a050f7a3a" exitCode=137
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.709789 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f","Type":"ContainerDied","Data":"8cc7c9c6f5950c2df01da0736fa4f81716ad6b8874a67f7ea5f6ac5a050f7a3a"}
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.833461 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.943620 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwx4\" (UniqueName: \"kubernetes.io/projected/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-kube-api-access-qlwx4\") pod \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") "
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.943698 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-config-data\") pod \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") "
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.943839 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-combined-ca-bundle\") pod \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\" (UID: \"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f\") "
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.957193 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-kube-api-access-qlwx4" (OuterVolumeSpecName: "kube-api-access-qlwx4") pod "4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" (UID: "4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f"). InnerVolumeSpecName "kube-api-access-qlwx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.982022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-config-data" (OuterVolumeSpecName: "config-data") pod "4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" (UID: "4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:45:10 crc kubenswrapper[4786]: I0313 16:45:10.982963 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" (UID: "4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.046343 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwx4\" (UniqueName: \"kubernetes.io/projected/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-kube-api-access-qlwx4\") on node \"crc\" DevicePath \"\""
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.046373 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.046384 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.725306 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f","Type":"ContainerDied","Data":"a429714ae3ec84fbf898f69a1599470383c6e9ec1cdd8b78d85a37ec5a3ba108"}
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.725708 4786 scope.go:117] "RemoveContainer" containerID="8cc7c9c6f5950c2df01da0736fa4f81716ad6b8874a67f7ea5f6ac5a050f7a3a"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.726050 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.778937 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.798503 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.816421 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 16:45:11 crc kubenswrapper[4786]: E0313 16:45:11.817066 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d726e9d-7bef-48d1-96cf-14b0622c6b65" containerName="collect-profiles"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.817097 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d726e9d-7bef-48d1-96cf-14b0622c6b65" containerName="collect-profiles"
Mar 13 16:45:11 crc kubenswrapper[4786]: E0313 16:45:11.817167 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" containerName="nova-scheduler-scheduler"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.817181 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" containerName="nova-scheduler-scheduler"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.817471 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d726e9d-7bef-48d1-96cf-14b0622c6b65" containerName="collect-profiles"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.817494 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" containerName="nova-scheduler-scheduler"
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.835708 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.835879 4786 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.840526 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.962647 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7tcs\" (UniqueName: \"kubernetes.io/projected/17b94d7f-1927-4369-a6df-460f0c896ced-kube-api-access-d7tcs\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.962697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17b94d7f-1927-4369-a6df-460f0c896ced-config-data\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:11 crc kubenswrapper[4786]: I0313 16:45:11.962872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b94d7f-1927-4369-a6df-460f0c896ced-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.064727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b94d7f-1927-4369-a6df-460f0c896ced-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.065097 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7tcs\" (UniqueName: 
\"kubernetes.io/projected/17b94d7f-1927-4369-a6df-460f0c896ced-kube-api-access-d7tcs\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.065153 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17b94d7f-1927-4369-a6df-460f0c896ced-config-data\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.074147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17b94d7f-1927-4369-a6df-460f0c896ced-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.075135 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17b94d7f-1927-4369-a6df-460f0c896ced-config-data\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.096703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7tcs\" (UniqueName: \"kubernetes.io/projected/17b94d7f-1927-4369-a6df-460f0c896ced-kube-api-access-d7tcs\") pod \"nova-scheduler-0\" (UID: \"17b94d7f-1927-4369-a6df-460f0c896ced\") " pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.159212 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.562111 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f" path="/var/lib/kubelet/pods/4f3ece23-fee4-42c3-8c8f-cafd6b5cfa7f/volumes" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.739265 4786 generic.go:334] "Generic (PLEG): container finished" podID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerID="3371ce4c2d8f58b8edf50baeb77cccffdff4f5577f83cacfcb5f55a37af2f18e" exitCode=0 Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.739585 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"38351364-0d7c-4fc8-929c-667a43a0d1d5","Type":"ContainerDied","Data":"3371ce4c2d8f58b8edf50baeb77cccffdff4f5577f83cacfcb5f55a37af2f18e"} Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.741196 4786 generic.go:334] "Generic (PLEG): container finished" podID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerID="18c1725004adb2f6ac319320e2173050db4d350615b923df4e8262535b235ada" exitCode=0 Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.741227 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37c31938-f62c-4b24-8e63-d62be4ffd979","Type":"ContainerDied","Data":"18c1725004adb2f6ac319320e2173050db4d350615b923df4e8262535b235ada"} Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.789277 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.795501 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.795706 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.883221 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klmr9\" (UniqueName: \"kubernetes.io/projected/37c31938-f62c-4b24-8e63-d62be4ffd979-kube-api-access-klmr9\") pod \"37c31938-f62c-4b24-8e63-d62be4ffd979\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.883342 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-config-data\") pod \"37c31938-f62c-4b24-8e63-d62be4ffd979\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.883382 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-combined-ca-bundle\") pod \"37c31938-f62c-4b24-8e63-d62be4ffd979\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.883468 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c31938-f62c-4b24-8e63-d62be4ffd979-logs\") pod \"37c31938-f62c-4b24-8e63-d62be4ffd979\" (UID: \"37c31938-f62c-4b24-8e63-d62be4ffd979\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.884351 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c31938-f62c-4b24-8e63-d62be4ffd979-logs" (OuterVolumeSpecName: "logs") pod "37c31938-f62c-4b24-8e63-d62be4ffd979" (UID: "37c31938-f62c-4b24-8e63-d62be4ffd979"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.888597 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c31938-f62c-4b24-8e63-d62be4ffd979-kube-api-access-klmr9" (OuterVolumeSpecName: "kube-api-access-klmr9") pod "37c31938-f62c-4b24-8e63-d62be4ffd979" (UID: "37c31938-f62c-4b24-8e63-d62be4ffd979"). InnerVolumeSpecName "kube-api-access-klmr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.911297 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-config-data" (OuterVolumeSpecName: "config-data") pod "37c31938-f62c-4b24-8e63-d62be4ffd979" (UID: "37c31938-f62c-4b24-8e63-d62be4ffd979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.913188 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37c31938-f62c-4b24-8e63-d62be4ffd979" (UID: "37c31938-f62c-4b24-8e63-d62be4ffd979"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.985555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-combined-ca-bundle\") pod \"38351364-0d7c-4fc8-929c-667a43a0d1d5\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.985733 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62jtc\" (UniqueName: \"kubernetes.io/projected/38351364-0d7c-4fc8-929c-667a43a0d1d5-kube-api-access-62jtc\") pod \"38351364-0d7c-4fc8-929c-667a43a0d1d5\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.985820 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-nova-metadata-tls-certs\") pod \"38351364-0d7c-4fc8-929c-667a43a0d1d5\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.985905 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-config-data\") pod \"38351364-0d7c-4fc8-929c-667a43a0d1d5\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.986015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38351364-0d7c-4fc8-929c-667a43a0d1d5-logs\") pod \"38351364-0d7c-4fc8-929c-667a43a0d1d5\" (UID: \"38351364-0d7c-4fc8-929c-667a43a0d1d5\") " Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.986723 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/37c31938-f62c-4b24-8e63-d62be4ffd979-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.986758 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klmr9\" (UniqueName: \"kubernetes.io/projected/37c31938-f62c-4b24-8e63-d62be4ffd979-kube-api-access-klmr9\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.986781 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.986801 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c31938-f62c-4b24-8e63-d62be4ffd979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.988420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38351364-0d7c-4fc8-929c-667a43a0d1d5-logs" (OuterVolumeSpecName: "logs") pod "38351364-0d7c-4fc8-929c-667a43a0d1d5" (UID: "38351364-0d7c-4fc8-929c-667a43a0d1d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:45:12 crc kubenswrapper[4786]: I0313 16:45:12.990131 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38351364-0d7c-4fc8-929c-667a43a0d1d5-kube-api-access-62jtc" (OuterVolumeSpecName: "kube-api-access-62jtc") pod "38351364-0d7c-4fc8-929c-667a43a0d1d5" (UID: "38351364-0d7c-4fc8-929c-667a43a0d1d5"). InnerVolumeSpecName "kube-api-access-62jtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.020775 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38351364-0d7c-4fc8-929c-667a43a0d1d5" (UID: "38351364-0d7c-4fc8-929c-667a43a0d1d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.021163 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-config-data" (OuterVolumeSpecName: "config-data") pod "38351364-0d7c-4fc8-929c-667a43a0d1d5" (UID: "38351364-0d7c-4fc8-929c-667a43a0d1d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.041940 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "38351364-0d7c-4fc8-929c-667a43a0d1d5" (UID: "38351364-0d7c-4fc8-929c-667a43a0d1d5"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.088486 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.088536 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62jtc\" (UniqueName: \"kubernetes.io/projected/38351364-0d7c-4fc8-929c-667a43a0d1d5-kube-api-access-62jtc\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.088557 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.088575 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38351364-0d7c-4fc8-929c-667a43a0d1d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.088589 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38351364-0d7c-4fc8-929c-667a43a0d1d5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.754087 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"17b94d7f-1927-4369-a6df-460f0c896ced","Type":"ContainerStarted","Data":"ec93360c5a7cef724524f7b3f5ceb3c6165a547751add1496662ca9becfbddb7"} Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.754480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"17b94d7f-1927-4369-a6df-460f0c896ced","Type":"ContainerStarted","Data":"85053b2504e85faad3c48db903181714c7fcec47c2fd7defc3fe8135746f0000"} Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.768253 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.771616 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37c31938-f62c-4b24-8e63-d62be4ffd979","Type":"ContainerDied","Data":"35efdadba78b2b4d07e94ed99f59446ee21590420e0fb84ece9e18ed92c9f460"} Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.771674 4786 scope.go:117] "RemoveContainer" containerID="18c1725004adb2f6ac319320e2173050db4d350615b923df4e8262535b235ada" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.779718 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.779692941 podStartE2EDuration="2.779692941s" podCreationTimestamp="2026-03-13 16:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:45:13.777199628 +0000 UTC m=+6143.940411449" watchObservedRunningTime="2026-03-13 16:45:13.779692941 +0000 UTC m=+6143.942904772" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.781723 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"38351364-0d7c-4fc8-929c-667a43a0d1d5","Type":"ContainerDied","Data":"e3549ae4ae4f40803c0c9d59c400f57637a44a8f2cd13b769f69d5ed8bd4f790"} Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.781787 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.812150 4786 scope.go:117] "RemoveContainer" containerID="649ea058453d1956c1535745c50ac7c2030b4348780a37c90204a531b9eb9db2" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.827536 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.852157 4786 scope.go:117] "RemoveContainer" containerID="3371ce4c2d8f58b8edf50baeb77cccffdff4f5577f83cacfcb5f55a37af2f18e" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.854520 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.896994 4786 scope.go:117] "RemoveContainer" containerID="153cabb8b59e451b933432c1d7c9ae6a49a0ab7e657d106e1e152723a69a126e" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.908690 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: E0313 16:45:13.909362 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-log" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909378 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-log" Mar 13 16:45:13 crc kubenswrapper[4786]: E0313 16:45:13.909408 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-metadata" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909414 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-metadata" Mar 13 16:45:13 crc kubenswrapper[4786]: E0313 16:45:13.909440 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-api" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909446 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-api" Mar 13 16:45:13 crc kubenswrapper[4786]: E0313 16:45:13.909466 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-log" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909472 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-log" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909762 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-log" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909785 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" containerName="nova-api-api" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909794 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-log" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.909825 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" containerName="nova-metadata-metadata" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.911715 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.920271 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.920503 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.931677 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.944559 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.952166 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.954437 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.956315 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.959482 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 16:45:13 crc kubenswrapper[4786]: I0313 16:45:13.961783 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.013806 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-logs\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.013885 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ww84t\" (UniqueName: \"kubernetes.io/projected/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-kube-api-access-ww84t\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.013918 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b947k\" (UniqueName: \"kubernetes.io/projected/183127c3-ab56-4da7-8459-34b1d6060c41-kube-api-access-b947k\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.013950 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.013986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-config-data\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.014007 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-config-data\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.014158 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.014414 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/183127c3-ab56-4da7-8459-34b1d6060c41-logs\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.014485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116093 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/183127c3-ab56-4da7-8459-34b1d6060c41-logs\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116157 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116213 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-logs\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc 
kubenswrapper[4786]: I0313 16:45:14.116246 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww84t\" (UniqueName: \"kubernetes.io/projected/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-kube-api-access-ww84t\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116278 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b947k\" (UniqueName: \"kubernetes.io/projected/183127c3-ab56-4da7-8459-34b1d6060c41-kube-api-access-b947k\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116308 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116347 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-config-data\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116369 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-config-data\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116423 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/183127c3-ab56-4da7-8459-34b1d6060c41-logs\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.116720 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-logs\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.120396 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.120561 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-config-data\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.120764 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.120891 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-config-data\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.125345 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/183127c3-ab56-4da7-8459-34b1d6060c41-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.133643 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww84t\" (UniqueName: \"kubernetes.io/projected/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-kube-api-access-ww84t\") pod \"nova-api-0\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.136614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b947k\" (UniqueName: \"kubernetes.io/projected/183127c3-ab56-4da7-8459-34b1d6060c41-kube-api-access-b947k\") pod \"nova-metadata-0\" (UID: \"183127c3-ab56-4da7-8459-34b1d6060c41\") " pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.237532 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.274060 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.552982 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:45:14 crc kubenswrapper[4786]: E0313 16:45:14.553225 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.568436 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c31938-f62c-4b24-8e63-d62be4ffd979" path="/var/lib/kubelet/pods/37c31938-f62c-4b24-8e63-d62be4ffd979/volumes" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.569707 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38351364-0d7c-4fc8-929c-667a43a0d1d5" path="/var/lib/kubelet/pods/38351364-0d7c-4fc8-929c-667a43a0d1d5/volumes" Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.711071 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.777736 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.796424 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7","Type":"ContainerStarted","Data":"991076cee1ec7d33b7a309b0e0331a071c41c168b536459a8b1a40d74bd22ac9"} Mar 13 16:45:14 crc kubenswrapper[4786]: I0313 16:45:14.807710 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"183127c3-ab56-4da7-8459-34b1d6060c41","Type":"ContainerStarted","Data":"d6aa2c1597a48e1dd77fbdd0728f166c75b579e4184fa0cc1b08a80708c2ccf8"} Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.522292 4786 scope.go:117] "RemoveContainer" containerID="53d2dc6a268adc8be9eaabbb78446fc5080e6fdd81e1b81f979048c7557b2b51" Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.826056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"183127c3-ab56-4da7-8459-34b1d6060c41","Type":"ContainerStarted","Data":"53c66dc76be978b91014a47c88a9a428a0343a6a4d28c159b06ec74b56059192"} Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.826115 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"183127c3-ab56-4da7-8459-34b1d6060c41","Type":"ContainerStarted","Data":"bbf0f8e5c4afaea383254222226fa1b6e7bbeb0aa9251e335ee33abbe3acdbc3"} Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.830958 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7","Type":"ContainerStarted","Data":"158a903cf8a94c95e5b9445ed1ec0be24950dc442c2878f02df3ca251490de89"} Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.831019 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7","Type":"ContainerStarted","Data":"9b1fc6f09c429f55880c25a7bfdfe8af45f64e0424d0a681e4bf81cbc20ecb27"} Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.859951 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.859930274 podStartE2EDuration="2.859930274s" podCreationTimestamp="2026-03-13 16:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:45:15.858055767 +0000 UTC 
m=+6146.021267588" watchObservedRunningTime="2026-03-13 16:45:15.859930274 +0000 UTC m=+6146.023142085" Mar 13 16:45:15 crc kubenswrapper[4786]: I0313 16:45:15.890068 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8900457299999998 podStartE2EDuration="2.89004573s" podCreationTimestamp="2026-03-13 16:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:45:15.886518652 +0000 UTC m=+6146.049730493" watchObservedRunningTime="2026-03-13 16:45:15.89004573 +0000 UTC m=+6146.053257541" Mar 13 16:45:17 crc kubenswrapper[4786]: I0313 16:45:17.159659 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 16:45:22 crc kubenswrapper[4786]: I0313 16:45:22.160242 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 16:45:22 crc kubenswrapper[4786]: I0313 16:45:22.210718 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 16:45:22 crc kubenswrapper[4786]: I0313 16:45:22.944888 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 16:45:24 crc kubenswrapper[4786]: I0313 16:45:24.237807 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 16:45:24 crc kubenswrapper[4786]: I0313 16:45:24.238294 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 16:45:24 crc kubenswrapper[4786]: I0313 16:45:24.274891 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 16:45:24 crc kubenswrapper[4786]: I0313 16:45:24.274963 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Mar 13 16:45:25 crc kubenswrapper[4786]: I0313 16:45:25.320317 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.132:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 16:45:25 crc kubenswrapper[4786]: I0313 16:45:25.338095 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.132:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 16:45:25 crc kubenswrapper[4786]: I0313 16:45:25.338152 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="183127c3-ab56-4da7-8459-34b1d6060c41" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.133:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 16:45:25 crc kubenswrapper[4786]: I0313 16:45:25.338252 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="183127c3-ab56-4da7-8459-34b1d6060c41" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.133:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 16:45:25 crc kubenswrapper[4786]: I0313 16:45:25.552635 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:45:25 crc kubenswrapper[4786]: E0313 16:45:25.552963 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:45:32 crc kubenswrapper[4786]: I0313 16:45:32.238623 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 16:45:32 crc kubenswrapper[4786]: I0313 16:45:32.239500 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 16:45:32 crc kubenswrapper[4786]: I0313 16:45:32.274380 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 16:45:32 crc kubenswrapper[4786]: I0313 16:45:32.274460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 16:45:34 crc kubenswrapper[4786]: I0313 16:45:34.246207 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 16:45:34 crc kubenswrapper[4786]: I0313 16:45:34.248093 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 16:45:34 crc kubenswrapper[4786]: I0313 16:45:34.253472 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 16:45:34 crc kubenswrapper[4786]: I0313 16:45:34.296608 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 16:45:34 crc kubenswrapper[4786]: I0313 16:45:34.297369 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 16:45:34 crc kubenswrapper[4786]: I0313 16:45:34.310018 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.055249 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.058654 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.338621 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559978d967-l5m4z"] Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.347601 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.355066 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559978d967-l5m4z"] Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.495080 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-ovsdbserver-sb\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.495463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-ovsdbserver-nb\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.495494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-config\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.495518 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-dns-svc\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.495568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2sl\" (UniqueName: \"kubernetes.io/projected/100899d2-4eab-4a4b-a33d-435515ece751-kube-api-access-wr2sl\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.598033 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2sl\" (UniqueName: \"kubernetes.io/projected/100899d2-4eab-4a4b-a33d-435515ece751-kube-api-access-wr2sl\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.598196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-ovsdbserver-sb\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.598233 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-ovsdbserver-nb\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.598266 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-config\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.598288 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-dns-svc\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.599263 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-dns-svc\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.599306 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-ovsdbserver-nb\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.599308 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-ovsdbserver-sb\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.599445 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/100899d2-4eab-4a4b-a33d-435515ece751-config\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.627268 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2sl\" (UniqueName: \"kubernetes.io/projected/100899d2-4eab-4a4b-a33d-435515ece751-kube-api-access-wr2sl\") pod \"dnsmasq-dns-559978d967-l5m4z\" (UID: \"100899d2-4eab-4a4b-a33d-435515ece751\") " pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:35 crc kubenswrapper[4786]: I0313 16:45:35.675600 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:36 crc kubenswrapper[4786]: I0313 16:45:36.168334 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559978d967-l5m4z"] Mar 13 16:45:36 crc kubenswrapper[4786]: W0313 16:45:36.171771 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod100899d2_4eab_4a4b_a33d_435515ece751.slice/crio-af4b6fd24e14ab725956fe1733a7b74b48a9b0525acc145283971f77cb011dc6 WatchSource:0}: Error finding container af4b6fd24e14ab725956fe1733a7b74b48a9b0525acc145283971f77cb011dc6: Status 404 returned error can't find the container with id af4b6fd24e14ab725956fe1733a7b74b48a9b0525acc145283971f77cb011dc6 Mar 13 16:45:36 crc kubenswrapper[4786]: I0313 16:45:36.551948 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:45:36 crc kubenswrapper[4786]: E0313 16:45:36.552207 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:45:37 crc kubenswrapper[4786]: I0313 16:45:37.067011 4786 generic.go:334] "Generic (PLEG): container finished" podID="100899d2-4eab-4a4b-a33d-435515ece751" containerID="6f0988103e2375c96f89535c3a9c969cdc513a8efc7ec0e7fac55e8761c36a34" exitCode=0 Mar 13 16:45:37 crc kubenswrapper[4786]: I0313 16:45:37.067124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559978d967-l5m4z" event={"ID":"100899d2-4eab-4a4b-a33d-435515ece751","Type":"ContainerDied","Data":"6f0988103e2375c96f89535c3a9c969cdc513a8efc7ec0e7fac55e8761c36a34"} Mar 13 16:45:37 crc kubenswrapper[4786]: I0313 16:45:37.067221 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559978d967-l5m4z" event={"ID":"100899d2-4eab-4a4b-a33d-435515ece751","Type":"ContainerStarted","Data":"af4b6fd24e14ab725956fe1733a7b74b48a9b0525acc145283971f77cb011dc6"} Mar 13 16:45:37 crc kubenswrapper[4786]: I0313 16:45:37.605927 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:38 crc kubenswrapper[4786]: I0313 16:45:38.081347 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559978d967-l5m4z" event={"ID":"100899d2-4eab-4a4b-a33d-435515ece751","Type":"ContainerStarted","Data":"d53fe169d06b600879acb4a7ab912a2fc446ae9cf8dc00f820f2fc27e32f10be"} Mar 13 16:45:38 crc kubenswrapper[4786]: I0313 16:45:38.081457 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-log" containerID="cri-o://9b1fc6f09c429f55880c25a7bfdfe8af45f64e0424d0a681e4bf81cbc20ecb27" gracePeriod=30 Mar 13 16:45:38 crc kubenswrapper[4786]: I0313 16:45:38.081548 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-api" containerID="cri-o://158a903cf8a94c95e5b9445ed1ec0be24950dc442c2878f02df3ca251490de89" gracePeriod=30 Mar 13 16:45:38 crc kubenswrapper[4786]: I0313 16:45:38.110423 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-559978d967-l5m4z" podStartSLOduration=3.110407678 podStartE2EDuration="3.110407678s" podCreationTimestamp="2026-03-13 16:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:45:38.107355362 +0000 UTC m=+6168.270567173" watchObservedRunningTime="2026-03-13 16:45:38.110407678 +0000 UTC m=+6168.273619489" Mar 13 16:45:39 crc kubenswrapper[4786]: I0313 16:45:39.093010 4786 generic.go:334] "Generic (PLEG): container finished" podID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerID="9b1fc6f09c429f55880c25a7bfdfe8af45f64e0424d0a681e4bf81cbc20ecb27" exitCode=143 Mar 13 16:45:39 crc kubenswrapper[4786]: I0313 16:45:39.093216 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7","Type":"ContainerDied","Data":"9b1fc6f09c429f55880c25a7bfdfe8af45f64e0424d0a681e4bf81cbc20ecb27"} Mar 13 16:45:39 crc kubenswrapper[4786]: I0313 16:45:39.093571 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:41 crc kubenswrapper[4786]: E0313 16:45:41.480816 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb8a1c5b_50b6_493a_bdfb_fd3b0d6d73f7.slice/crio-conmon-158a903cf8a94c95e5b9445ed1ec0be24950dc442c2878f02df3ca251490de89.scope\": RecentStats: unable to find data in memory cache]" Mar 
13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.130098 4786 generic.go:334] "Generic (PLEG): container finished" podID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerID="158a903cf8a94c95e5b9445ed1ec0be24950dc442c2878f02df3ca251490de89" exitCode=0 Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.130327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7","Type":"ContainerDied","Data":"158a903cf8a94c95e5b9445ed1ec0be24950dc442c2878f02df3ca251490de89"} Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.348093 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.450719 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-logs\") pod \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.450963 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-config-data\") pod \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.451051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww84t\" (UniqueName: \"kubernetes.io/projected/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-kube-api-access-ww84t\") pod \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.451278 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-combined-ca-bundle\") pod \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\" (UID: \"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7\") " Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.451507 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-logs" (OuterVolumeSpecName: "logs") pod "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" (UID: "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.451985 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.460394 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-kube-api-access-ww84t" (OuterVolumeSpecName: "kube-api-access-ww84t") pod "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" (UID: "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7"). InnerVolumeSpecName "kube-api-access-ww84t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.481556 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" (UID: "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.499319 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-config-data" (OuterVolumeSpecName: "config-data") pod "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" (UID: "fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.553678 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.553727 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww84t\" (UniqueName: \"kubernetes.io/projected/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-kube-api-access-ww84t\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:42 crc kubenswrapper[4786]: I0313 16:45:42.553744 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.140620 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7","Type":"ContainerDied","Data":"991076cee1ec7d33b7a309b0e0331a071c41c168b536459a8b1a40d74bd22ac9"} Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.140974 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.141064 4786 scope.go:117] "RemoveContainer" containerID="158a903cf8a94c95e5b9445ed1ec0be24950dc442c2878f02df3ca251490de89" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.167848 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.170896 4786 scope.go:117] "RemoveContainer" containerID="9b1fc6f09c429f55880c25a7bfdfe8af45f64e0424d0a681e4bf81cbc20ecb27" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.177132 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.200004 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:43 crc kubenswrapper[4786]: E0313 16:45:43.200481 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-api" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.200510 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-api" Mar 13 16:45:43 crc kubenswrapper[4786]: E0313 16:45:43.200544 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-log" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.200781 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-log" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.201274 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" containerName="nova-api-api" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.201325 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" 
containerName="nova-api-log" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.202907 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.205625 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.205977 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.206243 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.290663 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.370377 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-logs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.370416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.370661 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 
16:45:43.370763 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctftn\" (UniqueName: \"kubernetes.io/projected/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-kube-api-access-ctftn\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.370780 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-public-tls-certs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.370840 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-config-data\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.410289 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7f4zh"] Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.412121 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.425687 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7f4zh"] Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.472900 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-logs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.473424 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.473373 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-logs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.473570 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.473724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-public-tls-certs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc 
kubenswrapper[4786]: I0313 16:45:43.474307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctftn\" (UniqueName: \"kubernetes.io/projected/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-kube-api-access-ctftn\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.474365 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-config-data\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.480487 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.482733 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.486139 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-public-tls-certs\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.492486 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-config-data\") pod \"nova-api-0\" 
(UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.498540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctftn\" (UniqueName: \"kubernetes.io/projected/40766174-3a3b-4797-b4f8-1a3bf0e9eb7c-kube-api-access-ctftn\") pod \"nova-api-0\" (UID: \"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c\") " pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.530239 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.575658 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-catalog-content\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.575708 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-utilities\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.575744 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4nc6\" (UniqueName: \"kubernetes.io/projected/2643f396-8f5e-4143-9d1f-afa693746d58-kube-api-access-r4nc6\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.678243 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r4nc6\" (UniqueName: \"kubernetes.io/projected/2643f396-8f5e-4143-9d1f-afa693746d58-kube-api-access-r4nc6\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.680511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-catalog-content\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.680568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-utilities\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.681677 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-utilities\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.683175 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-catalog-content\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.703632 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4nc6\" (UniqueName: 
\"kubernetes.io/projected/2643f396-8f5e-4143-9d1f-afa693746d58-kube-api-access-r4nc6\") pod \"community-operators-7f4zh\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:43 crc kubenswrapper[4786]: I0313 16:45:43.745666 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:44 crc kubenswrapper[4786]: I0313 16:45:44.003533 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 16:45:44 crc kubenswrapper[4786]: I0313 16:45:44.152876 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c","Type":"ContainerStarted","Data":"c9e1a8b4ed709ac1e07e088fc6f0e306943d7a56d05fbb9d3079798829b5246a"} Mar 13 16:45:44 crc kubenswrapper[4786]: W0313 16:45:44.293182 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2643f396_8f5e_4143_9d1f_afa693746d58.slice/crio-0901bfca748c17e81e7e76caddf0f850448512c06de6d42d69a6078d7b2d4142 WatchSource:0}: Error finding container 0901bfca748c17e81e7e76caddf0f850448512c06de6d42d69a6078d7b2d4142: Status 404 returned error can't find the container with id 0901bfca748c17e81e7e76caddf0f850448512c06de6d42d69a6078d7b2d4142 Mar 13 16:45:44 crc kubenswrapper[4786]: I0313 16:45:44.296641 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7f4zh"] Mar 13 16:45:44 crc kubenswrapper[4786]: I0313 16:45:44.566568 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7" path="/var/lib/kubelet/pods/fb8a1c5b-50b6-493a-bdfb-fd3b0d6d73f7/volumes" Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.163428 4786 generic.go:334] "Generic (PLEG): container finished" podID="2643f396-8f5e-4143-9d1f-afa693746d58" 
containerID="63e9b77cf398b5a0543fb37f8c5a4789d45ff41a8472dc04a60cc91773474caa" exitCode=0 Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.163539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerDied","Data":"63e9b77cf398b5a0543fb37f8c5a4789d45ff41a8472dc04a60cc91773474caa"} Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.163573 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerStarted","Data":"0901bfca748c17e81e7e76caddf0f850448512c06de6d42d69a6078d7b2d4142"} Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.166113 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.166325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c","Type":"ContainerStarted","Data":"7d55f0a7c2741f27466f76b7c5b9dc3a8337a255fe433390ff1ead96a44b5cd3"} Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.166396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40766174-3a3b-4797-b4f8-1a3bf0e9eb7c","Type":"ContainerStarted","Data":"fd0db764121e22f7340944b3b5511ba089ea6909105a63cbb8ed6ad4ec7b00fd"} Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.205826 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.205809601 podStartE2EDuration="2.205809601s" podCreationTimestamp="2026-03-13 16:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:45:45.201084393 +0000 UTC m=+6175.364296204" watchObservedRunningTime="2026-03-13 
16:45:45.205809601 +0000 UTC m=+6175.369021412" Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.678764 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-559978d967-l5m4z" Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.739747 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfd4f4db5-7b988"] Mar 13 16:45:45 crc kubenswrapper[4786]: I0313 16:45:45.740101 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerName="dnsmasq-dns" containerID="cri-o://ee691082331590c03e45be6cc563420fdbecd6894a6ff59f77020466a4d10b5f" gracePeriod=10 Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.175656 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerStarted","Data":"8d4873881dbdd6635168d48b9979d05c082e0f19204da841e1b603eca5366782"} Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.183553 4786 generic.go:334] "Generic (PLEG): container finished" podID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerID="ee691082331590c03e45be6cc563420fdbecd6894a6ff59f77020466a4d10b5f" exitCode=0 Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.184407 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" event={"ID":"067c3518-fddf-49ed-9e5f-bc44f60d6897","Type":"ContainerDied","Data":"ee691082331590c03e45be6cc563420fdbecd6894a6ff59f77020466a4d10b5f"} Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.184520 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" event={"ID":"067c3518-fddf-49ed-9e5f-bc44f60d6897","Type":"ContainerDied","Data":"ab15bab01ad5d0c5b668b0a2b727443090e8768a3bd196c0741d9f59b3f57b07"} Mar 13 16:45:46 crc kubenswrapper[4786]: 
I0313 16:45:46.184578 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab15bab01ad5d0c5b668b0a2b727443090e8768a3bd196c0741d9f59b3f57b07" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.215478 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.346881 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-sb\") pod \"067c3518-fddf-49ed-9e5f-bc44f60d6897\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.346938 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsnz9\" (UniqueName: \"kubernetes.io/projected/067c3518-fddf-49ed-9e5f-bc44f60d6897-kube-api-access-bsnz9\") pod \"067c3518-fddf-49ed-9e5f-bc44f60d6897\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.347099 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-nb\") pod \"067c3518-fddf-49ed-9e5f-bc44f60d6897\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.347152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-config\") pod \"067c3518-fddf-49ed-9e5f-bc44f60d6897\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.347226 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-dns-svc\") pod \"067c3518-fddf-49ed-9e5f-bc44f60d6897\" (UID: \"067c3518-fddf-49ed-9e5f-bc44f60d6897\") " Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.354359 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067c3518-fddf-49ed-9e5f-bc44f60d6897-kube-api-access-bsnz9" (OuterVolumeSpecName: "kube-api-access-bsnz9") pod "067c3518-fddf-49ed-9e5f-bc44f60d6897" (UID: "067c3518-fddf-49ed-9e5f-bc44f60d6897"). InnerVolumeSpecName "kube-api-access-bsnz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.392829 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "067c3518-fddf-49ed-9e5f-bc44f60d6897" (UID: "067c3518-fddf-49ed-9e5f-bc44f60d6897"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.393185 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "067c3518-fddf-49ed-9e5f-bc44f60d6897" (UID: "067c3518-fddf-49ed-9e5f-bc44f60d6897"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.397243 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "067c3518-fddf-49ed-9e5f-bc44f60d6897" (UID: "067c3518-fddf-49ed-9e5f-bc44f60d6897"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.407667 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-config" (OuterVolumeSpecName: "config") pod "067c3518-fddf-49ed-9e5f-bc44f60d6897" (UID: "067c3518-fddf-49ed-9e5f-bc44f60d6897"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.449663 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.449710 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.449730 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.449747 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/067c3518-fddf-49ed-9e5f-bc44f60d6897-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:46 crc kubenswrapper[4786]: I0313 16:45:46.449765 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsnz9\" (UniqueName: \"kubernetes.io/projected/067c3518-fddf-49ed-9e5f-bc44f60d6897-kube-api-access-bsnz9\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:47 crc kubenswrapper[4786]: I0313 16:45:47.197543 4786 generic.go:334] "Generic (PLEG): container finished" podID="2643f396-8f5e-4143-9d1f-afa693746d58" 
containerID="8d4873881dbdd6635168d48b9979d05c082e0f19204da841e1b603eca5366782" exitCode=0 Mar 13 16:45:47 crc kubenswrapper[4786]: I0313 16:45:47.197623 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerDied","Data":"8d4873881dbdd6635168d48b9979d05c082e0f19204da841e1b603eca5366782"} Mar 13 16:45:47 crc kubenswrapper[4786]: I0313 16:45:47.197961 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfd4f4db5-7b988" Mar 13 16:45:47 crc kubenswrapper[4786]: I0313 16:45:47.253503 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfd4f4db5-7b988"] Mar 13 16:45:47 crc kubenswrapper[4786]: I0313 16:45:47.263514 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bfd4f4db5-7b988"] Mar 13 16:45:48 crc kubenswrapper[4786]: I0313 16:45:48.208647 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerStarted","Data":"412bd909264d5c3d4b77de6c3edc13cd57c4246074cf55c50adfa2a0594ec1bc"} Mar 13 16:45:48 crc kubenswrapper[4786]: I0313 16:45:48.235884 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7f4zh" podStartSLOduration=2.653657744 podStartE2EDuration="5.235837537s" podCreationTimestamp="2026-03-13 16:45:43 +0000 UTC" firstStartedPulling="2026-03-13 16:45:45.165657403 +0000 UTC m=+6175.328869234" lastFinishedPulling="2026-03-13 16:45:47.747837206 +0000 UTC m=+6177.911049027" observedRunningTime="2026-03-13 16:45:48.231516128 +0000 UTC m=+6178.394727939" watchObservedRunningTime="2026-03-13 16:45:48.235837537 +0000 UTC m=+6178.399049388" Mar 13 16:45:48 crc kubenswrapper[4786]: I0313 16:45:48.571633 4786 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" path="/var/lib/kubelet/pods/067c3518-fddf-49ed-9e5f-bc44f60d6897/volumes" Mar 13 16:45:50 crc kubenswrapper[4786]: I0313 16:45:50.558815 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:45:50 crc kubenswrapper[4786]: E0313 16:45:50.559376 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:45:53 crc kubenswrapper[4786]: I0313 16:45:53.530517 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 16:45:53 crc kubenswrapper[4786]: I0313 16:45:53.532559 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 16:45:53 crc kubenswrapper[4786]: I0313 16:45:53.746166 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:53 crc kubenswrapper[4786]: I0313 16:45:53.746224 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:53 crc kubenswrapper[4786]: I0313 16:45:53.832434 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:54 crc kubenswrapper[4786]: I0313 16:45:54.358549 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:54 crc kubenswrapper[4786]: I0313 16:45:54.549075 4786 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40766174-3a3b-4797-b4f8-1a3bf0e9eb7c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.135:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 16:45:54 crc kubenswrapper[4786]: I0313 16:45:54.549558 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40766174-3a3b-4797-b4f8-1a3bf0e9eb7c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.135:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 16:45:57 crc kubenswrapper[4786]: I0313 16:45:57.874848 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7f4zh"] Mar 13 16:45:57 crc kubenswrapper[4786]: I0313 16:45:57.875630 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7f4zh" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="registry-server" containerID="cri-o://412bd909264d5c3d4b77de6c3edc13cd57c4246074cf55c50adfa2a0594ec1bc" gracePeriod=2 Mar 13 16:45:58 crc kubenswrapper[4786]: E0313 16:45:58.009874 4786 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.12:57996->38.102.83.12:40601: read tcp 38.102.83.12:57996->38.102.83.12:40601: read: connection reset by peer Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.344568 4786 generic.go:334] "Generic (PLEG): container finished" podID="2643f396-8f5e-4143-9d1f-afa693746d58" containerID="412bd909264d5c3d4b77de6c3edc13cd57c4246074cf55c50adfa2a0594ec1bc" exitCode=0 Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.344626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" 
event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerDied","Data":"412bd909264d5c3d4b77de6c3edc13cd57c4246074cf55c50adfa2a0594ec1bc"} Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.476022 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.648282 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-utilities\") pod \"2643f396-8f5e-4143-9d1f-afa693746d58\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.648450 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-catalog-content\") pod \"2643f396-8f5e-4143-9d1f-afa693746d58\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.648555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4nc6\" (UniqueName: \"kubernetes.io/projected/2643f396-8f5e-4143-9d1f-afa693746d58-kube-api-access-r4nc6\") pod \"2643f396-8f5e-4143-9d1f-afa693746d58\" (UID: \"2643f396-8f5e-4143-9d1f-afa693746d58\") " Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.651917 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-utilities" (OuterVolumeSpecName: "utilities") pod "2643f396-8f5e-4143-9d1f-afa693746d58" (UID: "2643f396-8f5e-4143-9d1f-afa693746d58"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.654796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2643f396-8f5e-4143-9d1f-afa693746d58-kube-api-access-r4nc6" (OuterVolumeSpecName: "kube-api-access-r4nc6") pod "2643f396-8f5e-4143-9d1f-afa693746d58" (UID: "2643f396-8f5e-4143-9d1f-afa693746d58"). InnerVolumeSpecName "kube-api-access-r4nc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.731155 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2643f396-8f5e-4143-9d1f-afa693746d58" (UID: "2643f396-8f5e-4143-9d1f-afa693746d58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.751623 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.751679 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4nc6\" (UniqueName: \"kubernetes.io/projected/2643f396-8f5e-4143-9d1f-afa693746d58-kube-api-access-r4nc6\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:58 crc kubenswrapper[4786]: I0313 16:45:58.751701 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2643f396-8f5e-4143-9d1f-afa693746d58-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.364574 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f4zh" 
event={"ID":"2643f396-8f5e-4143-9d1f-afa693746d58","Type":"ContainerDied","Data":"0901bfca748c17e81e7e76caddf0f850448512c06de6d42d69a6078d7b2d4142"} Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.364627 4786 scope.go:117] "RemoveContainer" containerID="412bd909264d5c3d4b77de6c3edc13cd57c4246074cf55c50adfa2a0594ec1bc" Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.364774 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f4zh" Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.422734 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7f4zh"] Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.424468 4786 scope.go:117] "RemoveContainer" containerID="8d4873881dbdd6635168d48b9979d05c082e0f19204da841e1b603eca5366782" Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.434268 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7f4zh"] Mar 13 16:45:59 crc kubenswrapper[4786]: I0313 16:45:59.448413 4786 scope.go:117] "RemoveContainer" containerID="63e9b77cf398b5a0543fb37f8c5a4789d45ff41a8472dc04a60cc91773474caa" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.208149 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557006-4q2sc"] Mar 13 16:46:00 crc kubenswrapper[4786]: E0313 16:46:00.209045 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="extract-content" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209077 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="extract-content" Mar 13 16:46:00 crc kubenswrapper[4786]: E0313 16:46:00.209094 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerName="init" Mar 13 
16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209103 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerName="init" Mar 13 16:46:00 crc kubenswrapper[4786]: E0313 16:46:00.209140 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="registry-server" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209151 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="registry-server" Mar 13 16:46:00 crc kubenswrapper[4786]: E0313 16:46:00.209194 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerName="dnsmasq-dns" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209205 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerName="dnsmasq-dns" Mar 13 16:46:00 crc kubenswrapper[4786]: E0313 16:46:00.209227 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="extract-utilities" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209238 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="extract-utilities" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209500 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" containerName="registry-server" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.209535 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="067c3518-fddf-49ed-9e5f-bc44f60d6897" containerName="dnsmasq-dns" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.210469 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.212903 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.213833 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.216794 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.233152 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557006-4q2sc"] Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.308952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52p58\" (UniqueName: \"kubernetes.io/projected/40aa8c47-ea06-43fe-ab42-ed8406764e84-kube-api-access-52p58\") pod \"auto-csr-approver-29557006-4q2sc\" (UID: \"40aa8c47-ea06-43fe-ab42-ed8406764e84\") " pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.410130 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52p58\" (UniqueName: \"kubernetes.io/projected/40aa8c47-ea06-43fe-ab42-ed8406764e84-kube-api-access-52p58\") pod \"auto-csr-approver-29557006-4q2sc\" (UID: \"40aa8c47-ea06-43fe-ab42-ed8406764e84\") " pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.442686 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52p58\" (UniqueName: \"kubernetes.io/projected/40aa8c47-ea06-43fe-ab42-ed8406764e84-kube-api-access-52p58\") pod \"auto-csr-approver-29557006-4q2sc\" (UID: \"40aa8c47-ea06-43fe-ab42-ed8406764e84\") " 
pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.528717 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:00 crc kubenswrapper[4786]: I0313 16:46:00.567520 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2643f396-8f5e-4143-9d1f-afa693746d58" path="/var/lib/kubelet/pods/2643f396-8f5e-4143-9d1f-afa693746d58/volumes" Mar 13 16:46:01 crc kubenswrapper[4786]: W0313 16:46:01.030829 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40aa8c47_ea06_43fe_ab42_ed8406764e84.slice/crio-97403aa2057fc5cf4645fda77679f17849135dfd7f506dbdb8f4e41051f778af WatchSource:0}: Error finding container 97403aa2057fc5cf4645fda77679f17849135dfd7f506dbdb8f4e41051f778af: Status 404 returned error can't find the container with id 97403aa2057fc5cf4645fda77679f17849135dfd7f506dbdb8f4e41051f778af Mar 13 16:46:01 crc kubenswrapper[4786]: I0313 16:46:01.031193 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557006-4q2sc"] Mar 13 16:46:01 crc kubenswrapper[4786]: I0313 16:46:01.394237 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" event={"ID":"40aa8c47-ea06-43fe-ab42-ed8406764e84","Type":"ContainerStarted","Data":"97403aa2057fc5cf4645fda77679f17849135dfd7f506dbdb8f4e41051f778af"} Mar 13 16:46:01 crc kubenswrapper[4786]: I0313 16:46:01.530503 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 16:46:01 crc kubenswrapper[4786]: I0313 16:46:01.530563 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 16:46:01 crc kubenswrapper[4786]: I0313 16:46:01.554158 4786 scope.go:117] "RemoveContainer" 
containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:46:01 crc kubenswrapper[4786]: E0313 16:46:01.554613 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:46:02 crc kubenswrapper[4786]: I0313 16:46:02.404300 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" event={"ID":"40aa8c47-ea06-43fe-ab42-ed8406764e84","Type":"ContainerStarted","Data":"f3bfcf3d00095b6b6b237110f166707860784a0b3f833d97f60db20aedea3a4e"} Mar 13 16:46:02 crc kubenswrapper[4786]: I0313 16:46:02.420837 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" podStartSLOduration=1.517247844 podStartE2EDuration="2.420819855s" podCreationTimestamp="2026-03-13 16:46:00 +0000 UTC" firstStartedPulling="2026-03-13 16:46:01.037725576 +0000 UTC m=+6191.200937427" lastFinishedPulling="2026-03-13 16:46:01.941297607 +0000 UTC m=+6192.104509438" observedRunningTime="2026-03-13 16:46:02.416091956 +0000 UTC m=+6192.579303787" watchObservedRunningTime="2026-03-13 16:46:02.420819855 +0000 UTC m=+6192.584031666" Mar 13 16:46:03 crc kubenswrapper[4786]: I0313 16:46:03.426617 4786 generic.go:334] "Generic (PLEG): container finished" podID="40aa8c47-ea06-43fe-ab42-ed8406764e84" containerID="f3bfcf3d00095b6b6b237110f166707860784a0b3f833d97f60db20aedea3a4e" exitCode=0 Mar 13 16:46:03 crc kubenswrapper[4786]: I0313 16:46:03.426681 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" 
event={"ID":"40aa8c47-ea06-43fe-ab42-ed8406764e84","Type":"ContainerDied","Data":"f3bfcf3d00095b6b6b237110f166707860784a0b3f833d97f60db20aedea3a4e"} Mar 13 16:46:03 crc kubenswrapper[4786]: I0313 16:46:03.540076 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 16:46:03 crc kubenswrapper[4786]: I0313 16:46:03.541808 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 16:46:03 crc kubenswrapper[4786]: I0313 16:46:03.552595 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 16:46:04 crc kubenswrapper[4786]: I0313 16:46:04.449850 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.118769 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.205776 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52p58\" (UniqueName: \"kubernetes.io/projected/40aa8c47-ea06-43fe-ab42-ed8406764e84-kube-api-access-52p58\") pod \"40aa8c47-ea06-43fe-ab42-ed8406764e84\" (UID: \"40aa8c47-ea06-43fe-ab42-ed8406764e84\") " Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.211967 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40aa8c47-ea06-43fe-ab42-ed8406764e84-kube-api-access-52p58" (OuterVolumeSpecName: "kube-api-access-52p58") pod "40aa8c47-ea06-43fe-ab42-ed8406764e84" (UID: "40aa8c47-ea06-43fe-ab42-ed8406764e84"). InnerVolumeSpecName "kube-api-access-52p58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.307733 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52p58\" (UniqueName: \"kubernetes.io/projected/40aa8c47-ea06-43fe-ab42-ed8406764e84-kube-api-access-52p58\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.457649 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.457715 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557006-4q2sc" event={"ID":"40aa8c47-ea06-43fe-ab42-ed8406764e84","Type":"ContainerDied","Data":"97403aa2057fc5cf4645fda77679f17849135dfd7f506dbdb8f4e41051f778af"} Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.457801 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97403aa2057fc5cf4645fda77679f17849135dfd7f506dbdb8f4e41051f778af" Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.508187 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557000-q87vh"] Mar 13 16:46:05 crc kubenswrapper[4786]: I0313 16:46:05.518226 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557000-q87vh"] Mar 13 16:46:06 crc kubenswrapper[4786]: I0313 16:46:06.569835 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3" path="/var/lib/kubelet/pods/6e51eaa4-2e55-4ef6-9c44-9f4aff02abf3/volumes" Mar 13 16:46:14 crc kubenswrapper[4786]: I0313 16:46:14.551984 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:46:14 crc kubenswrapper[4786]: E0313 16:46:14.552618 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:46:15 crc kubenswrapper[4786]: I0313 16:46:15.772554 4786 scope.go:117] "RemoveContainer" containerID="eb7386a704c52262879074bdbc92fe97e0437efc339d35aada03b0b5b97614f8" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.125581 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jksdp"] Mar 13 16:46:24 crc kubenswrapper[4786]: E0313 16:46:24.127742 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40aa8c47-ea06-43fe-ab42-ed8406764e84" containerName="oc" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.127763 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="40aa8c47-ea06-43fe-ab42-ed8406764e84" containerName="oc" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.129777 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="40aa8c47-ea06-43fe-ab42-ed8406764e84" containerName="oc" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.130616 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.137282 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8vvrn" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.137500 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.140084 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.143775 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jksdp"] Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.201418 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7trr6"] Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.203300 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209310 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-ovn-controller-tls-certs\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209356 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-combined-ca-bundle\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-run-ovn\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209410 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-scripts\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209449 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l88px\" (UniqueName: \"kubernetes.io/projected/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-kube-api-access-l88px\") pod \"ovn-controller-jksdp\" (UID: 
\"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-run\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.209523 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-log-ovn\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.218472 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7trr6"] Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311791 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-ovn-controller-tls-certs\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311848 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-combined-ca-bundle\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311885 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2ead52d0-3013-43d3-976f-6db98a50ce3b-scripts\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311907 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-run-ovn\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311928 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-run\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-scripts\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.311988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l88px\" (UniqueName: \"kubernetes.io/projected/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-kube-api-access-l88px\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.312029 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-run\") pod \"ovn-controller-jksdp\" (UID: 
\"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.312048 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-etc-ovs\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.312066 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fs7s\" (UniqueName: \"kubernetes.io/projected/2ead52d0-3013-43d3-976f-6db98a50ce3b-kube-api-access-4fs7s\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.312091 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-lib\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.312116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-log-ovn\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.312157 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-log\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " 
pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.313070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-run-ovn\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.313094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-run\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.313261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-var-log-ovn\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.315247 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-scripts\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.319108 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-combined-ca-bundle\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.322120 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-ovn-controller-tls-certs\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.335383 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l88px\" (UniqueName: \"kubernetes.io/projected/ac0b2bbd-5bcb-4537-8273-09f3fdc43d61-kube-api-access-l88px\") pod \"ovn-controller-jksdp\" (UID: \"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61\") " pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-log\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413419 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ead52d0-3013-43d3-976f-6db98a50ce3b-scripts\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-run\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413503 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-etc-ovs\") pod \"ovn-controller-ovs-7trr6\" (UID: 
\"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413523 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fs7s\" (UniqueName: \"kubernetes.io/projected/2ead52d0-3013-43d3-976f-6db98a50ce3b-kube-api-access-4fs7s\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413545 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-lib\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-lib\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-run\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.413916 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-etc-ovs\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.414196 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2ead52d0-3013-43d3-976f-6db98a50ce3b-var-log\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.415733 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ead52d0-3013-43d3-976f-6db98a50ce3b-scripts\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.431073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fs7s\" (UniqueName: \"kubernetes.io/projected/2ead52d0-3013-43d3-976f-6db98a50ce3b-kube-api-access-4fs7s\") pod \"ovn-controller-ovs-7trr6\" (UID: \"2ead52d0-3013-43d3-976f-6db98a50ce3b\") " pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.462374 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jksdp" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.529505 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:24 crc kubenswrapper[4786]: I0313 16:46:24.972903 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jksdp"] Mar 13 16:46:25 crc kubenswrapper[4786]: W0313 16:46:25.428895 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ead52d0_3013_43d3_976f_6db98a50ce3b.slice/crio-598bfebd6814492d9177dca68faa9db84e9c7edca8e91fadb0e93624ef544321 WatchSource:0}: Error finding container 598bfebd6814492d9177dca68faa9db84e9c7edca8e91fadb0e93624ef544321: Status 404 returned error can't find the container with id 598bfebd6814492d9177dca68faa9db84e9c7edca8e91fadb0e93624ef544321 Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.433777 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7trr6"] Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.703776 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6gkng"] Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.705194 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.711124 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.713209 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7trr6" event={"ID":"2ead52d0-3013-43d3-976f-6db98a50ce3b","Type":"ContainerStarted","Data":"598bfebd6814492d9177dca68faa9db84e9c7edca8e91fadb0e93624ef544321"} Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.715491 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp" event={"ID":"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61","Type":"ContainerStarted","Data":"474d8e650513c657a513fc793da8eaaefc5584a460ac5387309168141863283a"} Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.715562 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp" event={"ID":"ac0b2bbd-5bcb-4537-8273-09f3fdc43d61","Type":"ContainerStarted","Data":"ea5a1c73753c16950894b4e3660cd1635879c1dea98c080aa9016c4ae1461376"} Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.715666 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jksdp" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.730669 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6gkng"] Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.767031 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jksdp" podStartSLOduration=1.765843559 podStartE2EDuration="1.765843559s" podCreationTimestamp="2026-03-13 16:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:46:25.755461018 +0000 UTC m=+6215.918672829" 
watchObservedRunningTime="2026-03-13 16:46:25.765843559 +0000 UTC m=+6215.929055370" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.844811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87gr\" (UniqueName: \"kubernetes.io/projected/937fe94a-0226-43da-963b-2c4d605b71de-kube-api-access-r87gr\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.844933 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/937fe94a-0226-43da-963b-2c4d605b71de-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.844991 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/937fe94a-0226-43da-963b-2c4d605b71de-config\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.845050 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/937fe94a-0226-43da-963b-2c4d605b71de-ovs-rundir\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.845129 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/937fe94a-0226-43da-963b-2c4d605b71de-combined-ca-bundle\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.845320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/937fe94a-0226-43da-963b-2c4d605b71de-ovn-rundir\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946456 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/937fe94a-0226-43da-963b-2c4d605b71de-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946544 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/937fe94a-0226-43da-963b-2c4d605b71de-config\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946638 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/937fe94a-0226-43da-963b-2c4d605b71de-ovs-rundir\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946675 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/937fe94a-0226-43da-963b-2c4d605b71de-combined-ca-bundle\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/937fe94a-0226-43da-963b-2c4d605b71de-ovn-rundir\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946821 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r87gr\" (UniqueName: \"kubernetes.io/projected/937fe94a-0226-43da-963b-2c4d605b71de-kube-api-access-r87gr\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/937fe94a-0226-43da-963b-2c4d605b71de-ovs-rundir\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.946963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/937fe94a-0226-43da-963b-2c4d605b71de-ovn-rundir\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.947527 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/937fe94a-0226-43da-963b-2c4d605b71de-config\") pod 
\"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.951807 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/937fe94a-0226-43da-963b-2c4d605b71de-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.951953 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937fe94a-0226-43da-963b-2c4d605b71de-combined-ca-bundle\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:25 crc kubenswrapper[4786]: I0313 16:46:25.964062 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87gr\" (UniqueName: \"kubernetes.io/projected/937fe94a-0226-43da-963b-2c4d605b71de-kube-api-access-r87gr\") pod \"ovn-controller-metrics-6gkng\" (UID: \"937fe94a-0226-43da-963b-2c4d605b71de\") " pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.039410 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6gkng" Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.348186 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6gkng"] Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.552502 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:46:26 crc kubenswrapper[4786]: E0313 16:46:26.552977 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.726038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6gkng" event={"ID":"937fe94a-0226-43da-963b-2c4d605b71de","Type":"ContainerStarted","Data":"f02f2705ed22d117449c060d2c1c13053e5a2dea3c283119c99477c0941eef04"} Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.726078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6gkng" event={"ID":"937fe94a-0226-43da-963b-2c4d605b71de","Type":"ContainerStarted","Data":"3c6cda7d9d5b8d9fecc30505ddca70f5b7755ab627b30d47da4c99b2781ac032"} Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.728331 4786 generic.go:334] "Generic (PLEG): container finished" podID="2ead52d0-3013-43d3-976f-6db98a50ce3b" containerID="eaeca82f73783e2884d61eb3bcffc05777cf144937a9d73a521b4c11c5ba1803" exitCode=0 Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.728376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7trr6" 
event={"ID":"2ead52d0-3013-43d3-976f-6db98a50ce3b","Type":"ContainerDied","Data":"eaeca82f73783e2884d61eb3bcffc05777cf144937a9d73a521b4c11c5ba1803"} Mar 13 16:46:26 crc kubenswrapper[4786]: I0313 16:46:26.754095 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6gkng" podStartSLOduration=1.754075447 podStartE2EDuration="1.754075447s" podCreationTimestamp="2026-03-13 16:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:46:26.74743039 +0000 UTC m=+6216.910642211" watchObservedRunningTime="2026-03-13 16:46:26.754075447 +0000 UTC m=+6216.917287268" Mar 13 16:46:27 crc kubenswrapper[4786]: I0313 16:46:27.746414 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7trr6" event={"ID":"2ead52d0-3013-43d3-976f-6db98a50ce3b","Type":"ContainerStarted","Data":"0318853d72bf3c7433030765fb2f45f678819705877989034a4afee4dd20e645"} Mar 13 16:46:27 crc kubenswrapper[4786]: I0313 16:46:27.746944 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7trr6" event={"ID":"2ead52d0-3013-43d3-976f-6db98a50ce3b","Type":"ContainerStarted","Data":"b303bb4d953ea7341d8afeb1b06603b97424fa2915f6662417b1fca5b11bda28"} Mar 13 16:46:27 crc kubenswrapper[4786]: I0313 16:46:27.772661 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7trr6" podStartSLOduration=3.772647148 podStartE2EDuration="3.772647148s" podCreationTimestamp="2026-03-13 16:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:46:27.7647824 +0000 UTC m=+6217.927994241" watchObservedRunningTime="2026-03-13 16:46:27.772647148 +0000 UTC m=+6217.935858959" Mar 13 16:46:28 crc kubenswrapper[4786]: I0313 16:46:28.755602 4786 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:28 crc kubenswrapper[4786]: I0313 16:46:28.756077 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:39 crc kubenswrapper[4786]: I0313 16:46:39.552934 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:46:39 crc kubenswrapper[4786]: E0313 16:46:39.554165 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:46:45 crc kubenswrapper[4786]: I0313 16:46:45.860899 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-qgn7f"] Mar 13 16:46:45 crc kubenswrapper[4786]: I0313 16:46:45.867597 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:45 crc kubenswrapper[4786]: I0313 16:46:45.876338 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-qgn7f"] Mar 13 16:46:45 crc kubenswrapper[4786]: I0313 16:46:45.972811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231c608d-b1ee-445f-af64-45aa3e82cca0-operator-scripts\") pod \"octavia-db-create-qgn7f\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:45 crc kubenswrapper[4786]: I0313 16:46:45.992458 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8p4z\" (UniqueName: \"kubernetes.io/projected/231c608d-b1ee-445f-af64-45aa3e82cca0-kube-api-access-z8p4z\") pod \"octavia-db-create-qgn7f\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.093799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8p4z\" (UniqueName: \"kubernetes.io/projected/231c608d-b1ee-445f-af64-45aa3e82cca0-kube-api-access-z8p4z\") pod \"octavia-db-create-qgn7f\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.094184 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231c608d-b1ee-445f-af64-45aa3e82cca0-operator-scripts\") pod \"octavia-db-create-qgn7f\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.094917 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/231c608d-b1ee-445f-af64-45aa3e82cca0-operator-scripts\") pod \"octavia-db-create-qgn7f\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.123124 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8p4z\" (UniqueName: \"kubernetes.io/projected/231c608d-b1ee-445f-af64-45aa3e82cca0-kube-api-access-z8p4z\") pod \"octavia-db-create-qgn7f\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.185973 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.628782 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-qgn7f"] Mar 13 16:46:46 crc kubenswrapper[4786]: W0313 16:46:46.633569 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231c608d_b1ee_445f_af64_45aa3e82cca0.slice/crio-4d787e6f6a41e5337f2d03f3916976773c09aa81653c25baa1e3bf0f9ae21080 WatchSource:0}: Error finding container 4d787e6f6a41e5337f2d03f3916976773c09aa81653c25baa1e3bf0f9ae21080: Status 404 returned error can't find the container with id 4d787e6f6a41e5337f2d03f3916976773c09aa81653c25baa1e3bf0f9ae21080 Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.801789 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-9515-account-create-update-nqqng"] Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.803174 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.805915 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.833979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9515-account-create-update-nqqng"] Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.907951 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d662abaa-4326-48f5-b0ff-7ab37206b48c-operator-scripts\") pod \"octavia-9515-account-create-update-nqqng\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.908403 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz842\" (UniqueName: \"kubernetes.io/projected/d662abaa-4326-48f5-b0ff-7ab37206b48c-kube-api-access-mz842\") pod \"octavia-9515-account-create-update-nqqng\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.987601 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-qgn7f" event={"ID":"231c608d-b1ee-445f-af64-45aa3e82cca0","Type":"ContainerStarted","Data":"536dd126e5528f984a7ded958ac4998ac0db1c246ddbfd37f21c88773e2ad582"} Mar 13 16:46:46 crc kubenswrapper[4786]: I0313 16:46:46.987655 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-qgn7f" event={"ID":"231c608d-b1ee-445f-af64-45aa3e82cca0","Type":"ContainerStarted","Data":"4d787e6f6a41e5337f2d03f3916976773c09aa81653c25baa1e3bf0f9ae21080"} Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.010671 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz842\" (UniqueName: \"kubernetes.io/projected/d662abaa-4326-48f5-b0ff-7ab37206b48c-kube-api-access-mz842\") pod \"octavia-9515-account-create-update-nqqng\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.010800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d662abaa-4326-48f5-b0ff-7ab37206b48c-operator-scripts\") pod \"octavia-9515-account-create-update-nqqng\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.011935 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d662abaa-4326-48f5-b0ff-7ab37206b48c-operator-scripts\") pod \"octavia-9515-account-create-update-nqqng\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.019361 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-qgn7f" podStartSLOduration=2.019337134 podStartE2EDuration="2.019337134s" podCreationTimestamp="2026-03-13 16:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:46:47.00208098 +0000 UTC m=+6237.165292811" watchObservedRunningTime="2026-03-13 16:46:47.019337134 +0000 UTC m=+6237.182548945" Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.036893 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz842\" (UniqueName: 
\"kubernetes.io/projected/d662abaa-4326-48f5-b0ff-7ab37206b48c-kube-api-access-mz842\") pod \"octavia-9515-account-create-update-nqqng\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.126838 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:47 crc kubenswrapper[4786]: W0313 16:46:47.581360 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd662abaa_4326_48f5_b0ff_7ab37206b48c.slice/crio-5601bf5579bfa484b08364d4a1e51aa7797cbbf3184523142d9ad8ea8b966ccb WatchSource:0}: Error finding container 5601bf5579bfa484b08364d4a1e51aa7797cbbf3184523142d9ad8ea8b966ccb: Status 404 returned error can't find the container with id 5601bf5579bfa484b08364d4a1e51aa7797cbbf3184523142d9ad8ea8b966ccb Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.586078 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9515-account-create-update-nqqng"] Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.998956 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9515-account-create-update-nqqng" event={"ID":"d662abaa-4326-48f5-b0ff-7ab37206b48c","Type":"ContainerStarted","Data":"c4064521b891c40bb276a45439e34babfff6d9016d94b0404c3466785e0bf398"} Mar 13 16:46:47 crc kubenswrapper[4786]: I0313 16:46:47.999008 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9515-account-create-update-nqqng" event={"ID":"d662abaa-4326-48f5-b0ff-7ab37206b48c","Type":"ContainerStarted","Data":"5601bf5579bfa484b08364d4a1e51aa7797cbbf3184523142d9ad8ea8b966ccb"} Mar 13 16:46:48 crc kubenswrapper[4786]: I0313 16:46:48.001447 4786 generic.go:334] "Generic (PLEG): container finished" podID="231c608d-b1ee-445f-af64-45aa3e82cca0" 
containerID="536dd126e5528f984a7ded958ac4998ac0db1c246ddbfd37f21c88773e2ad582" exitCode=0 Mar 13 16:46:48 crc kubenswrapper[4786]: I0313 16:46:48.001475 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-qgn7f" event={"ID":"231c608d-b1ee-445f-af64-45aa3e82cca0","Type":"ContainerDied","Data":"536dd126e5528f984a7ded958ac4998ac0db1c246ddbfd37f21c88773e2ad582"} Mar 13 16:46:48 crc kubenswrapper[4786]: I0313 16:46:48.015822 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-9515-account-create-update-nqqng" podStartSLOduration=2.015801549 podStartE2EDuration="2.015801549s" podCreationTimestamp="2026-03-13 16:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:46:48.012410074 +0000 UTC m=+6238.175621885" watchObservedRunningTime="2026-03-13 16:46:48.015801549 +0000 UTC m=+6238.179013370" Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.024493 4786 generic.go:334] "Generic (PLEG): container finished" podID="d662abaa-4326-48f5-b0ff-7ab37206b48c" containerID="c4064521b891c40bb276a45439e34babfff6d9016d94b0404c3466785e0bf398" exitCode=0 Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.024573 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9515-account-create-update-nqqng" event={"ID":"d662abaa-4326-48f5-b0ff-7ab37206b48c","Type":"ContainerDied","Data":"c4064521b891c40bb276a45439e34babfff6d9016d94b0404c3466785e0bf398"} Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.392392 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.468459 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231c608d-b1ee-445f-af64-45aa3e82cca0-operator-scripts\") pod \"231c608d-b1ee-445f-af64-45aa3e82cca0\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.469055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8p4z\" (UniqueName: \"kubernetes.io/projected/231c608d-b1ee-445f-af64-45aa3e82cca0-kube-api-access-z8p4z\") pod \"231c608d-b1ee-445f-af64-45aa3e82cca0\" (UID: \"231c608d-b1ee-445f-af64-45aa3e82cca0\") " Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.469540 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231c608d-b1ee-445f-af64-45aa3e82cca0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "231c608d-b1ee-445f-af64-45aa3e82cca0" (UID: "231c608d-b1ee-445f-af64-45aa3e82cca0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.477320 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231c608d-b1ee-445f-af64-45aa3e82cca0-kube-api-access-z8p4z" (OuterVolumeSpecName: "kube-api-access-z8p4z") pod "231c608d-b1ee-445f-af64-45aa3e82cca0" (UID: "231c608d-b1ee-445f-af64-45aa3e82cca0"). InnerVolumeSpecName "kube-api-access-z8p4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.571372 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231c608d-b1ee-445f-af64-45aa3e82cca0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:49 crc kubenswrapper[4786]: I0313 16:46:49.571417 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8p4z\" (UniqueName: \"kubernetes.io/projected/231c608d-b1ee-445f-af64-45aa3e82cca0-kube-api-access-z8p4z\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.054988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-qgn7f" event={"ID":"231c608d-b1ee-445f-af64-45aa3e82cca0","Type":"ContainerDied","Data":"4d787e6f6a41e5337f2d03f3916976773c09aa81653c25baa1e3bf0f9ae21080"} Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.055052 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d787e6f6a41e5337f2d03f3916976773c09aa81653c25baa1e3bf0f9ae21080" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.055094 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-qgn7f" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.408510 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.489749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d662abaa-4326-48f5-b0ff-7ab37206b48c-operator-scripts\") pod \"d662abaa-4326-48f5-b0ff-7ab37206b48c\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.489938 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz842\" (UniqueName: \"kubernetes.io/projected/d662abaa-4326-48f5-b0ff-7ab37206b48c-kube-api-access-mz842\") pod \"d662abaa-4326-48f5-b0ff-7ab37206b48c\" (UID: \"d662abaa-4326-48f5-b0ff-7ab37206b48c\") " Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.490750 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d662abaa-4326-48f5-b0ff-7ab37206b48c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d662abaa-4326-48f5-b0ff-7ab37206b48c" (UID: "d662abaa-4326-48f5-b0ff-7ab37206b48c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.496332 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d662abaa-4326-48f5-b0ff-7ab37206b48c-kube-api-access-mz842" (OuterVolumeSpecName: "kube-api-access-mz842") pod "d662abaa-4326-48f5-b0ff-7ab37206b48c" (UID: "d662abaa-4326-48f5-b0ff-7ab37206b48c"). InnerVolumeSpecName "kube-api-access-mz842". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.558470 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:46:50 crc kubenswrapper[4786]: E0313 16:46:50.558835 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.592615 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d662abaa-4326-48f5-b0ff-7ab37206b48c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:50 crc kubenswrapper[4786]: I0313 16:46:50.592813 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz842\" (UniqueName: \"kubernetes.io/projected/d662abaa-4326-48f5-b0ff-7ab37206b48c-kube-api-access-mz842\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:51 crc kubenswrapper[4786]: I0313 16:46:51.075622 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9515-account-create-update-nqqng" event={"ID":"d662abaa-4326-48f5-b0ff-7ab37206b48c","Type":"ContainerDied","Data":"5601bf5579bfa484b08364d4a1e51aa7797cbbf3184523142d9ad8ea8b966ccb"} Mar 13 16:46:51 crc kubenswrapper[4786]: I0313 16:46:51.076976 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5601bf5579bfa484b08364d4a1e51aa7797cbbf3184523142d9ad8ea8b966ccb" Mar 13 16:46:51 crc kubenswrapper[4786]: I0313 16:46:51.075676 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9515-account-create-update-nqqng" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.252467 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-gk5br"] Mar 13 16:46:52 crc kubenswrapper[4786]: E0313 16:46:52.253420 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231c608d-b1ee-445f-af64-45aa3e82cca0" containerName="mariadb-database-create" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.253443 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="231c608d-b1ee-445f-af64-45aa3e82cca0" containerName="mariadb-database-create" Mar 13 16:46:52 crc kubenswrapper[4786]: E0313 16:46:52.253459 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d662abaa-4326-48f5-b0ff-7ab37206b48c" containerName="mariadb-account-create-update" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.253468 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d662abaa-4326-48f5-b0ff-7ab37206b48c" containerName="mariadb-account-create-update" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.253776 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d662abaa-4326-48f5-b0ff-7ab37206b48c" containerName="mariadb-account-create-update" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.253815 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="231c608d-b1ee-445f-af64-45aa3e82cca0" containerName="mariadb-database-create" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.254724 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.268917 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-gk5br"] Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.331972 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crqhb\" (UniqueName: \"kubernetes.io/projected/5a6472d1-1200-49d1-876c-b688bb2f4a14-kube-api-access-crqhb\") pod \"octavia-persistence-db-create-gk5br\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.332072 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6472d1-1200-49d1-876c-b688bb2f4a14-operator-scripts\") pod \"octavia-persistence-db-create-gk5br\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.433676 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crqhb\" (UniqueName: \"kubernetes.io/projected/5a6472d1-1200-49d1-876c-b688bb2f4a14-kube-api-access-crqhb\") pod \"octavia-persistence-db-create-gk5br\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.433770 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6472d1-1200-49d1-876c-b688bb2f4a14-operator-scripts\") pod \"octavia-persistence-db-create-gk5br\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.434563 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6472d1-1200-49d1-876c-b688bb2f4a14-operator-scripts\") pod \"octavia-persistence-db-create-gk5br\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.453841 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crqhb\" (UniqueName: \"kubernetes.io/projected/5a6472d1-1200-49d1-876c-b688bb2f4a14-kube-api-access-crqhb\") pod \"octavia-persistence-db-create-gk5br\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:52 crc kubenswrapper[4786]: I0313 16:46:52.618917 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.092904 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-gk5br"] Mar 13 16:46:53 crc kubenswrapper[4786]: W0313 16:46:53.106765 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a6472d1_1200_49d1_876c_b688bb2f4a14.slice/crio-6681bf7bc6a19220f6d572510647fa277a89e0128c673d3d7e6947294041876a WatchSource:0}: Error finding container 6681bf7bc6a19220f6d572510647fa277a89e0128c673d3d7e6947294041876a: Status 404 returned error can't find the container with id 6681bf7bc6a19220f6d572510647fa277a89e0128c673d3d7e6947294041876a Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.628369 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-8518-account-create-update-52bsl"] Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.630560 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.632590 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.639126 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-8518-account-create-update-52bsl"] Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.660282 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e558f33f-13cc-451c-8d7c-93990aa61ad7-operator-scripts\") pod \"octavia-8518-account-create-update-52bsl\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.660332 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ptp\" (UniqueName: \"kubernetes.io/projected/e558f33f-13cc-451c-8d7c-93990aa61ad7-kube-api-access-g4ptp\") pod \"octavia-8518-account-create-update-52bsl\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.761754 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e558f33f-13cc-451c-8d7c-93990aa61ad7-operator-scripts\") pod \"octavia-8518-account-create-update-52bsl\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.761813 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ptp\" (UniqueName: 
\"kubernetes.io/projected/e558f33f-13cc-451c-8d7c-93990aa61ad7-kube-api-access-g4ptp\") pod \"octavia-8518-account-create-update-52bsl\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.762772 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e558f33f-13cc-451c-8d7c-93990aa61ad7-operator-scripts\") pod \"octavia-8518-account-create-update-52bsl\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.796094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ptp\" (UniqueName: \"kubernetes.io/projected/e558f33f-13cc-451c-8d7c-93990aa61ad7-kube-api-access-g4ptp\") pod \"octavia-8518-account-create-update-52bsl\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:53 crc kubenswrapper[4786]: I0313 16:46:53.957021 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:54 crc kubenswrapper[4786]: I0313 16:46:54.124539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-gk5br" event={"ID":"5a6472d1-1200-49d1-876c-b688bb2f4a14","Type":"ContainerDied","Data":"5bd36d7bc8267ea28728850cbbfeacbdc8837a28a601fdb9983e9d47eae18d2a"} Mar 13 16:46:54 crc kubenswrapper[4786]: I0313 16:46:54.126919 4786 generic.go:334] "Generic (PLEG): container finished" podID="5a6472d1-1200-49d1-876c-b688bb2f4a14" containerID="5bd36d7bc8267ea28728850cbbfeacbdc8837a28a601fdb9983e9d47eae18d2a" exitCode=0 Mar 13 16:46:54 crc kubenswrapper[4786]: I0313 16:46:54.126977 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-gk5br" event={"ID":"5a6472d1-1200-49d1-876c-b688bb2f4a14","Type":"ContainerStarted","Data":"6681bf7bc6a19220f6d572510647fa277a89e0128c673d3d7e6947294041876a"} Mar 13 16:46:54 crc kubenswrapper[4786]: I0313 16:46:54.440831 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-8518-account-create-update-52bsl"] Mar 13 16:46:54 crc kubenswrapper[4786]: W0313 16:46:54.442690 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode558f33f_13cc_451c_8d7c_93990aa61ad7.slice/crio-e2fac2b4dd8a5f2d693e7b9a1f5ffb505c04f525c239f4c35b9ce432d23b40e5 WatchSource:0}: Error finding container e2fac2b4dd8a5f2d693e7b9a1f5ffb505c04f525c239f4c35b9ce432d23b40e5: Status 404 returned error can't find the container with id e2fac2b4dd8a5f2d693e7b9a1f5ffb505c04f525c239f4c35b9ce432d23b40e5 Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.140826 4786 generic.go:334] "Generic (PLEG): container finished" podID="e558f33f-13cc-451c-8d7c-93990aa61ad7" containerID="91d398c9242cbb5a9c80557765acf9b55a66ede9d7aa1c1c239d0e6e73e44b07" exitCode=0 Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 
16:46:55.141094 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8518-account-create-update-52bsl" event={"ID":"e558f33f-13cc-451c-8d7c-93990aa61ad7","Type":"ContainerDied","Data":"91d398c9242cbb5a9c80557765acf9b55a66ede9d7aa1c1c239d0e6e73e44b07"} Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.141473 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8518-account-create-update-52bsl" event={"ID":"e558f33f-13cc-451c-8d7c-93990aa61ad7","Type":"ContainerStarted","Data":"e2fac2b4dd8a5f2d693e7b9a1f5ffb505c04f525c239f4c35b9ce432d23b40e5"} Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.587126 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.598823 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crqhb\" (UniqueName: \"kubernetes.io/projected/5a6472d1-1200-49d1-876c-b688bb2f4a14-kube-api-access-crqhb\") pod \"5a6472d1-1200-49d1-876c-b688bb2f4a14\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.598988 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6472d1-1200-49d1-876c-b688bb2f4a14-operator-scripts\") pod \"5a6472d1-1200-49d1-876c-b688bb2f4a14\" (UID: \"5a6472d1-1200-49d1-876c-b688bb2f4a14\") " Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.601102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a6472d1-1200-49d1-876c-b688bb2f4a14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a6472d1-1200-49d1-876c-b688bb2f4a14" (UID: "5a6472d1-1200-49d1-876c-b688bb2f4a14"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.605586 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6472d1-1200-49d1-876c-b688bb2f4a14-kube-api-access-crqhb" (OuterVolumeSpecName: "kube-api-access-crqhb") pod "5a6472d1-1200-49d1-876c-b688bb2f4a14" (UID: "5a6472d1-1200-49d1-876c-b688bb2f4a14"). InnerVolumeSpecName "kube-api-access-crqhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.701549 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crqhb\" (UniqueName: \"kubernetes.io/projected/5a6472d1-1200-49d1-876c-b688bb2f4a14-kube-api-access-crqhb\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:55 crc kubenswrapper[4786]: I0313 16:46:55.701589 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6472d1-1200-49d1-876c-b688bb2f4a14-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.154440 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-gk5br" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.155030 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-gk5br" event={"ID":"5a6472d1-1200-49d1-876c-b688bb2f4a14","Type":"ContainerDied","Data":"6681bf7bc6a19220f6d572510647fa277a89e0128c673d3d7e6947294041876a"} Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.155113 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6681bf7bc6a19220f6d572510647fa277a89e0128c673d3d7e6947294041876a" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.559536 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.622342 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4ptp\" (UniqueName: \"kubernetes.io/projected/e558f33f-13cc-451c-8d7c-93990aa61ad7-kube-api-access-g4ptp\") pod \"e558f33f-13cc-451c-8d7c-93990aa61ad7\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.622519 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e558f33f-13cc-451c-8d7c-93990aa61ad7-operator-scripts\") pod \"e558f33f-13cc-451c-8d7c-93990aa61ad7\" (UID: \"e558f33f-13cc-451c-8d7c-93990aa61ad7\") " Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.623435 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e558f33f-13cc-451c-8d7c-93990aa61ad7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e558f33f-13cc-451c-8d7c-93990aa61ad7" (UID: "e558f33f-13cc-451c-8d7c-93990aa61ad7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.639856 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e558f33f-13cc-451c-8d7c-93990aa61ad7-kube-api-access-g4ptp" (OuterVolumeSpecName: "kube-api-access-g4ptp") pod "e558f33f-13cc-451c-8d7c-93990aa61ad7" (UID: "e558f33f-13cc-451c-8d7c-93990aa61ad7"). InnerVolumeSpecName "kube-api-access-g4ptp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.725275 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4ptp\" (UniqueName: \"kubernetes.io/projected/e558f33f-13cc-451c-8d7c-93990aa61ad7-kube-api-access-g4ptp\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:56 crc kubenswrapper[4786]: I0313 16:46:56.725321 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e558f33f-13cc-451c-8d7c-93990aa61ad7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:46:57 crc kubenswrapper[4786]: I0313 16:46:57.208737 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8518-account-create-update-52bsl" event={"ID":"e558f33f-13cc-451c-8d7c-93990aa61ad7","Type":"ContainerDied","Data":"e2fac2b4dd8a5f2d693e7b9a1f5ffb505c04f525c239f4c35b9ce432d23b40e5"} Mar 13 16:46:57 crc kubenswrapper[4786]: I0313 16:46:57.209231 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2fac2b4dd8a5f2d693e7b9a1f5ffb505c04f525c239f4c35b9ce432d23b40e5" Mar 13 16:46:57 crc kubenswrapper[4786]: I0313 16:46:57.208807 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-8518-account-create-update-52bsl" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.529971 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jksdp" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.600592 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.690088 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7trr6" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.889356 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-5f6c445cdc-gklls"] Mar 13 16:46:59 crc kubenswrapper[4786]: E0313 16:46:59.890053 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6472d1-1200-49d1-876c-b688bb2f4a14" containerName="mariadb-database-create" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.890084 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6472d1-1200-49d1-876c-b688bb2f4a14" containerName="mariadb-database-create" Mar 13 16:46:59 crc kubenswrapper[4786]: E0313 16:46:59.890115 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e558f33f-13cc-451c-8d7c-93990aa61ad7" containerName="mariadb-account-create-update" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.890126 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e558f33f-13cc-451c-8d7c-93990aa61ad7" containerName="mariadb-account-create-update" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.890447 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e558f33f-13cc-451c-8d7c-93990aa61ad7" containerName="mariadb-account-create-update" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.890479 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6472d1-1200-49d1-876c-b688bb2f4a14" 
containerName="mariadb-database-create" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.892763 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.895774 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.896179 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.896347 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-ccwkp" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.897399 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.900671 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jksdp-config-m2m99"] Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.901861 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.908942 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.910119 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.910219 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-scripts\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.910269 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-ovndb-tls-certs\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.910293 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data-merged\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.910318 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-octavia-run\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.910344 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-combined-ca-bundle\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.925527 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5f6c445cdc-gklls"] Mar 13 16:46:59 crc kubenswrapper[4786]: I0313 16:46:59.925593 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jksdp-config-m2m99"] Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.011966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-additional-scripts\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012178 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-scripts\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012386 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-ovndb-tls-certs\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012440 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data-merged\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmtqw\" (UniqueName: \"kubernetes.io/projected/9a57abd7-a71c-448a-8b78-91627ad190f4-kube-api-access-xmtqw\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012534 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-octavia-run\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012563 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-scripts\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012618 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run-ovn\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-combined-ca-bundle\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012776 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012806 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data-merged\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012822 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-log-ovn\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.012976 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.013362 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-octavia-run\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.017650 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.018283 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-ovndb-tls-certs\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.018962 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-scripts\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.020449 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-combined-ca-bundle\") pod \"octavia-api-5f6c445cdc-gklls\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.115482 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmtqw\" (UniqueName: \"kubernetes.io/projected/9a57abd7-a71c-448a-8b78-91627ad190f4-kube-api-access-xmtqw\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.115743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-scripts\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.115841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run-ovn\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.115960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-log-ovn\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.116037 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.116138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-additional-scripts\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.116231 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-log-ovn\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.116236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run-ovn\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.116261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.116703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-additional-scripts\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.117601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-scripts\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.131667 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmtqw\" (UniqueName: \"kubernetes.io/projected/9a57abd7-a71c-448a-8b78-91627ad190f4-kube-api-access-xmtqw\") pod \"ovn-controller-jksdp-config-m2m99\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.210764 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.219680 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.724844 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jksdp-config-m2m99"] Mar 13 16:47:00 crc kubenswrapper[4786]: I0313 16:47:00.824233 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5f6c445cdc-gklls"] Mar 13 16:47:00 crc kubenswrapper[4786]: W0313 16:47:00.845561 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38318612_ca22_4ebd_99c8_bc2c582b45b1.slice/crio-665c83a301330236e74c1e2d37be9c134e82718c8c94bdc257e3e66978502f87 WatchSource:0}: Error finding container 665c83a301330236e74c1e2d37be9c134e82718c8c94bdc257e3e66978502f87: Status 404 returned error can't find the container with id 665c83a301330236e74c1e2d37be9c134e82718c8c94bdc257e3e66978502f87 Mar 13 16:47:01 crc kubenswrapper[4786]: I0313 16:47:01.249377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerStarted","Data":"665c83a301330236e74c1e2d37be9c134e82718c8c94bdc257e3e66978502f87"} Mar 13 16:47:01 crc kubenswrapper[4786]: I0313 16:47:01.251120 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-m2m99" event={"ID":"9a57abd7-a71c-448a-8b78-91627ad190f4","Type":"ContainerStarted","Data":"8089a9933b42cfbda9422dc906b9d45c0b5b96ae1f07194d156593394e18489e"} Mar 13 16:47:01 crc kubenswrapper[4786]: I0313 16:47:01.251167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-m2m99" event={"ID":"9a57abd7-a71c-448a-8b78-91627ad190f4","Type":"ContainerStarted","Data":"784ee22c97103afcbbab38ee3818d77ad3733bf753d3fe65fc3a1c526959e5d4"} Mar 13 16:47:02 crc kubenswrapper[4786]: I0313 16:47:02.263056 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="9a57abd7-a71c-448a-8b78-91627ad190f4" containerID="8089a9933b42cfbda9422dc906b9d45c0b5b96ae1f07194d156593394e18489e" exitCode=0 Mar 13 16:47:02 crc kubenswrapper[4786]: I0313 16:47:02.263284 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-m2m99" event={"ID":"9a57abd7-a71c-448a-8b78-91627ad190f4","Type":"ContainerDied","Data":"8089a9933b42cfbda9422dc906b9d45c0b5b96ae1f07194d156593394e18489e"} Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.701680 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.799749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmtqw\" (UniqueName: \"kubernetes.io/projected/9a57abd7-a71c-448a-8b78-91627ad190f4-kube-api-access-xmtqw\") pod \"9a57abd7-a71c-448a-8b78-91627ad190f4\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.799805 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-additional-scripts\") pod \"9a57abd7-a71c-448a-8b78-91627ad190f4\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.799831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run\") pod \"9a57abd7-a71c-448a-8b78-91627ad190f4\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.799945 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-log-ovn\") 
pod \"9a57abd7-a71c-448a-8b78-91627ad190f4\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.800053 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-scripts\") pod \"9a57abd7-a71c-448a-8b78-91627ad190f4\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.800100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run-ovn\") pod \"9a57abd7-a71c-448a-8b78-91627ad190f4\" (UID: \"9a57abd7-a71c-448a-8b78-91627ad190f4\") " Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.800491 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run" (OuterVolumeSpecName: "var-run") pod "9a57abd7-a71c-448a-8b78-91627ad190f4" (UID: "9a57abd7-a71c-448a-8b78-91627ad190f4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.801262 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.801315 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9a57abd7-a71c-448a-8b78-91627ad190f4" (UID: "9a57abd7-a71c-448a-8b78-91627ad190f4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.801349 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9a57abd7-a71c-448a-8b78-91627ad190f4" (UID: "9a57abd7-a71c-448a-8b78-91627ad190f4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.802885 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-scripts" (OuterVolumeSpecName: "scripts") pod "9a57abd7-a71c-448a-8b78-91627ad190f4" (UID: "9a57abd7-a71c-448a-8b78-91627ad190f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.803430 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9a57abd7-a71c-448a-8b78-91627ad190f4" (UID: "9a57abd7-a71c-448a-8b78-91627ad190f4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.807258 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a57abd7-a71c-448a-8b78-91627ad190f4-kube-api-access-xmtqw" (OuterVolumeSpecName: "kube-api-access-xmtqw") pod "9a57abd7-a71c-448a-8b78-91627ad190f4" (UID: "9a57abd7-a71c-448a-8b78-91627ad190f4"). InnerVolumeSpecName "kube-api-access-xmtqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.903952 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.903994 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.904005 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a57abd7-a71c-448a-8b78-91627ad190f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.904018 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a57abd7-a71c-448a-8b78-91627ad190f4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:03 crc kubenswrapper[4786]: I0313 16:47:03.904029 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmtqw\" (UniqueName: \"kubernetes.io/projected/9a57abd7-a71c-448a-8b78-91627ad190f4-kube-api-access-xmtqw\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.285778 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-m2m99" event={"ID":"9a57abd7-a71c-448a-8b78-91627ad190f4","Type":"ContainerDied","Data":"784ee22c97103afcbbab38ee3818d77ad3733bf753d3fe65fc3a1c526959e5d4"} Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.285815 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="784ee22c97103afcbbab38ee3818d77ad3733bf753d3fe65fc3a1c526959e5d4" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.285885 4786 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jksdp-config-m2m99" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.365479 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jksdp-config-m2m99"] Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.373451 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jksdp-config-m2m99"] Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.489485 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jksdp-config-cwv4n"] Mar 13 16:47:04 crc kubenswrapper[4786]: E0313 16:47:04.491575 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a57abd7-a71c-448a-8b78-91627ad190f4" containerName="ovn-config" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.491602 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a57abd7-a71c-448a-8b78-91627ad190f4" containerName="ovn-config" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.492163 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a57abd7-a71c-448a-8b78-91627ad190f4" containerName="ovn-config" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.493548 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.499155 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.507070 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jksdp-config-cwv4n"] Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.529497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-scripts\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.529628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-log-ovn\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.529726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run-ovn\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.529957 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxtdb\" (UniqueName: \"kubernetes.io/projected/0865e9e5-0e72-4644-a9ef-5380f6d15552-kube-api-access-jxtdb\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: 
\"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.530093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-additional-scripts\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.530819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.576961 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a57abd7-a71c-448a-8b78-91627ad190f4" path="/var/lib/kubelet/pods/9a57abd7-a71c-448a-8b78-91627ad190f4/volumes" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.636271 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxtdb\" (UniqueName: \"kubernetes.io/projected/0865e9e5-0e72-4644-a9ef-5380f6d15552-kube-api-access-jxtdb\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.636439 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-additional-scripts\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 
13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.636591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.636797 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-scripts\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.636887 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-log-ovn\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.636955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run-ovn\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.637250 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-additional-scripts\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 
16:47:04.637608 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run-ovn\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.637696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-log-ovn\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.637762 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.639025 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-scripts\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.655767 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxtdb\" (UniqueName: \"kubernetes.io/projected/0865e9e5-0e72-4644-a9ef-5380f6d15552-kube-api-access-jxtdb\") pod \"ovn-controller-jksdp-config-cwv4n\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:04 crc kubenswrapper[4786]: I0313 16:47:04.870452 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:05 crc kubenswrapper[4786]: I0313 16:47:05.593160 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:47:05 crc kubenswrapper[4786]: E0313 16:47:05.596244 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:47:08 crc kubenswrapper[4786]: I0313 16:47:08.031821 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-51aa-account-create-update-k7cfk"] Mar 13 16:47:08 crc kubenswrapper[4786]: I0313 16:47:08.040696 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gztgw"] Mar 13 16:47:08 crc kubenswrapper[4786]: I0313 16:47:08.050306 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gztgw"] Mar 13 16:47:08 crc kubenswrapper[4786]: I0313 16:47:08.058475 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-51aa-account-create-update-k7cfk"] Mar 13 16:47:08 crc kubenswrapper[4786]: I0313 16:47:08.568560 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64219e76-8dbd-4a87-8b6e-a6cc0818f23e" path="/var/lib/kubelet/pods/64219e76-8dbd-4a87-8b6e-a6cc0818f23e/volumes" Mar 13 16:47:08 crc kubenswrapper[4786]: I0313 16:47:08.570093 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba347180-a21a-4e7b-be37-99d9f2ce2cdb" path="/var/lib/kubelet/pods/ba347180-a21a-4e7b-be37-99d9f2ce2cdb/volumes" Mar 13 16:47:12 crc kubenswrapper[4786]: I0313 16:47:12.018073 4786 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jksdp-config-cwv4n"] Mar 13 16:47:12 crc kubenswrapper[4786]: I0313 16:47:12.370538 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-cwv4n" event={"ID":"0865e9e5-0e72-4644-a9ef-5380f6d15552","Type":"ContainerStarted","Data":"1846f149546fcd3ed8c5a1e6544f0fb2357fc1acae46694e661f808d8e459e1c"} Mar 13 16:47:12 crc kubenswrapper[4786]: I0313 16:47:12.372345 4786 generic.go:334] "Generic (PLEG): container finished" podID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerID="2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc" exitCode=0 Mar 13 16:47:12 crc kubenswrapper[4786]: I0313 16:47:12.372433 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerDied","Data":"2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc"} Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.400005 4786 generic.go:334] "Generic (PLEG): container finished" podID="0865e9e5-0e72-4644-a9ef-5380f6d15552" containerID="b7d902e76f51b7964af1cd42a40023e68e91f010a7413dade62022cc54f82a1c" exitCode=0 Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.400729 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-cwv4n" event={"ID":"0865e9e5-0e72-4644-a9ef-5380f6d15552","Type":"ContainerDied","Data":"b7d902e76f51b7964af1cd42a40023e68e91f010a7413dade62022cc54f82a1c"} Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.406800 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerStarted","Data":"acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0"} Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.406897 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerStarted","Data":"fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c"} Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.407376 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.407581 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:13 crc kubenswrapper[4786]: I0313 16:47:13.467921 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-5f6c445cdc-gklls" podStartSLOduration=3.67408466 podStartE2EDuration="14.467900411s" podCreationTimestamp="2026-03-13 16:46:59 +0000 UTC" firstStartedPulling="2026-03-13 16:47:00.849193432 +0000 UTC m=+6251.012405243" lastFinishedPulling="2026-03-13 16:47:11.643009193 +0000 UTC m=+6261.806220994" observedRunningTime="2026-03-13 16:47:13.456258209 +0000 UTC m=+6263.619470020" watchObservedRunningTime="2026-03-13 16:47:13.467900411 +0000 UTC m=+6263.631112232" Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.040386 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-chmdg"] Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.052341 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-chmdg"] Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.580597 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea63697-eb03-42fe-b25f-cb6b5e2d7a91" path="/var/lib/kubelet/pods/2ea63697-eb03-42fe-b25f-cb6b5e2d7a91/volumes" Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.858308 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jksdp-config-cwv4n" Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.895039 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run-ovn\") pod \"0865e9e5-0e72-4644-a9ef-5380f6d15552\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.895146 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-log-ovn\") pod \"0865e9e5-0e72-4644-a9ef-5380f6d15552\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") " Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.895223 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0865e9e5-0e72-4644-a9ef-5380f6d15552" (UID: "0865e9e5-0e72-4644-a9ef-5380f6d15552"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.895348 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0865e9e5-0e72-4644-a9ef-5380f6d15552" (UID: "0865e9e5-0e72-4644-a9ef-5380f6d15552"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.895887 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.895923 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.996650 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-scripts\") pod \"0865e9e5-0e72-4644-a9ef-5380f6d15552\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") "
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.996745 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxtdb\" (UniqueName: \"kubernetes.io/projected/0865e9e5-0e72-4644-a9ef-5380f6d15552-kube-api-access-jxtdb\") pod \"0865e9e5-0e72-4644-a9ef-5380f6d15552\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") "
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.996841 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-additional-scripts\") pod \"0865e9e5-0e72-4644-a9ef-5380f6d15552\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") "
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.996903 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run\") pod \"0865e9e5-0e72-4644-a9ef-5380f6d15552\" (UID: \"0865e9e5-0e72-4644-a9ef-5380f6d15552\") "
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.997106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run" (OuterVolumeSpecName: "var-run") pod "0865e9e5-0e72-4644-a9ef-5380f6d15552" (UID: "0865e9e5-0e72-4644-a9ef-5380f6d15552"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.997715 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0865e9e5-0e72-4644-a9ef-5380f6d15552-var-run\") on node \"crc\" DevicePath \"\""
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.997845 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0865e9e5-0e72-4644-a9ef-5380f6d15552" (UID: "0865e9e5-0e72-4644-a9ef-5380f6d15552"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:47:14 crc kubenswrapper[4786]: I0313 16:47:14.998170 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-scripts" (OuterVolumeSpecName: "scripts") pod "0865e9e5-0e72-4644-a9ef-5380f6d15552" (UID: "0865e9e5-0e72-4644-a9ef-5380f6d15552"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.004093 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0865e9e5-0e72-4644-a9ef-5380f6d15552-kube-api-access-jxtdb" (OuterVolumeSpecName: "kube-api-access-jxtdb") pod "0865e9e5-0e72-4644-a9ef-5380f6d15552" (UID: "0865e9e5-0e72-4644-a9ef-5380f6d15552"). InnerVolumeSpecName "kube-api-access-jxtdb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.099082 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.099118 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0865e9e5-0e72-4644-a9ef-5380f6d15552-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.099130 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxtdb\" (UniqueName: \"kubernetes.io/projected/0865e9e5-0e72-4644-a9ef-5380f6d15552-kube-api-access-jxtdb\") on node \"crc\" DevicePath \"\""
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.429478 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jksdp-config-cwv4n" event={"ID":"0865e9e5-0e72-4644-a9ef-5380f6d15552","Type":"ContainerDied","Data":"1846f149546fcd3ed8c5a1e6544f0fb2357fc1acae46694e661f808d8e459e1c"}
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.429532 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1846f149546fcd3ed8c5a1e6544f0fb2357fc1acae46694e661f808d8e459e1c"
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.429605 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jksdp-config-cwv4n"
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.910705 4786 scope.go:117] "RemoveContainer" containerID="fdf08bab267ccb7f6152d257e3ce4bf692ff63df2a66cf568de98dd900984de6"
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.942580 4786 scope.go:117] "RemoveContainer" containerID="cc0e2d7576cc4e62a498528df1756ac070f13283e5a8e61a6fb19cdebe4040b1"
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.958645 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jksdp-config-cwv4n"]
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.966082 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jksdp-config-cwv4n"]
Mar 13 16:47:15 crc kubenswrapper[4786]: I0313 16:47:15.981236 4786 scope.go:117] "RemoveContainer" containerID="3838c659a549c7e7c045b5cd9d636f886abfeb6f13aa9ce9e948cc7bf18c231c"
Mar 13 16:47:16 crc kubenswrapper[4786]: I0313 16:47:16.557354 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:47:16 crc kubenswrapper[4786]: E0313 16:47:16.557829 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:47:16 crc kubenswrapper[4786]: I0313 16:47:16.570083 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0865e9e5-0e72-4644-a9ef-5380f6d15552" path="/var/lib/kubelet/pods/0865e9e5-0e72-4644-a9ef-5380f6d15552/volumes"
Mar 13 16:47:27 crc kubenswrapper[4786]: I0313 16:47:27.054324 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7m2q"]
Mar 13 16:47:27 crc kubenswrapper[4786]: I0313 16:47:27.063526 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7m2q"]
Mar 13 16:47:27 crc kubenswrapper[4786]: I0313 16:47:27.554645 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:47:27 crc kubenswrapper[4786]: E0313 16:47:27.555472 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:47:28 crc kubenswrapper[4786]: I0313 16:47:28.576169 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add6353c-ce2d-41c9-a724-d5c834495653" path="/var/lib/kubelet/pods/add6353c-ce2d-41c9-a724-d5c834495653/volumes"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.229272 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-q4769"]
Mar 13 16:47:30 crc kubenswrapper[4786]: E0313 16:47:30.230342 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0865e9e5-0e72-4644-a9ef-5380f6d15552" containerName="ovn-config"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.230367 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0865e9e5-0e72-4644-a9ef-5380f6d15552" containerName="ovn-config"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.230679 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0865e9e5-0e72-4644-a9ef-5380f6d15552" containerName="ovn-config"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.233599 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.238233 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.239049 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.255693 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.256724 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-q4769"]
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.433009 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7b17f538-d321-4719-a3e1-651226075939-config-data-merged\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.433217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b17f538-d321-4719-a3e1-651226075939-config-data\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.433454 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b17f538-d321-4719-a3e1-651226075939-scripts\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.434800 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7b17f538-d321-4719-a3e1-651226075939-hm-ports\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.537392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b17f538-d321-4719-a3e1-651226075939-config-data\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.537576 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b17f538-d321-4719-a3e1-651226075939-scripts\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.537721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7b17f538-d321-4719-a3e1-651226075939-hm-ports\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.539498 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7b17f538-d321-4719-a3e1-651226075939-hm-ports\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.540747 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7b17f538-d321-4719-a3e1-651226075939-config-data-merged\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.539850 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7b17f538-d321-4719-a3e1-651226075939-config-data-merged\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.547820 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b17f538-d321-4719-a3e1-651226075939-scripts\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.562810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b17f538-d321-4719-a3e1-651226075939-config-data\") pod \"octavia-rsyslog-q4769\" (UID: \"7b17f538-d321-4719-a3e1-651226075939\") " pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:30 crc kubenswrapper[4786]: I0313 16:47:30.575321 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.070243 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-6f45c4fb85-pfljf"]
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.072037 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.074811 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.084484 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f45c4fb85-pfljf"]
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.153336 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9582ec85-15ab-4015-bbf5-659c24d1071b-amphora-image\") pod \"octavia-image-upload-6f45c4fb85-pfljf\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") " pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.153402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9582ec85-15ab-4015-bbf5-659c24d1071b-httpd-config\") pod \"octavia-image-upload-6f45c4fb85-pfljf\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") " pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.190802 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-q4769"]
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.254526 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9582ec85-15ab-4015-bbf5-659c24d1071b-httpd-config\") pod \"octavia-image-upload-6f45c4fb85-pfljf\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") " pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.254666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9582ec85-15ab-4015-bbf5-659c24d1071b-amphora-image\") pod \"octavia-image-upload-6f45c4fb85-pfljf\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") " pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.255127 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9582ec85-15ab-4015-bbf5-659c24d1071b-amphora-image\") pod \"octavia-image-upload-6f45c4fb85-pfljf\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") " pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.261128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9582ec85-15ab-4015-bbf5-659c24d1071b-httpd-config\") pod \"octavia-image-upload-6f45c4fb85-pfljf\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") " pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.309988 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-q4769"]
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.432403 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.622507 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-q4769" event={"ID":"7b17f538-d321-4719-a3e1-651226075939","Type":"ContainerStarted","Data":"7daf6968ba9957e881c38ccdc7b25d67f6e3cf8dbbe9e79d19bfc58783be865c"}
Mar 13 16:47:31 crc kubenswrapper[4786]: I0313 16:47:31.935518 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f45c4fb85-pfljf"]
Mar 13 16:47:31 crc kubenswrapper[4786]: W0313 16:47:31.942752 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9582ec85_15ab_4015_bbf5_659c24d1071b.slice/crio-e3990d24f5c3f1c5f6c8cb70da95e91871f7c37957c7044af987bba8c09f4fc1 WatchSource:0}: Error finding container e3990d24f5c3f1c5f6c8cb70da95e91871f7c37957c7044af987bba8c09f4fc1: Status 404 returned error can't find the container with id e3990d24f5c3f1c5f6c8cb70da95e91871f7c37957c7044af987bba8c09f4fc1
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.648114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" event={"ID":"9582ec85-15ab-4015-bbf5-659c24d1071b","Type":"ContainerStarted","Data":"e3990d24f5c3f1c5f6c8cb70da95e91871f7c37957c7044af987bba8c09f4fc1"}
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.713642 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7d68854bb9-vv8ck"]
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.716890 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.723574 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.723988 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.731467 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7d68854bb9-vv8ck"]
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784095 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7b597ad4-6e16-4499-b61e-a1d01d253164-config-data-merged\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-ovndb-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784588 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-internal-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-config-data\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784679 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-combined-ca-bundle\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784698 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-scripts\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784737 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-public-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.784764 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7b597ad4-6e16-4499-b61e-a1d01d253164-octavia-run\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886411 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-config-data\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-combined-ca-bundle\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886504 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-scripts\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886575 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-public-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886618 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7b597ad4-6e16-4499-b61e-a1d01d253164-octavia-run\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886700 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7b597ad4-6e16-4499-b61e-a1d01d253164-config-data-merged\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886723 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-ovndb-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.886760 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-internal-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.887497 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/7b597ad4-6e16-4499-b61e-a1d01d253164-octavia-run\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.887526 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7b597ad4-6e16-4499-b61e-a1d01d253164-config-data-merged\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.892454 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-public-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.892505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-ovndb-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.892793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-scripts\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.892805 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-internal-tls-certs\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.893582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-combined-ca-bundle\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:32 crc kubenswrapper[4786]: I0313 16:47:32.896367 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b597ad4-6e16-4499-b61e-a1d01d253164-config-data\") pod \"octavia-api-7d68854bb9-vv8ck\" (UID: \"7b597ad4-6e16-4499-b61e-a1d01d253164\") " pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:33 crc kubenswrapper[4786]: I0313 16:47:33.092898 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:33 crc kubenswrapper[4786]: I0313 16:47:33.670702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-q4769" event={"ID":"7b17f538-d321-4719-a3e1-651226075939","Type":"ContainerStarted","Data":"d5244180bf5f4ed3dff96967d8a82ae068cb8bfb9e84470b7245fb1466e1cded"}
Mar 13 16:47:33 crc kubenswrapper[4786]: I0313 16:47:33.985105 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7d68854bb9-vv8ck"]
Mar 13 16:47:34 crc kubenswrapper[4786]: I0313 16:47:34.331210 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5f6c445cdc-gklls"
Mar 13 16:47:34 crc kubenswrapper[4786]: I0313 16:47:34.414297 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5f6c445cdc-gklls"
Mar 13 16:47:34 crc kubenswrapper[4786]: I0313 16:47:34.682480 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b597ad4-6e16-4499-b61e-a1d01d253164" containerID="262ef00cadae9ca1c3cc11171fd24d435eb0ebffcd8703148b8325f4c7cfea83" exitCode=0
Mar 13 16:47:34 crc kubenswrapper[4786]: I0313 16:47:34.682587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7d68854bb9-vv8ck" event={"ID":"7b597ad4-6e16-4499-b61e-a1d01d253164","Type":"ContainerDied","Data":"262ef00cadae9ca1c3cc11171fd24d435eb0ebffcd8703148b8325f4c7cfea83"}
Mar 13 16:47:34 crc kubenswrapper[4786]: I0313 16:47:34.682663 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7d68854bb9-vv8ck" event={"ID":"7b597ad4-6e16-4499-b61e-a1d01d253164","Type":"ContainerStarted","Data":"d5eccdf1d4a5b71822c364f699188c4c5d3c591cbe8b69dafe99a6038c73f4d8"}
Mar 13 16:47:35 crc kubenswrapper[4786]: I0313 16:47:35.695333 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7d68854bb9-vv8ck" event={"ID":"7b597ad4-6e16-4499-b61e-a1d01d253164","Type":"ContainerStarted","Data":"e08bee8213187bee4dc33ed2946aba9b860c19b1e35d4591ef6938bf404534a4"}
Mar 13 16:47:35 crc kubenswrapper[4786]: I0313 16:47:35.695630 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7d68854bb9-vv8ck" event={"ID":"7b597ad4-6e16-4499-b61e-a1d01d253164","Type":"ContainerStarted","Data":"9d638f81d404932a6bc9eca644d58855b405cf3642a6197a73c08f68948a7098"}
Mar 13 16:47:35 crc kubenswrapper[4786]: I0313 16:47:35.697784 4786 generic.go:334] "Generic (PLEG): container finished" podID="7b17f538-d321-4719-a3e1-651226075939" containerID="d5244180bf5f4ed3dff96967d8a82ae068cb8bfb9e84470b7245fb1466e1cded" exitCode=0
Mar 13 16:47:35 crc kubenswrapper[4786]: I0313 16:47:35.697811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-q4769" event={"ID":"7b17f538-d321-4719-a3e1-651226075939","Type":"ContainerDied","Data":"d5244180bf5f4ed3dff96967d8a82ae068cb8bfb9e84470b7245fb1466e1cded"}
Mar 13 16:47:36 crc kubenswrapper[4786]: I0313 16:47:36.704899 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:36 crc kubenswrapper[4786]: I0313 16:47:36.705236 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7d68854bb9-vv8ck"
Mar 13 16:47:36 crc kubenswrapper[4786]: I0313 16:47:36.736645 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7d68854bb9-vv8ck" podStartSLOduration=4.736625927 podStartE2EDuration="4.736625927s" podCreationTimestamp="2026-03-13 16:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:47:36.728957334 +0000 UTC m=+6286.892169145" watchObservedRunningTime="2026-03-13 16:47:36.736625927 +0000 UTC m=+6286.899837738"
Mar 13 16:47:37 crc kubenswrapper[4786]: I0313 16:47:37.723564 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-q4769" event={"ID":"7b17f538-d321-4719-a3e1-651226075939","Type":"ContainerStarted","Data":"0b555e2eb70b624b8020eaeb0e16490273e8a88f48e9c72b6c9202d93adf3c18"}
Mar 13 16:47:37 crc kubenswrapper[4786]: I0313 16:47:37.724075 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-q4769"
Mar 13 16:47:37 crc kubenswrapper[4786]: I0313 16:47:37.756160 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-q4769" podStartSLOduration=2.020520811 podStartE2EDuration="7.756137831s" podCreationTimestamp="2026-03-13 16:47:30 +0000 UTC" firstStartedPulling="2026-03-13 16:47:31.214009469 +0000 UTC m=+6281.377221280" lastFinishedPulling="2026-03-13 16:47:36.949626489 +0000 UTC m=+6287.112838300" observedRunningTime="2026-03-13 16:47:37.75012996 +0000 UTC m=+6287.913341771" watchObservedRunningTime="2026-03-13 16:47:37.756137831 +0000 UTC m=+6287.919349642"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.611299 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-ffctd"]
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.613266 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.615391 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.625672 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-ffctd"]
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.680951 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-scripts\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.681013 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.681082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-combined-ca-bundle\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.681167 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data-merged\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.784600 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-combined-ca-bundle\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.784701 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data-merged\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.784751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-scripts\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.784792 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.785352 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data-merged\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.803019 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.807500 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-combined-ca-bundle\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.817894 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-scripts\") pod \"octavia-db-sync-ffctd\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " pod="openstack/octavia-db-sync-ffctd"
Mar 13 16:47:39 crc kubenswrapper[4786]: I0313 16:47:39.930968 4786 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-db-sync-ffctd" Mar 13 16:47:41 crc kubenswrapper[4786]: I0313 16:47:41.048717 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-ffctd"] Mar 13 16:47:41 crc kubenswrapper[4786]: I0313 16:47:41.769311 4786 generic.go:334] "Generic (PLEG): container finished" podID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerID="cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a" exitCode=0 Mar 13 16:47:41 crc kubenswrapper[4786]: I0313 16:47:41.769416 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" event={"ID":"9582ec85-15ab-4015-bbf5-659c24d1071b","Type":"ContainerDied","Data":"cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a"} Mar 13 16:47:41 crc kubenswrapper[4786]: I0313 16:47:41.772326 4786 generic.go:334] "Generic (PLEG): container finished" podID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerID="404495674c7cb2940e2d3bc73ec98e48c84052dce98db11e233c192fbe35d01c" exitCode=0 Mar 13 16:47:41 crc kubenswrapper[4786]: I0313 16:47:41.772363 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ffctd" event={"ID":"afbd4809-109d-4fa3-83f5-64a4f78d24a9","Type":"ContainerDied","Data":"404495674c7cb2940e2d3bc73ec98e48c84052dce98db11e233c192fbe35d01c"} Mar 13 16:47:41 crc kubenswrapper[4786]: I0313 16:47:41.772392 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ffctd" event={"ID":"afbd4809-109d-4fa3-83f5-64a4f78d24a9","Type":"ContainerStarted","Data":"3f0b9b1e16d96cd794fb6889f68d853dcba9208e498a1db129e02986a79d3ae3"} Mar 13 16:47:42 crc kubenswrapper[4786]: I0313 16:47:42.552459 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:47:42 crc kubenswrapper[4786]: E0313 16:47:42.553271 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:47:42 crc kubenswrapper[4786]: I0313 16:47:42.791310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" event={"ID":"9582ec85-15ab-4015-bbf5-659c24d1071b","Type":"ContainerStarted","Data":"d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677"} Mar 13 16:47:42 crc kubenswrapper[4786]: I0313 16:47:42.797988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ffctd" event={"ID":"afbd4809-109d-4fa3-83f5-64a4f78d24a9","Type":"ContainerStarted","Data":"4ed1be7a06b205e2feca3f39bc4950d406cf286ad1e42266de30b5a964c7a111"} Mar 13 16:47:42 crc kubenswrapper[4786]: I0313 16:47:42.843013 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" podStartSLOduration=3.037207619 podStartE2EDuration="11.842922391s" podCreationTimestamp="2026-03-13 16:47:31 +0000 UTC" firstStartedPulling="2026-03-13 16:47:31.945536748 +0000 UTC m=+6282.108748559" lastFinishedPulling="2026-03-13 16:47:40.75125152 +0000 UTC m=+6290.914463331" observedRunningTime="2026-03-13 16:47:42.816167569 +0000 UTC m=+6292.979379430" watchObservedRunningTime="2026-03-13 16:47:42.842922391 +0000 UTC m=+6293.006134232" Mar 13 16:47:42 crc kubenswrapper[4786]: I0313 16:47:42.870496 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-ffctd" podStartSLOduration=3.870473763 podStartE2EDuration="3.870473763s" podCreationTimestamp="2026-03-13 16:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-13 16:47:42.839415203 +0000 UTC m=+6293.002627034" watchObservedRunningTime="2026-03-13 16:47:42.870473763 +0000 UTC m=+6293.033685584" Mar 13 16:47:44 crc kubenswrapper[4786]: E0313 16:47:44.570157 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafbd4809_109d_4fa3_83f5_64a4f78d24a9.slice/crio-conmon-4ed1be7a06b205e2feca3f39bc4950d406cf286ad1e42266de30b5a964c7a111.scope\": RecentStats: unable to find data in memory cache]" Mar 13 16:47:44 crc kubenswrapper[4786]: I0313 16:47:44.826310 4786 generic.go:334] "Generic (PLEG): container finished" podID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerID="4ed1be7a06b205e2feca3f39bc4950d406cf286ad1e42266de30b5a964c7a111" exitCode=0 Mar 13 16:47:44 crc kubenswrapper[4786]: I0313 16:47:44.826376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ffctd" event={"ID":"afbd4809-109d-4fa3-83f5-64a4f78d24a9","Type":"ContainerDied","Data":"4ed1be7a06b205e2feca3f39bc4950d406cf286ad1e42266de30b5a964c7a111"} Mar 13 16:47:45 crc kubenswrapper[4786]: I0313 16:47:45.636013 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-q4769" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.241664 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-ffctd" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.436235 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-scripts\") pod \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.436324 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data-merged\") pod \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.436453 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-combined-ca-bundle\") pod \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.436515 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data\") pod \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\" (UID: \"afbd4809-109d-4fa3-83f5-64a4f78d24a9\") " Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.442483 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data" (OuterVolumeSpecName: "config-data") pod "afbd4809-109d-4fa3-83f5-64a4f78d24a9" (UID: "afbd4809-109d-4fa3-83f5-64a4f78d24a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.442720 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-scripts" (OuterVolumeSpecName: "scripts") pod "afbd4809-109d-4fa3-83f5-64a4f78d24a9" (UID: "afbd4809-109d-4fa3-83f5-64a4f78d24a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.494172 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afbd4809-109d-4fa3-83f5-64a4f78d24a9" (UID: "afbd4809-109d-4fa3-83f5-64a4f78d24a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.507150 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "afbd4809-109d-4fa3-83f5-64a4f78d24a9" (UID: "afbd4809-109d-4fa3-83f5-64a4f78d24a9"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.539181 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.539440 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data-merged\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.539603 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.539732 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbd4809-109d-4fa3-83f5-64a4f78d24a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.854624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ffctd" event={"ID":"afbd4809-109d-4fa3-83f5-64a4f78d24a9","Type":"ContainerDied","Data":"3f0b9b1e16d96cd794fb6889f68d853dcba9208e498a1db129e02986a79d3ae3"} Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.854721 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f0b9b1e16d96cd794fb6889f68d853dcba9208e498a1db129e02986a79d3ae3" Mar 13 16:47:46 crc kubenswrapper[4786]: I0313 16:47:46.854652 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-ffctd" Mar 13 16:47:52 crc kubenswrapper[4786]: I0313 16:47:52.188395 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7d68854bb9-vv8ck" Mar 13 16:47:52 crc kubenswrapper[4786]: I0313 16:47:52.934467 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7d68854bb9-vv8ck" Mar 13 16:47:53 crc kubenswrapper[4786]: I0313 16:47:53.006085 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-5f6c445cdc-gklls"] Mar 13 16:47:53 crc kubenswrapper[4786]: I0313 16:47:53.006609 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-5f6c445cdc-gklls" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api" containerID="cri-o://fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c" gracePeriod=30 Mar 13 16:47:53 crc kubenswrapper[4786]: I0313 16:47:53.006724 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-5f6c445cdc-gklls" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api-provider-agent" containerID="cri-o://acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0" gracePeriod=30 Mar 13 16:47:53 crc kubenswrapper[4786]: I0313 16:47:53.552131 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:47:53 crc kubenswrapper[4786]: E0313 16:47:53.552453 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 
13 16:47:53 crc kubenswrapper[4786]: I0313 16:47:53.924713 4786 generic.go:334] "Generic (PLEG): container finished" podID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerID="acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0" exitCode=0 Mar 13 16:47:53 crc kubenswrapper[4786]: I0313 16:47:53.924779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerDied","Data":"acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0"} Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.818549 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.957586 4786 generic.go:334] "Generic (PLEG): container finished" podID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerID="fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c" exitCode=0 Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.957681 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-5f6c445cdc-gklls" Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.957654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerDied","Data":"fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c"} Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.958098 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5f6c445cdc-gklls" event={"ID":"38318612-ca22-4ebd-99c8-bc2c582b45b1","Type":"ContainerDied","Data":"665c83a301330236e74c1e2d37be9c134e82718c8c94bdc257e3e66978502f87"} Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.958131 4786 scope.go:117] "RemoveContainer" containerID="acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0" Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.963703 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data\") pod \"38318612-ca22-4ebd-99c8-bc2c582b45b1\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.963778 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-scripts\") pod \"38318612-ca22-4ebd-99c8-bc2c582b45b1\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.963922 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-combined-ca-bundle\") pod \"38318612-ca22-4ebd-99c8-bc2c582b45b1\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.964001 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data-merged\") pod \"38318612-ca22-4ebd-99c8-bc2c582b45b1\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.964069 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-ovndb-tls-certs\") pod \"38318612-ca22-4ebd-99c8-bc2c582b45b1\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.964110 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-octavia-run\") pod \"38318612-ca22-4ebd-99c8-bc2c582b45b1\" (UID: \"38318612-ca22-4ebd-99c8-bc2c582b45b1\") " Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.965011 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "38318612-ca22-4ebd-99c8-bc2c582b45b1" (UID: "38318612-ca22-4ebd-99c8-bc2c582b45b1"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.977114 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-scripts" (OuterVolumeSpecName: "scripts") pod "38318612-ca22-4ebd-99c8-bc2c582b45b1" (UID: "38318612-ca22-4ebd-99c8-bc2c582b45b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.977172 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data" (OuterVolumeSpecName: "config-data") pod "38318612-ca22-4ebd-99c8-bc2c582b45b1" (UID: "38318612-ca22-4ebd-99c8-bc2c582b45b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:56 crc kubenswrapper[4786]: I0313 16:47:56.993926 4786 scope.go:117] "RemoveContainer" containerID="fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.019511 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38318612-ca22-4ebd-99c8-bc2c582b45b1" (UID: "38318612-ca22-4ebd-99c8-bc2c582b45b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.027952 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "38318612-ca22-4ebd-99c8-bc2c582b45b1" (UID: "38318612-ca22-4ebd-99c8-bc2c582b45b1"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.066597 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.066639 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data-merged\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.066676 4786 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/38318612-ca22-4ebd-99c8-bc2c582b45b1-octavia-run\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.066688 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.066699 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.099042 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "38318612-ca22-4ebd-99c8-bc2c582b45b1" (UID: "38318612-ca22-4ebd-99c8-bc2c582b45b1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.100477 4786 scope.go:117] "RemoveContainer" containerID="2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.137251 4786 scope.go:117] "RemoveContainer" containerID="acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0" Mar 13 16:47:57 crc kubenswrapper[4786]: E0313 16:47:57.138283 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0\": container with ID starting with acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0 not found: ID does not exist" containerID="acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.138338 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0"} err="failed to get container status \"acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0\": rpc error: code = NotFound desc = could not find container \"acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0\": container with ID starting with acb21b13dad9264bda6ae03ae862fa0642dfd392c17062225d2e1172c8ca62b0 not found: ID does not exist" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.138383 4786 scope.go:117] "RemoveContainer" containerID="fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c" Mar 13 16:47:57 crc kubenswrapper[4786]: E0313 16:47:57.142132 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c\": container with ID starting with 
fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c not found: ID does not exist" containerID="fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.142180 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c"} err="failed to get container status \"fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c\": rpc error: code = NotFound desc = could not find container \"fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c\": container with ID starting with fb6848a74784bc78b464dc46dc32045f0ba99ac04ec0e8ffff64bb4164c56f9c not found: ID does not exist" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.142209 4786 scope.go:117] "RemoveContainer" containerID="2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc" Mar 13 16:47:57 crc kubenswrapper[4786]: E0313 16:47:57.142528 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc\": container with ID starting with 2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc not found: ID does not exist" containerID="2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.142572 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc"} err="failed to get container status \"2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc\": rpc error: code = NotFound desc = could not find container \"2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc\": container with ID starting with 2a92296c808195c319ef9705e27282c605a871af2664fcfc9b3658bee1b2cecc not found: ID does not 
exist" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.168125 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38318612-ca22-4ebd-99c8-bc2c582b45b1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.357122 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-5f6c445cdc-gklls"] Mar 13 16:47:57 crc kubenswrapper[4786]: I0313 16:47:57.364967 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-5f6c445cdc-gklls"] Mar 13 16:47:58 crc kubenswrapper[4786]: I0313 16:47:58.575988 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" path="/var/lib/kubelet/pods/38318612-ca22-4ebd-99c8-bc2c582b45b1/volumes" Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157139 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557008-qg64l"] Mar 13 16:48:00 crc kubenswrapper[4786]: E0313 16:48:00.157589 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerName="octavia-db-sync" Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157605 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerName="octavia-db-sync" Mar 13 16:48:00 crc kubenswrapper[4786]: E0313 16:48:00.157625 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api-provider-agent" Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157633 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api-provider-agent" Mar 13 16:48:00 crc kubenswrapper[4786]: E0313 16:48:00.157650 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157657 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api"
Mar 13 16:48:00 crc kubenswrapper[4786]: E0313 16:48:00.157678 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerName="init"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157684 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerName="init"
Mar 13 16:48:00 crc kubenswrapper[4786]: E0313 16:48:00.157710 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="init"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157717 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="init"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157948 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" containerName="octavia-db-sync"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157967 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api-provider-agent"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.157978 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="38318612-ca22-4ebd-99c8-bc2c582b45b1" containerName="octavia-api"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.158713 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.162197 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.163198 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.163282 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.175520 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557008-qg64l"]
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.343102 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78fjt\" (UniqueName: \"kubernetes.io/projected/6c96bc48-a941-4a7a-a113-904255c84492-kube-api-access-78fjt\") pod \"auto-csr-approver-29557008-qg64l\" (UID: \"6c96bc48-a941-4a7a-a113-904255c84492\") " pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.446220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78fjt\" (UniqueName: \"kubernetes.io/projected/6c96bc48-a941-4a7a-a113-904255c84492-kube-api-access-78fjt\") pod \"auto-csr-approver-29557008-qg64l\" (UID: \"6c96bc48-a941-4a7a-a113-904255c84492\") " pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.498844 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78fjt\" (UniqueName: \"kubernetes.io/projected/6c96bc48-a941-4a7a-a113-904255c84492-kube-api-access-78fjt\") pod \"auto-csr-approver-29557008-qg64l\" (UID: \"6c96bc48-a941-4a7a-a113-904255c84492\") " pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:00 crc kubenswrapper[4786]: I0313 16:48:00.783635 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:01 crc kubenswrapper[4786]: I0313 16:48:01.277723 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557008-qg64l"]
Mar 13 16:48:02 crc kubenswrapper[4786]: I0313 16:48:02.061952 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557008-qg64l" event={"ID":"6c96bc48-a941-4a7a-a113-904255c84492","Type":"ContainerStarted","Data":"b8a7f3483ce17904c2c02e9915ded307d988efffc5893cea0119d4aaf90a7d76"}
Mar 13 16:48:03 crc kubenswrapper[4786]: I0313 16:48:03.072726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557008-qg64l" event={"ID":"6c96bc48-a941-4a7a-a113-904255c84492","Type":"ContainerStarted","Data":"6f5ef5d122d206dae85ba659cc44b6b768a80c67f38ce420bb81bf715a40f12a"}
Mar 13 16:48:03 crc kubenswrapper[4786]: I0313 16:48:03.098978 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557008-qg64l" podStartSLOduration=1.742197091 podStartE2EDuration="3.098959748s" podCreationTimestamp="2026-03-13 16:48:00 +0000 UTC" firstStartedPulling="2026-03-13 16:48:01.283810395 +0000 UTC m=+6311.447022206" lastFinishedPulling="2026-03-13 16:48:02.640573012 +0000 UTC m=+6312.803784863" observedRunningTime="2026-03-13 16:48:03.090831624 +0000 UTC m=+6313.254043435" watchObservedRunningTime="2026-03-13 16:48:03.098959748 +0000 UTC m=+6313.262171559"
Mar 13 16:48:04 crc kubenswrapper[4786]: I0313 16:48:04.088176 4786 generic.go:334] "Generic (PLEG): container finished" podID="6c96bc48-a941-4a7a-a113-904255c84492" containerID="6f5ef5d122d206dae85ba659cc44b6b768a80c67f38ce420bb81bf715a40f12a" exitCode=0
Mar 13 16:48:04 crc kubenswrapper[4786]: I0313 16:48:04.088233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557008-qg64l" event={"ID":"6c96bc48-a941-4a7a-a113-904255c84492","Type":"ContainerDied","Data":"6f5ef5d122d206dae85ba659cc44b6b768a80c67f38ce420bb81bf715a40f12a"}
Mar 13 16:48:06 crc kubenswrapper[4786]: I0313 16:48:06.114338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557008-qg64l" event={"ID":"6c96bc48-a941-4a7a-a113-904255c84492","Type":"ContainerDied","Data":"b8a7f3483ce17904c2c02e9915ded307d988efffc5893cea0119d4aaf90a7d76"}
Mar 13 16:48:06 crc kubenswrapper[4786]: I0313 16:48:06.115041 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a7f3483ce17904c2c02e9915ded307d988efffc5893cea0119d4aaf90a7d76"
Mar 13 16:48:06 crc kubenswrapper[4786]: I0313 16:48:06.147753 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:06 crc kubenswrapper[4786]: I0313 16:48:06.271362 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78fjt\" (UniqueName: \"kubernetes.io/projected/6c96bc48-a941-4a7a-a113-904255c84492-kube-api-access-78fjt\") pod \"6c96bc48-a941-4a7a-a113-904255c84492\" (UID: \"6c96bc48-a941-4a7a-a113-904255c84492\") "
Mar 13 16:48:06 crc kubenswrapper[4786]: I0313 16:48:06.279035 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c96bc48-a941-4a7a-a113-904255c84492-kube-api-access-78fjt" (OuterVolumeSpecName: "kube-api-access-78fjt") pod "6c96bc48-a941-4a7a-a113-904255c84492" (UID: "6c96bc48-a941-4a7a-a113-904255c84492"). InnerVolumeSpecName "kube-api-access-78fjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:48:06 crc kubenswrapper[4786]: I0313 16:48:06.374608 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78fjt\" (UniqueName: \"kubernetes.io/projected/6c96bc48-a941-4a7a-a113-904255c84492-kube-api-access-78fjt\") on node \"crc\" DevicePath \"\""
Mar 13 16:48:07 crc kubenswrapper[4786]: I0313 16:48:07.123820 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557008-qg64l"
Mar 13 16:48:07 crc kubenswrapper[4786]: I0313 16:48:07.255612 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557002-jfkcc"]
Mar 13 16:48:07 crc kubenswrapper[4786]: I0313 16:48:07.267985 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557002-jfkcc"]
Mar 13 16:48:07 crc kubenswrapper[4786]: I0313 16:48:07.552279 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:48:07 crc kubenswrapper[4786]: E0313 16:48:07.552636 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:48:08 crc kubenswrapper[4786]: I0313 16:48:08.563544 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec517d8e-ab19-4446-b4a4-17bd55010656" path="/var/lib/kubelet/pods/ec517d8e-ab19-4446-b4a4-17bd55010656/volumes"
Mar 13 16:48:16 crc kubenswrapper[4786]: I0313 16:48:16.113998 4786 scope.go:117] "RemoveContainer" containerID="5f594217d322ddbad8b5f3d6331cb156e8fedf101e5c7c7b31b080746007d084"
Mar 13 16:48:16 crc kubenswrapper[4786]: I0313 16:48:16.163901 4786 scope.go:117] "RemoveContainer" containerID="bcae50cc8be497e6f2da73ae7b1ec0602d18a604b764ff4ca71f0e0a5666ca3e"
Mar 13 16:48:19 crc kubenswrapper[4786]: I0313 16:48:19.559997 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:48:19 crc kubenswrapper[4786]: E0313 16:48:19.562820 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:48:24 crc kubenswrapper[4786]: I0313 16:48:24.208771 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f45c4fb85-pfljf"]
Mar 13 16:48:24 crc kubenswrapper[4786]: I0313 16:48:24.209521 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerName="octavia-amphora-httpd" containerID="cri-o://d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677" gracePeriod=30
Mar 13 16:48:24 crc kubenswrapper[4786]: I0313 16:48:24.829598 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:48:24 crc kubenswrapper[4786]: I0313 16:48:24.986817 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9582ec85-15ab-4015-bbf5-659c24d1071b-httpd-config\") pod \"9582ec85-15ab-4015-bbf5-659c24d1071b\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") "
Mar 13 16:48:24 crc kubenswrapper[4786]: I0313 16:48:24.987081 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9582ec85-15ab-4015-bbf5-659c24d1071b-amphora-image\") pod \"9582ec85-15ab-4015-bbf5-659c24d1071b\" (UID: \"9582ec85-15ab-4015-bbf5-659c24d1071b\") "
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.026757 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9582ec85-15ab-4015-bbf5-659c24d1071b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9582ec85-15ab-4015-bbf5-659c24d1071b" (UID: "9582ec85-15ab-4015-bbf5-659c24d1071b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.073809 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9582ec85-15ab-4015-bbf5-659c24d1071b-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "9582ec85-15ab-4015-bbf5-659c24d1071b" (UID: "9582ec85-15ab-4015-bbf5-659c24d1071b"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.090390 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9582ec85-15ab-4015-bbf5-659c24d1071b-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.090428 4786 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/9582ec85-15ab-4015-bbf5-659c24d1071b-amphora-image\") on node \"crc\" DevicePath \"\""
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.321496 4786 generic.go:334] "Generic (PLEG): container finished" podID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerID="d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677" exitCode=0
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.321853 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" event={"ID":"9582ec85-15ab-4015-bbf5-659c24d1071b","Type":"ContainerDied","Data":"d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677"}
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.321933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf" event={"ID":"9582ec85-15ab-4015-bbf5-659c24d1071b","Type":"ContainerDied","Data":"e3990d24f5c3f1c5f6c8cb70da95e91871f7c37957c7044af987bba8c09f4fc1"}
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.321974 4786 scope.go:117] "RemoveContainer" containerID="d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677"
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.322174 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f45c4fb85-pfljf"
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.361140 4786 scope.go:117] "RemoveContainer" containerID="cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a"
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.370749 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f45c4fb85-pfljf"]
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.382740 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-6f45c4fb85-pfljf"]
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.397665 4786 scope.go:117] "RemoveContainer" containerID="d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677"
Mar 13 16:48:25 crc kubenswrapper[4786]: E0313 16:48:25.398116 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677\": container with ID starting with d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677 not found: ID does not exist" containerID="d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677"
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.398152 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677"} err="failed to get container status \"d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677\": rpc error: code = NotFound desc = could not find container \"d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677\": container with ID starting with d86d2a32277ff20f1330f481b07556dac3a05762132357c11962b76380787677 not found: ID does not exist"
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.398178 4786 scope.go:117] "RemoveContainer" containerID="cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a"
Mar 13 16:48:25 crc kubenswrapper[4786]: E0313 16:48:25.398522 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a\": container with ID starting with cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a not found: ID does not exist" containerID="cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a"
Mar 13 16:48:25 crc kubenswrapper[4786]: I0313 16:48:25.398585 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a"} err="failed to get container status \"cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a\": rpc error: code = NotFound desc = could not find container \"cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a\": container with ID starting with cb958f0c4e43b9113daa33dddbe68656bc8c5ea256984eaeb842d9f708bcf03a not found: ID does not exist"
Mar 13 16:48:26 crc kubenswrapper[4786]: I0313 16:48:26.573675 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" path="/var/lib/kubelet/pods/9582ec85-15ab-4015-bbf5-659c24d1071b/volumes"
Mar 13 16:48:34 crc kubenswrapper[4786]: I0313 16:48:34.553473 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d"
Mar 13 16:48:34 crc kubenswrapper[4786]: E0313 16:48:34.554113 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.249848 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-fpprt"]
Mar 13 16:48:35 crc kubenswrapper[4786]: E0313 16:48:35.250768 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerName="octavia-amphora-httpd"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.250817 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerName="octavia-amphora-httpd"
Mar 13 16:48:35 crc kubenswrapper[4786]: E0313 16:48:35.250842 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c96bc48-a941-4a7a-a113-904255c84492" containerName="oc"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.250856 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c96bc48-a941-4a7a-a113-904255c84492" containerName="oc"
Mar 13 16:48:35 crc kubenswrapper[4786]: E0313 16:48:35.250934 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerName="init"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.250948 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerName="init"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.251295 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c96bc48-a941-4a7a-a113-904255c84492" containerName="oc"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.251352 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9582ec85-15ab-4015-bbf5-659c24d1071b" containerName="octavia-amphora-httpd"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.253129 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.263579 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-fpprt"]
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.274371 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.274464 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.274579 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.416659 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-config-data\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.416728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-scripts\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.417447 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-hm-ports\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.417533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-combined-ca-bundle\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.417638 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-config-data-merged\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.417767 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-amphora-certs\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.519637 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-scripts\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.519831 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-hm-ports\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.519881 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-combined-ca-bundle\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.519916 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-config-data-merged\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.519972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-amphora-certs\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.520000 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-config-data\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.520929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-config-data-merged\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.521618 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-hm-ports\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.526940 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-combined-ca-bundle\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.528132 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-amphora-certs\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.528440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-config-data\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.534445 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a64beb5-4de2-40c1-9fc9-86d2f5b36048-scripts\") pod \"octavia-healthmanager-fpprt\" (UID: \"1a64beb5-4de2-40c1-9fc9-86d2f5b36048\") " pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:35 crc kubenswrapper[4786]: I0313 16:48:35.575382 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-fpprt"
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.209107 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-fpprt"]
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.473462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-fpprt" event={"ID":"1a64beb5-4de2-40c1-9fc9-86d2f5b36048","Type":"ContainerStarted","Data":"9e4fa5ca6f1e38c598634758e9a5a13d10a49f6be5402013329abb7c754c9f63"}
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.900588 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-vsftn"]
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.902249 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.908247 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.908368 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Mar 13 16:48:36 crc kubenswrapper[4786]: I0313 16:48:36.926559 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-vsftn"]
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.067542 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-scripts\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.068528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-config-data\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.068573 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-combined-ca-bundle\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.068618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e7bcea0d-ef91-4510-8698-dea274f82f83-hm-ports\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.068915 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-amphora-certs\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.068982 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e7bcea0d-ef91-4510-8698-dea274f82f83-config-data-merged\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.171119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-config-data\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.171181 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-combined-ca-bundle\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.171223 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e7bcea0d-ef91-4510-8698-dea274f82f83-hm-ports\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.171286 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-amphora-certs\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.171318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e7bcea0d-ef91-4510-8698-dea274f82f83-config-data-merged\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.171446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-scripts\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.172033 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e7bcea0d-ef91-4510-8698-dea274f82f83-config-data-merged\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.172397 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e7bcea0d-ef91-4510-8698-dea274f82f83-hm-ports\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.178788 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-config-data\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.180750 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-amphora-certs\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.181530 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-combined-ca-bundle\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.184212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7bcea0d-ef91-4510-8698-dea274f82f83-scripts\") pod \"octavia-housekeeping-vsftn\" (UID: \"e7bcea0d-ef91-4510-8698-dea274f82f83\") " pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.223749 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-vsftn"
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.491817 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-fpprt" event={"ID":"1a64beb5-4de2-40c1-9fc9-86d2f5b36048","Type":"ContainerStarted","Data":"a6d27ae71ce2763868cadee69f9138bbdadf1878dc755cae4a1e767a87100ea2"}
Mar 13 16:48:37 crc kubenswrapper[4786]: I0313 16:48:37.844901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-vsftn"]
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.503649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vsftn" event={"ID":"e7bcea0d-ef91-4510-8698-dea274f82f83","Type":"ContainerStarted","Data":"50c8055b31047969e3a9ba3b5840a8092fef9097537b9e618012dc5c6646d732"}
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.505292 4786 generic.go:334] "Generic (PLEG): container finished" podID="1a64beb5-4de2-40c1-9fc9-86d2f5b36048" containerID="a6d27ae71ce2763868cadee69f9138bbdadf1878dc755cae4a1e767a87100ea2" exitCode=0
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.505331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-fpprt" event={"ID":"1a64beb5-4de2-40c1-9fc9-86d2f5b36048","Type":"ContainerDied","Data":"a6d27ae71ce2763868cadee69f9138bbdadf1878dc755cae4a1e767a87100ea2"}
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.785956 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-flqbr"]
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.787830 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-flqbr"
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.790685 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts"
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.791321 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data"
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.799047 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-flqbr"]
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.904003 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-scripts\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr"
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.904071 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9de4aa0b-2f11-404c-9ae7-913912454f89-hm-ports\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr"
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.904179 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-config-data\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr"
Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.904223 4786 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-combined-ca-bundle\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.904252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9de4aa0b-2f11-404c-9ae7-913912454f89-config-data-merged\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:38 crc kubenswrapper[4786]: I0313 16:48:38.904293 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-amphora-certs\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.005650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-config-data\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.005717 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-combined-ca-bundle\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.005754 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" 
(UniqueName: \"kubernetes.io/empty-dir/9de4aa0b-2f11-404c-9ae7-913912454f89-config-data-merged\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.005786 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-amphora-certs\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.005832 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-scripts\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.005876 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9de4aa0b-2f11-404c-9ae7-913912454f89-hm-ports\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.006460 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9de4aa0b-2f11-404c-9ae7-913912454f89-config-data-merged\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.006921 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9de4aa0b-2f11-404c-9ae7-913912454f89-hm-ports\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " 
pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.009974 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-combined-ca-bundle\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.010098 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-scripts\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.010348 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-amphora-certs\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.011574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de4aa0b-2f11-404c-9ae7-913912454f89-config-data\") pod \"octavia-worker-flqbr\" (UID: \"9de4aa0b-2f11-404c-9ae7-913912454f89\") " pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.121839 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-flqbr" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.540956 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-fpprt" event={"ID":"1a64beb5-4de2-40c1-9fc9-86d2f5b36048","Type":"ContainerStarted","Data":"684bba48a1a6d737c1917c88437c07e54dc503f06c69eb7bc10942c38e8bf704"} Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.541268 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-fpprt" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.592431 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-fpprt" podStartSLOduration=4.59241425 podStartE2EDuration="4.59241425s" podCreationTimestamp="2026-03-13 16:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:48:39.585609649 +0000 UTC m=+6349.748821460" watchObservedRunningTime="2026-03-13 16:48:39.59241425 +0000 UTC m=+6349.755626061" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.630323 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-fpprt"] Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.765710 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjzww"] Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.769348 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.775194 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjzww"] Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.922940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-catalog-content\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.923256 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbsz\" (UniqueName: \"kubernetes.io/projected/6105e5ae-c666-4926-91d0-fb90af43e2aa-kube-api-access-lhbsz\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:39 crc kubenswrapper[4786]: I0313 16:48:39.923481 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-utilities\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.026053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-utilities\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.026434 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-catalog-content\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.026548 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-utilities\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.026556 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbsz\" (UniqueName: \"kubernetes.io/projected/6105e5ae-c666-4926-91d0-fb90af43e2aa-kube-api-access-lhbsz\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.026978 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-catalog-content\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.044349 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbsz\" (UniqueName: \"kubernetes.io/projected/6105e5ae-c666-4926-91d0-fb90af43e2aa-kube-api-access-lhbsz\") pod \"redhat-operators-gjzww\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.091158 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.158637 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-flqbr"] Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.549500 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vsftn" event={"ID":"e7bcea0d-ef91-4510-8698-dea274f82f83","Type":"ContainerStarted","Data":"5df12ea62ea69ab62eedf50dfcf497b8dfe35b0f920ef220945d10070d0e5b30"} Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.550479 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-flqbr" event={"ID":"9de4aa0b-2f11-404c-9ae7-913912454f89","Type":"ContainerStarted","Data":"2123a051b54c5ac6c3ade70474fbaa654a24ac46dce198a34dfbc48cfe19dae3"} Mar 13 16:48:40 crc kubenswrapper[4786]: I0313 16:48:40.624154 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjzww"] Mar 13 16:48:41 crc kubenswrapper[4786]: I0313 16:48:41.561714 4786 generic.go:334] "Generic (PLEG): container finished" podID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerID="ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71" exitCode=0 Mar 13 16:48:41 crc kubenswrapper[4786]: I0313 16:48:41.561853 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerDied","Data":"ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71"} Mar 13 16:48:41 crc kubenswrapper[4786]: I0313 16:48:41.562063 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerStarted","Data":"613bf6af9d02ce30b5a168f5b1ca729fa50ba87a042298d6d7397b6f5b784331"} Mar 13 16:48:41 crc kubenswrapper[4786]: I0313 16:48:41.563848 4786 
generic.go:334] "Generic (PLEG): container finished" podID="e7bcea0d-ef91-4510-8698-dea274f82f83" containerID="5df12ea62ea69ab62eedf50dfcf497b8dfe35b0f920ef220945d10070d0e5b30" exitCode=0 Mar 13 16:48:41 crc kubenswrapper[4786]: I0313 16:48:41.563926 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vsftn" event={"ID":"e7bcea0d-ef91-4510-8698-dea274f82f83","Type":"ContainerDied","Data":"5df12ea62ea69ab62eedf50dfcf497b8dfe35b0f920ef220945d10070d0e5b30"} Mar 13 16:48:42 crc kubenswrapper[4786]: I0313 16:48:42.575381 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vsftn" event={"ID":"e7bcea0d-ef91-4510-8698-dea274f82f83","Type":"ContainerStarted","Data":"bffe9dd365868eff78dd7c11a02a2466c4710db0061b6e69d79cdcaad4bbbf13"} Mar 13 16:48:42 crc kubenswrapper[4786]: I0313 16:48:42.576098 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-vsftn" Mar 13 16:48:42 crc kubenswrapper[4786]: I0313 16:48:42.578726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-flqbr" event={"ID":"9de4aa0b-2f11-404c-9ae7-913912454f89","Type":"ContainerStarted","Data":"c980734f5dd1b37358b0dd919ba81f66f93acc337cb66e2c5bd21aa7d3ebec14"} Mar 13 16:48:42 crc kubenswrapper[4786]: I0313 16:48:42.596890 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-vsftn" podStartSLOduration=4.885543477 podStartE2EDuration="6.596873183s" podCreationTimestamp="2026-03-13 16:48:36 +0000 UTC" firstStartedPulling="2026-03-13 16:48:37.820445361 +0000 UTC m=+6347.983657202" lastFinishedPulling="2026-03-13 16:48:39.531775107 +0000 UTC m=+6349.694986908" observedRunningTime="2026-03-13 16:48:42.592425071 +0000 UTC m=+6352.755636912" watchObservedRunningTime="2026-03-13 16:48:42.596873183 +0000 UTC m=+6352.760085004" Mar 13 16:48:43 crc kubenswrapper[4786]: I0313 16:48:43.596246 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerStarted","Data":"62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39"} Mar 13 16:48:43 crc kubenswrapper[4786]: I0313 16:48:43.600836 4786 generic.go:334] "Generic (PLEG): container finished" podID="9de4aa0b-2f11-404c-9ae7-913912454f89" containerID="c980734f5dd1b37358b0dd919ba81f66f93acc337cb66e2c5bd21aa7d3ebec14" exitCode=0 Mar 13 16:48:43 crc kubenswrapper[4786]: I0313 16:48:43.601088 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-flqbr" event={"ID":"9de4aa0b-2f11-404c-9ae7-913912454f89","Type":"ContainerDied","Data":"c980734f5dd1b37358b0dd919ba81f66f93acc337cb66e2c5bd21aa7d3ebec14"} Mar 13 16:48:44 crc kubenswrapper[4786]: I0313 16:48:44.613356 4786 generic.go:334] "Generic (PLEG): container finished" podID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerID="62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39" exitCode=0 Mar 13 16:48:44 crc kubenswrapper[4786]: I0313 16:48:44.613452 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerDied","Data":"62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39"} Mar 13 16:48:44 crc kubenswrapper[4786]: I0313 16:48:44.619261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-flqbr" event={"ID":"9de4aa0b-2f11-404c-9ae7-913912454f89","Type":"ContainerStarted","Data":"cad7542a52bcf1f1681b18668e9d604d5646cdcefb0c41cfbd3d0be21a938f55"} Mar 13 16:48:44 crc kubenswrapper[4786]: I0313 16:48:44.619738 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-flqbr" Mar 13 16:48:44 crc kubenswrapper[4786]: I0313 16:48:44.666771 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/octavia-worker-flqbr" podStartSLOduration=4.935224594 podStartE2EDuration="6.666752986s" podCreationTimestamp="2026-03-13 16:48:38 +0000 UTC" firstStartedPulling="2026-03-13 16:48:40.175055908 +0000 UTC m=+6350.338267709" lastFinishedPulling="2026-03-13 16:48:41.90658428 +0000 UTC m=+6352.069796101" observedRunningTime="2026-03-13 16:48:44.655820451 +0000 UTC m=+6354.819032262" watchObservedRunningTime="2026-03-13 16:48:44.666752986 +0000 UTC m=+6354.829964797" Mar 13 16:48:45 crc kubenswrapper[4786]: I0313 16:48:45.629460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerStarted","Data":"0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798"} Mar 13 16:48:45 crc kubenswrapper[4786]: I0313 16:48:45.653351 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjzww" podStartSLOduration=3.338713067 podStartE2EDuration="6.653295772s" podCreationTimestamp="2026-03-13 16:48:39 +0000 UTC" firstStartedPulling="2026-03-13 16:48:41.84409191 +0000 UTC m=+6352.007303731" lastFinishedPulling="2026-03-13 16:48:45.158674605 +0000 UTC m=+6355.321886436" observedRunningTime="2026-03-13 16:48:45.650662686 +0000 UTC m=+6355.813874517" watchObservedRunningTime="2026-03-13 16:48:45.653295772 +0000 UTC m=+6355.816507603" Mar 13 16:48:46 crc kubenswrapper[4786]: I0313 16:48:46.552488 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:48:46 crc kubenswrapper[4786]: E0313 16:48:46.553187 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:48:50 crc kubenswrapper[4786]: I0313 16:48:50.092275 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:50 crc kubenswrapper[4786]: I0313 16:48:50.092356 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:48:50 crc kubenswrapper[4786]: I0313 16:48:50.637761 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-fpprt" Mar 13 16:48:51 crc kubenswrapper[4786]: I0313 16:48:51.158647 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjzww" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="registry-server" probeResult="failure" output=< Mar 13 16:48:51 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:48:51 crc kubenswrapper[4786]: > Mar 13 16:48:52 crc kubenswrapper[4786]: I0313 16:48:52.261023 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-vsftn" Mar 13 16:48:54 crc kubenswrapper[4786]: I0313 16:48:54.162422 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-flqbr" Mar 13 16:49:00 crc kubenswrapper[4786]: I0313 16:49:00.147275 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:49:00 crc kubenswrapper[4786]: I0313 16:49:00.236815 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:49:00 crc kubenswrapper[4786]: I0313 16:49:00.397835 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjzww"] Mar 13 16:49:00 
crc kubenswrapper[4786]: I0313 16:49:00.579077 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:49:00 crc kubenswrapper[4786]: E0313 16:49:00.580185 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.831690 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjzww" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="registry-server" containerID="cri-o://0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798" gracePeriod=2 Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.939354 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75b76b5d9f-744t6"] Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.941281 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.943905 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8hc9x" Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.944162 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.944308 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.944462 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 13 16:49:01 crc kubenswrapper[4786]: I0313 16:49:01.952817 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b76b5d9f-744t6"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.002669 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-scripts\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.002804 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnl9j\" (UniqueName: \"kubernetes.io/projected/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-kube-api-access-fnl9j\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.002924 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-config-data\") pod \"horizon-75b76b5d9f-744t6\" (UID: 
\"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.003096 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-logs\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.003150 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-horizon-secret-key\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.027776 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-db5d78b6f-28x6k"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.029479 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.041906 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.042111 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-log" containerID="cri-o://47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4" gracePeriod=30 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.042250 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-httpd" containerID="cri-o://adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3" gracePeriod=30 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.076034 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-db5d78b6f-28x6k"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.105700 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-scripts\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.105762 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-config-data\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.105808 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fnl9j\" (UniqueName: \"kubernetes.io/projected/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-kube-api-access-fnl9j\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.105834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-config-data\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.105968 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28drl\" (UniqueName: \"kubernetes.io/projected/2a05c8cc-9cde-4ed9-b64d-5da623919974-kube-api-access-28drl\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.105996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a05c8cc-9cde-4ed9-b64d-5da623919974-logs\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.106030 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-logs\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.106049 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-horizon-secret-key\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.106073 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a05c8cc-9cde-4ed9-b64d-5da623919974-horizon-secret-key\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.106109 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-scripts\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.107154 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-scripts\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.107376 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-logs\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.108938 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-config-data\") pod \"horizon-75b76b5d9f-744t6\" (UID: 
\"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.123756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-horizon-secret-key\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.133398 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnl9j\" (UniqueName: \"kubernetes.io/projected/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-kube-api-access-fnl9j\") pod \"horizon-75b76b5d9f-744t6\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.133679 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.133984 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-log" containerID="cri-o://8c4fa795b62836e81bef5de48e5fd6b377d23dcdb4cce31d06972c7019b0768c" gracePeriod=30 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.134499 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-httpd" containerID="cri-o://f7a70fcbb6a61b06b3a6852bfd8d57d0a920f0c9db536cf4231324f60c45c7ec" gracePeriod=30 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.209270 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28drl\" (UniqueName: 
\"kubernetes.io/projected/2a05c8cc-9cde-4ed9-b64d-5da623919974-kube-api-access-28drl\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.209718 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a05c8cc-9cde-4ed9-b64d-5da623919974-logs\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.209852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a05c8cc-9cde-4ed9-b64d-5da623919974-horizon-secret-key\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.209933 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-scripts\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.210094 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-config-data\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.211533 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-config-data\") pod \"horizon-db5d78b6f-28x6k\" (UID: 
\"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.212161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-scripts\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.214693 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a05c8cc-9cde-4ed9-b64d-5da623919974-horizon-secret-key\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.217356 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a05c8cc-9cde-4ed9-b64d-5da623919974-logs\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.230212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28drl\" (UniqueName: \"kubernetes.io/projected/2a05c8cc-9cde-4ed9-b64d-5da623919974-kube-api-access-28drl\") pod \"horizon-db5d78b6f-28x6k\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.378951 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.404944 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.419391 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.515440 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhbsz\" (UniqueName: \"kubernetes.io/projected/6105e5ae-c666-4926-91d0-fb90af43e2aa-kube-api-access-lhbsz\") pod \"6105e5ae-c666-4926-91d0-fb90af43e2aa\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.515741 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-catalog-content\") pod \"6105e5ae-c666-4926-91d0-fb90af43e2aa\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.515804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-utilities\") pod \"6105e5ae-c666-4926-91d0-fb90af43e2aa\" (UID: \"6105e5ae-c666-4926-91d0-fb90af43e2aa\") " Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.517096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-utilities" (OuterVolumeSpecName: "utilities") pod "6105e5ae-c666-4926-91d0-fb90af43e2aa" (UID: "6105e5ae-c666-4926-91d0-fb90af43e2aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.520474 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6105e5ae-c666-4926-91d0-fb90af43e2aa-kube-api-access-lhbsz" (OuterVolumeSpecName: "kube-api-access-lhbsz") pod "6105e5ae-c666-4926-91d0-fb90af43e2aa" (UID: "6105e5ae-c666-4926-91d0-fb90af43e2aa"). InnerVolumeSpecName "kube-api-access-lhbsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.621040 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhbsz\" (UniqueName: \"kubernetes.io/projected/6105e5ae-c666-4926-91d0-fb90af43e2aa-kube-api-access-lhbsz\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.621067 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.665816 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6105e5ae-c666-4926-91d0-fb90af43e2aa" (UID: "6105e5ae-c666-4926-91d0-fb90af43e2aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.723279 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6105e5ae-c666-4926-91d0-fb90af43e2aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.841289 4786 generic.go:334] "Generic (PLEG): container finished" podID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerID="47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4" exitCode=143 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.841388 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9df4880-8f92-48ed-b070-b4bb67f8743a","Type":"ContainerDied","Data":"47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4"} Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.844732 4786 generic.go:334] "Generic (PLEG): container finished" podID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerID="0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798" exitCode=0 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.844810 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerDied","Data":"0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798"} Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.844813 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjzww" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.844876 4786 scope.go:117] "RemoveContainer" containerID="0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.844847 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjzww" event={"ID":"6105e5ae-c666-4926-91d0-fb90af43e2aa","Type":"ContainerDied","Data":"613bf6af9d02ce30b5a168f5b1ca729fa50ba87a042298d6d7397b6f5b784331"} Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.848279 4786 generic.go:334] "Generic (PLEG): container finished" podID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerID="8c4fa795b62836e81bef5de48e5fd6b377d23dcdb4cce31d06972c7019b0768c" exitCode=143 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.848307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"295aee2e-3c33-4ab5-a840-a92aa2fea90a","Type":"ContainerDied","Data":"8c4fa795b62836e81bef5de48e5fd6b377d23dcdb4cce31d06972c7019b0768c"} Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.883706 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjzww"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.890296 4786 scope.go:117] "RemoveContainer" containerID="62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.895163 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjzww"] Mar 13 16:49:02 crc kubenswrapper[4786]: W0313 16:49:02.898017 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a05c8cc_9cde_4ed9_b64d_5da623919974.slice/crio-49d75e462d7b40582311f5760554209936420a6d9d6ecfa8694b1d43869a19bd WatchSource:0}: Error 
finding container 49d75e462d7b40582311f5760554209936420a6d9d6ecfa8694b1d43869a19bd: Status 404 returned error can't find the container with id 49d75e462d7b40582311f5760554209936420a6d9d6ecfa8694b1d43869a19bd Mar 13 16:49:02 crc kubenswrapper[4786]: W0313 16:49:02.900488 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60d99106_40c9_42cd_8ecc_3d552b0f5c8e.slice/crio-11cce582e3d68162e76b5ba2930ccdfb7529b9488570c9f474356626499074b8 WatchSource:0}: Error finding container 11cce582e3d68162e76b5ba2930ccdfb7529b9488570c9f474356626499074b8: Status 404 returned error can't find the container with id 11cce582e3d68162e76b5ba2930ccdfb7529b9488570c9f474356626499074b8 Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.907402 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-db5d78b6f-28x6k"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.917262 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b76b5d9f-744t6"] Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.926483 4786 scope.go:117] "RemoveContainer" containerID="ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.955584 4786 scope.go:117] "RemoveContainer" containerID="0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798" Mar 13 16:49:02 crc kubenswrapper[4786]: E0313 16:49:02.956772 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798\": container with ID starting with 0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798 not found: ID does not exist" containerID="0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.956801 4786 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798"} err="failed to get container status \"0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798\": rpc error: code = NotFound desc = could not find container \"0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798\": container with ID starting with 0651eea96b82659d1c4c1c2504265f73d2b8677d0ad124c7a2e6fde574c67798 not found: ID does not exist" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.956822 4786 scope.go:117] "RemoveContainer" containerID="62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39" Mar 13 16:49:02 crc kubenswrapper[4786]: E0313 16:49:02.957359 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39\": container with ID starting with 62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39 not found: ID does not exist" containerID="62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.957409 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39"} err="failed to get container status \"62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39\": rpc error: code = NotFound desc = could not find container \"62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39\": container with ID starting with 62bc75dcbeca2c828abdd2b3b26cea5f55ecdd3378a2f92e0bbf62e83af61a39 not found: ID does not exist" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.957489 4786 scope.go:117] "RemoveContainer" containerID="ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71" Mar 13 16:49:02 crc kubenswrapper[4786]: E0313 16:49:02.958355 4786 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71\": container with ID starting with ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71 not found: ID does not exist" containerID="ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71" Mar 13 16:49:02 crc kubenswrapper[4786]: I0313 16:49:02.958396 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71"} err="failed to get container status \"ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71\": rpc error: code = NotFound desc = could not find container \"ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71\": container with ID starting with ca252105e3ecf2d6e5c0b1ff24753cdfb1749a9bcd112fdd947ea70ea6708c71 not found: ID does not exist" Mar 13 16:49:03 crc kubenswrapper[4786]: I0313 16:49:03.867761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b76b5d9f-744t6" event={"ID":"60d99106-40c9-42cd-8ecc-3d552b0f5c8e","Type":"ContainerStarted","Data":"11cce582e3d68162e76b5ba2930ccdfb7529b9488570c9f474356626499074b8"} Mar 13 16:49:03 crc kubenswrapper[4786]: I0313 16:49:03.870429 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db5d78b6f-28x6k" event={"ID":"2a05c8cc-9cde-4ed9-b64d-5da623919974","Type":"ContainerStarted","Data":"49d75e462d7b40582311f5760554209936420a6d9d6ecfa8694b1d43869a19bd"} Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.267130 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b76b5d9f-744t6"] Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.305038 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-745cccdd94-hp7vs"] Mar 13 16:49:04 crc kubenswrapper[4786]: E0313 16:49:04.305477 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="extract-content" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.305495 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="extract-content" Mar 13 16:49:04 crc kubenswrapper[4786]: E0313 16:49:04.305507 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="registry-server" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.305515 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="registry-server" Mar 13 16:49:04 crc kubenswrapper[4786]: E0313 16:49:04.305528 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="extract-utilities" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.305535 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="extract-utilities" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.305698 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" containerName="registry-server" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.306813 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.314891 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.325059 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-745cccdd94-hp7vs"] Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.383094 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-db5d78b6f-28x6k"] Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.420763 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6667bbdf64-hcxzc"] Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.422462 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.440074 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6667bbdf64-hcxzc"] Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470263 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8hjp\" (UniqueName: \"kubernetes.io/projected/76e917bb-519d-4e0f-ba89-2272b7642137-kube-api-access-b8hjp\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470314 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-tls-certs\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470406 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-scripts\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-config-data\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470475 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e917bb-519d-4e0f-ba89-2272b7642137-logs\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-secret-key\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.470610 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-combined-ca-bundle\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.565402 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6105e5ae-c666-4926-91d0-fb90af43e2aa" path="/var/lib/kubelet/pods/6105e5ae-c666-4926-91d0-fb90af43e2aa/volumes" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572600 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-config-data\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572652 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e917bb-519d-4e0f-ba89-2272b7642137-logs\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572687 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-tls-certs\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-secret-key\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572747 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtz95\" (UniqueName: \"kubernetes.io/projected/a2213440-3009-4db0-a1c4-7d5d4a12481b-kube-api-access-wtz95\") pod \"horizon-6667bbdf64-hcxzc\" (UID: 
\"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572781 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-combined-ca-bundle\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572816 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-combined-ca-bundle\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572874 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8hjp\" (UniqueName: \"kubernetes.io/projected/76e917bb-519d-4e0f-ba89-2272b7642137-kube-api-access-b8hjp\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572897 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-tls-certs\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572919 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2213440-3009-4db0-a1c4-7d5d4a12481b-logs\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " 
pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-config-data\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.572970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-secret-key\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.573007 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-scripts\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.573025 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-scripts\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.574438 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-scripts\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.574761 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e917bb-519d-4e0f-ba89-2272b7642137-logs\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.575112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-config-data\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.579104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-tls-certs\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.579372 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-secret-key\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.580117 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-combined-ca-bundle\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.588030 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8hjp\" (UniqueName: \"kubernetes.io/projected/76e917bb-519d-4e0f-ba89-2272b7642137-kube-api-access-b8hjp\") pod \"horizon-745cccdd94-hp7vs\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.625472 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-745cccdd94-hp7vs"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtz95\" (UniqueName: \"kubernetes.io/projected/a2213440-3009-4db0-a1c4-7d5d4a12481b-kube-api-access-wtz95\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675197 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-combined-ca-bundle\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2213440-3009-4db0-a1c4-7d5d4a12481b-logs\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675292 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-config-data\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-secret-key\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675361 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-scripts\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.675397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-tls-certs\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.676216 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2213440-3009-4db0-a1c4-7d5d4a12481b-logs\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.678112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-config-data\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.678939 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-scripts\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.680398 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-combined-ca-bundle\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.689323 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-secret-key\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.691660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtz95\" (UniqueName: \"kubernetes.io/projected/a2213440-3009-4db0-a1c4-7d5d4a12481b-kube-api-access-wtz95\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.694054 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-tls-certs\") pod \"horizon-6667bbdf64-hcxzc\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") " pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:04 crc kubenswrapper[4786]: I0313 16:49:04.750342 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.168977 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6667bbdf64-hcxzc"]
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.195789 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-745cccdd94-hp7vs"]
Mar 13 16:49:05 crc kubenswrapper[4786]: W0313 16:49:05.200997 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76e917bb_519d_4e0f_ba89_2272b7642137.slice/crio-6be3b47e39471a8f63dba9eb90c0adfb3668e55c31403a1dc7b795a19b403da3 WatchSource:0}: Error finding container 6be3b47e39471a8f63dba9eb90c0adfb3668e55c31403a1dc7b795a19b403da3: Status 404 returned error can't find the container with id 6be3b47e39471a8f63dba9eb90c0adfb3668e55c31403a1dc7b795a19b403da3
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.847238 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.914785 4786 generic.go:334] "Generic (PLEG): container finished" podID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerID="adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3" exitCode=0
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.914852 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9df4880-8f92-48ed-b070-b4bb67f8743a","Type":"ContainerDied","Data":"adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3"}
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.914895 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a9df4880-8f92-48ed-b070-b4bb67f8743a","Type":"ContainerDied","Data":"1c4ee6ea70e0ce40acceba7b3759439662cdf4050014d73b9cab3edc71a497c9"}
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.914954 4786 scope.go:117] "RemoveContainer" containerID="adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3"
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.915093 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.937152 4786 generic.go:334] "Generic (PLEG): container finished" podID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerID="f7a70fcbb6a61b06b3a6852bfd8d57d0a920f0c9db536cf4231324f60c45c7ec" exitCode=0
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.937228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"295aee2e-3c33-4ab5-a840-a92aa2fea90a","Type":"ContainerDied","Data":"f7a70fcbb6a61b06b3a6852bfd8d57d0a920f0c9db536cf4231324f60c45c7ec"}
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.937301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"295aee2e-3c33-4ab5-a840-a92aa2fea90a","Type":"ContainerDied","Data":"1f177cfe0684c2369568eca6dcef3e9cc375721de746e51ab8b8bcc38756cada"}
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.937316 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f177cfe0684c2369568eca6dcef3e9cc375721de746e51ab8b8bcc38756cada"
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.947348 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667bbdf64-hcxzc" event={"ID":"a2213440-3009-4db0-a1c4-7d5d4a12481b","Type":"ContainerStarted","Data":"f32e7fdc3358b37e56e73572ebd753139df3204e7a4da38e21f1a41e80c234cd"}
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.957829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745cccdd94-hp7vs" event={"ID":"76e917bb-519d-4e0f-ba89-2272b7642137","Type":"ContainerStarted","Data":"6be3b47e39471a8f63dba9eb90c0adfb3668e55c31403a1dc7b795a19b403da3"}
Mar 13 16:49:05 crc kubenswrapper[4786]: I0313 16:49:05.989642 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000133 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-config-data\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000172 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-httpd-run\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000310 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-logs\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000339 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-scripts\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000389 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-public-tls-certs\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000432 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-combined-ca-bundle\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27v95\" (UniqueName: \"kubernetes.io/projected/a9df4880-8f92-48ed-b070-b4bb67f8743a-kube-api-access-27v95\") pod \"a9df4880-8f92-48ed-b070-b4bb67f8743a\" (UID: \"a9df4880-8f92-48ed-b070-b4bb67f8743a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000797 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-logs" (OuterVolumeSpecName: "logs") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.001326 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-logs\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.000645 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.015228 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-scripts" (OuterVolumeSpecName: "scripts") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.015367 4786 scope.go:117] "RemoveContainer" containerID="47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.032810 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9df4880-8f92-48ed-b070-b4bb67f8743a-kube-api-access-27v95" (OuterVolumeSpecName: "kube-api-access-27v95") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "kube-api-access-27v95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.055333 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.073810 4786 scope.go:117] "RemoveContainer" containerID="adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3"
Mar 13 16:49:06 crc kubenswrapper[4786]: E0313 16:49:06.075021 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3\": container with ID starting with adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3 not found: ID does not exist" containerID="adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.075067 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3"} err="failed to get container status \"adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3\": rpc error: code = NotFound desc = could not find container \"adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3\": container with ID starting with adaeaec80930993c89a63df7a3c24f3fc45780f64d1f0ee0ea0dc2d75eb319c3 not found: ID does not exist"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.075095 4786 scope.go:117] "RemoveContainer" containerID="47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4"
Mar 13 16:49:06 crc kubenswrapper[4786]: E0313 16:49:06.076834 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4\": container with ID starting with 47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4 not found: ID does not exist" containerID="47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.076892 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4"} err="failed to get container status \"47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4\": rpc error: code = NotFound desc = could not find container \"47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4\": container with ID starting with 47c5f03c5a2e0800ff272bd0503733121b412825902e5a6017903472cdb9add4 not found: ID does not exist"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102362 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pwgt\" (UniqueName: \"kubernetes.io/projected/295aee2e-3c33-4ab5-a840-a92aa2fea90a-kube-api-access-7pwgt\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102415 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-combined-ca-bundle\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102459 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-httpd-run\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102517 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-internal-tls-certs\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-config-data\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102603 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-scripts\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.102736 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-logs\") pod \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\" (UID: \"295aee2e-3c33-4ab5-a840-a92aa2fea90a\") "
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.103883 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.104168 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27v95\" (UniqueName: \"kubernetes.io/projected/a9df4880-8f92-48ed-b070-b4bb67f8743a-kube-api-access-27v95\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.104188 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9df4880-8f92-48ed-b070-b4bb67f8743a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.104197 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.104206 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.104214 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.104919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-logs" (OuterVolumeSpecName: "logs") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.106665 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-scripts" (OuterVolumeSpecName: "scripts") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.107185 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295aee2e-3c33-4ab5-a840-a92aa2fea90a-kube-api-access-7pwgt" (OuterVolumeSpecName: "kube-api-access-7pwgt") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "kube-api-access-7pwgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.121425 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.127466 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-config-data" (OuterVolumeSpecName: "config-data") pod "a9df4880-8f92-48ed-b070-b4bb67f8743a" (UID: "a9df4880-8f92-48ed-b070-b4bb67f8743a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.137099 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.161001 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.168813 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-config-data" (OuterVolumeSpecName: "config-data") pod "295aee2e-3c33-4ab5-a840-a92aa2fea90a" (UID: "295aee2e-3c33-4ab5-a840-a92aa2fea90a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205722 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205754 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205763 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205771 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205779 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/295aee2e-3c33-4ab5-a840-a92aa2fea90a-logs\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205787 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pwgt\" (UniqueName: \"kubernetes.io/projected/295aee2e-3c33-4ab5-a840-a92aa2fea90a-kube-api-access-7pwgt\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205797 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295aee2e-3c33-4ab5-a840-a92aa2fea90a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.205806 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df4880-8f92-48ed-b070-b4bb67f8743a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.250872 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.270934 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.282519 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 16:49:06 crc kubenswrapper[4786]: E0313 16:49:06.283489 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-log"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283508 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-log"
Mar 13 16:49:06 crc kubenswrapper[4786]: E0313 16:49:06.283534 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-log"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283539 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-log"
Mar 13 16:49:06 crc kubenswrapper[4786]: E0313 16:49:06.283553 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-httpd"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283560 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-httpd"
Mar 13 16:49:06 crc kubenswrapper[4786]: E0313 16:49:06.283573 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-httpd"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283579 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-httpd"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283749 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-log"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283761 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-httpd"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283776 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" containerName="glance-log"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.283787 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" containerName="glance-httpd"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.284850 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.286835 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.289220 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.289534 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408403 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc05c795-f8c5-42ef-a000-1932740ca77a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408561 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408676 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408730 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc05c795-f8c5-42ef-a000-1932740ca77a-logs\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.408774 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhc82\" (UniqueName: \"kubernetes.io/projected/dc05c795-f8c5-42ef-a000-1932740ca77a-kube-api-access-dhc82\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.509892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.509938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.509969 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc05c795-f8c5-42ef-a000-1932740ca77a-logs\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.510017 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhc82\" (UniqueName: \"kubernetes.io/projected/dc05c795-f8c5-42ef-a000-1932740ca77a-kube-api-access-dhc82\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.510053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.510101 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc05c795-f8c5-42ef-a000-1932740ca77a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0"
Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.510134 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.510854 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc05c795-f8c5-42ef-a000-1932740ca77a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.511267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc05c795-f8c5-42ef-a000-1932740ca77a-logs\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.514659 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.515953 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.516399 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.517227 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc05c795-f8c5-42ef-a000-1932740ca77a-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.526660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhc82\" (UniqueName: \"kubernetes.io/projected/dc05c795-f8c5-42ef-a000-1932740ca77a-kube-api-access-dhc82\") pod \"glance-default-external-api-0\" (UID: \"dc05c795-f8c5-42ef-a000-1932740ca77a\") " pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.565114 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9df4880-8f92-48ed-b070-b4bb67f8743a" path="/var/lib/kubelet/pods/a9df4880-8f92-48ed-b070-b4bb67f8743a/volumes" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.610581 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 16:49:06 crc kubenswrapper[4786]: I0313 16:49:06.968770 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.008370 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.020509 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.063373 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.064945 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.067271 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.067876 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.089658 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.124208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-logs\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.124521 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9nk\" (UniqueName: \"kubernetes.io/projected/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-kube-api-access-xn9nk\") pod \"glance-default-internal-api-0\" (UID: 
\"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.124555 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.124695 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.124788 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.125087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.125197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227104 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9nk\" (UniqueName: \"kubernetes.io/projected/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-kube-api-access-xn9nk\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227363 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227399 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227436 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227480 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227514 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.227552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-logs\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.228072 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-logs\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.228809 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.232390 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 
16:49:07.232732 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.238319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.239706 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.242632 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9nk\" (UniqueName: \"kubernetes.io/projected/f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4-kube-api-access-xn9nk\") pod \"glance-default-internal-api-0\" (UID: \"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4\") " pod="openstack/glance-default-internal-api-0" Mar 13 16:49:07 crc kubenswrapper[4786]: I0313 16:49:07.384478 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:08 crc kubenswrapper[4786]: I0313 16:49:08.570719 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295aee2e-3c33-4ab5-a840-a92aa2fea90a" path="/var/lib/kubelet/pods/295aee2e-3c33-4ab5-a840-a92aa2fea90a/volumes" Mar 13 16:49:11 crc kubenswrapper[4786]: I0313 16:49:11.552734 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:49:11 crc kubenswrapper[4786]: I0313 16:49:11.831967 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 16:49:11 crc kubenswrapper[4786]: W0313 16:49:11.856843 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e03eb4_2eeb_40a3_8783_c75c5e55c4b4.slice/crio-01745a6ae34598b1bc7762a5cc3d8e31a90fec464e53b79d3251912d54d45904 WatchSource:0}: Error finding container 01745a6ae34598b1bc7762a5cc3d8e31a90fec464e53b79d3251912d54d45904: Status 404 returned error can't find the container with id 01745a6ae34598b1bc7762a5cc3d8e31a90fec464e53b79d3251912d54d45904 Mar 13 16:49:11 crc kubenswrapper[4786]: I0313 16:49:11.928584 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 16:49:11 crc kubenswrapper[4786]: W0313 16:49:11.935101 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc05c795_f8c5_42ef_a000_1932740ca77a.slice/crio-b49085e1c3d487db7ed5c940e36bfd3ea8dfb3d0d2118f9399ea180dd62bbb12 WatchSource:0}: Error finding container b49085e1c3d487db7ed5c940e36bfd3ea8dfb3d0d2118f9399ea180dd62bbb12: Status 404 returned error can't find the container with id b49085e1c3d487db7ed5c940e36bfd3ea8dfb3d0d2118f9399ea180dd62bbb12 Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.019648 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-db5d78b6f-28x6k" event={"ID":"2a05c8cc-9cde-4ed9-b64d-5da623919974","Type":"ContainerStarted","Data":"8f0fe444a1c45cf8e0812c73c9fc2538f2443cc0198b4ca37e0955cec46fabd8"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.019691 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db5d78b6f-28x6k" event={"ID":"2a05c8cc-9cde-4ed9-b64d-5da623919974","Type":"ContainerStarted","Data":"41424f54151ea8ee0bf4319ce995282fdb064cde970f67a82ed5e1cc219bcd94"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.019789 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-db5d78b6f-28x6k" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon-log" containerID="cri-o://41424f54151ea8ee0bf4319ce995282fdb064cde970f67a82ed5e1cc219bcd94" gracePeriod=30 Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.020260 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-db5d78b6f-28x6k" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon" containerID="cri-o://8f0fe444a1c45cf8e0812c73c9fc2538f2443cc0198b4ca37e0955cec46fabd8" gracePeriod=30 Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.022082 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4","Type":"ContainerStarted","Data":"01745a6ae34598b1bc7762a5cc3d8e31a90fec464e53b79d3251912d54d45904"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.029848 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b76b5d9f-744t6" event={"ID":"60d99106-40c9-42cd-8ecc-3d552b0f5c8e","Type":"ContainerStarted","Data":"57754a3f0a06d7aeb70a87b5df72b1e27c37ad8e2ca789dc877afeba763a9fb0"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.029933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-75b76b5d9f-744t6" event={"ID":"60d99106-40c9-42cd-8ecc-3d552b0f5c8e","Type":"ContainerStarted","Data":"f83e30501cafb5bc87b2ad745afa98f7f98281e257009c4ce91bd3bb235f481a"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.030101 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75b76b5d9f-744t6" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon-log" containerID="cri-o://f83e30501cafb5bc87b2ad745afa98f7f98281e257009c4ce91bd3bb235f481a" gracePeriod=30 Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.030244 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75b76b5d9f-744t6" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon" containerID="cri-o://57754a3f0a06d7aeb70a87b5df72b1e27c37ad8e2ca789dc877afeba763a9fb0" gracePeriod=30 Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.033418 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc05c795-f8c5-42ef-a000-1932740ca77a","Type":"ContainerStarted","Data":"b49085e1c3d487db7ed5c940e36bfd3ea8dfb3d0d2118f9399ea180dd62bbb12"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.035713 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745cccdd94-hp7vs" event={"ID":"76e917bb-519d-4e0f-ba89-2272b7642137","Type":"ContainerStarted","Data":"74b79f22d7aea4a3cd956a3aa73064e79120bf38391c6bdcc073913e20621c65"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.035739 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745cccdd94-hp7vs" event={"ID":"76e917bb-519d-4e0f-ba89-2272b7642137","Type":"ContainerStarted","Data":"8c1fb2bbc08ee24a5a123583a026fdde9f6c51c761de56a3acc15ccc34a8d3a3"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.038981 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667bbdf64-hcxzc" 
event={"ID":"a2213440-3009-4db0-a1c4-7d5d4a12481b","Type":"ContainerStarted","Data":"586157e2794d58c3593b5f8272f7d0fc627737484ca31706ba7d96225f6404e0"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.039013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667bbdf64-hcxzc" event={"ID":"a2213440-3009-4db0-a1c4-7d5d4a12481b","Type":"ContainerStarted","Data":"028ac99db632f4107ca629da7a995de2180c047ada30e587a00d63b7fabc1ae2"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.041896 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"b49392c99cf5e104c67fff6a8b879c097bd7fea9986e8b283b323621dcd6d857"} Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.047548 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-db5d78b6f-28x6k" podStartSLOduration=2.609028477 podStartE2EDuration="11.047535855s" podCreationTimestamp="2026-03-13 16:49:01 +0000 UTC" firstStartedPulling="2026-03-13 16:49:02.899938462 +0000 UTC m=+6373.063150273" lastFinishedPulling="2026-03-13 16:49:11.33844585 +0000 UTC m=+6381.501657651" observedRunningTime="2026-03-13 16:49:12.044991711 +0000 UTC m=+6382.208203522" watchObservedRunningTime="2026-03-13 16:49:12.047535855 +0000 UTC m=+6382.210747666" Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.075301 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75b76b5d9f-744t6" podStartSLOduration=2.642210671 podStartE2EDuration="11.075284042s" podCreationTimestamp="2026-03-13 16:49:01 +0000 UTC" firstStartedPulling="2026-03-13 16:49:02.903962003 +0000 UTC m=+6373.067173814" lastFinishedPulling="2026-03-13 16:49:11.337035374 +0000 UTC m=+6381.500247185" observedRunningTime="2026-03-13 16:49:12.067379183 +0000 UTC m=+6382.230590994" watchObservedRunningTime="2026-03-13 
16:49:12.075284042 +0000 UTC m=+6382.238495853" Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.095618 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-745cccdd94-hp7vs" podStartSLOduration=1.901477853 podStartE2EDuration="8.095593442s" podCreationTimestamp="2026-03-13 16:49:04 +0000 UTC" firstStartedPulling="2026-03-13 16:49:05.203369864 +0000 UTC m=+6375.366581675" lastFinishedPulling="2026-03-13 16:49:11.397485463 +0000 UTC m=+6381.560697264" observedRunningTime="2026-03-13 16:49:12.085994521 +0000 UTC m=+6382.249206342" watchObservedRunningTime="2026-03-13 16:49:12.095593442 +0000 UTC m=+6382.258805253" Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.124784 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6667bbdf64-hcxzc" podStartSLOduration=1.9009323089999999 podStartE2EDuration="8.124766075s" podCreationTimestamp="2026-03-13 16:49:04 +0000 UTC" firstStartedPulling="2026-03-13 16:49:05.175243587 +0000 UTC m=+6375.338455398" lastFinishedPulling="2026-03-13 16:49:11.399077353 +0000 UTC m=+6381.562289164" observedRunningTime="2026-03-13 16:49:12.11024565 +0000 UTC m=+6382.273457461" watchObservedRunningTime="2026-03-13 16:49:12.124766075 +0000 UTC m=+6382.287977886" Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.379104 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:12 crc kubenswrapper[4786]: I0313 16:49:12.405560 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:13 crc kubenswrapper[4786]: I0313 16:49:13.064651 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc05c795-f8c5-42ef-a000-1932740ca77a","Type":"ContainerStarted","Data":"e2c1ca547a1d515263251038d39fbf313b5ae1ea6668f5f436ce065af29c259b"} Mar 13 16:49:13 crc kubenswrapper[4786]: 
I0313 16:49:13.073695 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4","Type":"ContainerStarted","Data":"b1e32f2a0ca4b84f177dd5de02f70eb8b389e07cb20266a112902b9d0b13f743"} Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.088588 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc05c795-f8c5-42ef-a000-1932740ca77a","Type":"ContainerStarted","Data":"b8c30451a72f8d322c2a1bca4b379f9a494b944c1196043e74f1788fa8a01537"} Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.098174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4","Type":"ContainerStarted","Data":"194e201ff82272bda6d298fe3160a85088c88b28f03f48f5d9027533f70d86f0"} Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.137377 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.137358468 podStartE2EDuration="8.137358468s" podCreationTimestamp="2026-03-13 16:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:49:14.118450673 +0000 UTC m=+6384.281662494" watchObservedRunningTime="2026-03-13 16:49:14.137358468 +0000 UTC m=+6384.300570279" Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.153784 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.15376573 podStartE2EDuration="8.15376573s" podCreationTimestamp="2026-03-13 16:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:49:14.143768769 +0000 UTC m=+6384.306980580" watchObservedRunningTime="2026-03-13 
16:49:14.15376573 +0000 UTC m=+6384.316977541" Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.626010 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.626331 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.751513 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:14 crc kubenswrapper[4786]: I0313 16:49:14.752830 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:16 crc kubenswrapper[4786]: I0313 16:49:16.322965 4786 scope.go:117] "RemoveContainer" containerID="f7a70fcbb6a61b06b3a6852bfd8d57d0a920f0c9db536cf4231324f60c45c7ec" Mar 13 16:49:16 crc kubenswrapper[4786]: I0313 16:49:16.350801 4786 scope.go:117] "RemoveContainer" containerID="8c4fa795b62836e81bef5de48e5fd6b377d23dcdb4cce31d06972c7019b0768c" Mar 13 16:49:16 crc kubenswrapper[4786]: I0313 16:49:16.611618 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 16:49:16 crc kubenswrapper[4786]: I0313 16:49:16.611885 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 16:49:16 crc kubenswrapper[4786]: I0313 16:49:16.652363 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 16:49:16 crc kubenswrapper[4786]: I0313 16:49:16.662143 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 16:49:17 crc kubenswrapper[4786]: I0313 16:49:17.130093 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Mar 13 16:49:17 crc kubenswrapper[4786]: I0313 16:49:17.130333 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 16:49:17 crc kubenswrapper[4786]: I0313 16:49:17.385395 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:17 crc kubenswrapper[4786]: I0313 16:49:17.385458 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:17 crc kubenswrapper[4786]: I0313 16:49:17.422820 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:17 crc kubenswrapper[4786]: I0313 16:49:17.436049 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:18 crc kubenswrapper[4786]: I0313 16:49:18.137962 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:18 crc kubenswrapper[4786]: I0313 16:49:18.138363 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:19 crc kubenswrapper[4786]: I0313 16:49:19.144416 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 16:49:19 crc kubenswrapper[4786]: I0313 16:49:19.227458 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 16:49:19 crc kubenswrapper[4786]: I0313 16:49:19.909739 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 16:49:19 crc kubenswrapper[4786]: I0313 16:49:19.985744 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 13 16:49:19 crc kubenswrapper[4786]: I0313 16:49:19.993661 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 16:49:24 crc kubenswrapper[4786]: I0313 16:49:24.627257 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-745cccdd94-hp7vs" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 13 16:49:24 crc kubenswrapper[4786]: I0313 16:49:24.753097 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.160:8443: connect: connection refused" Mar 13 16:49:36 crc kubenswrapper[4786]: I0313 16:49:36.506282 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:36 crc kubenswrapper[4786]: I0313 16:49:36.583489 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:38 crc kubenswrapper[4786]: I0313 16:49:38.183499 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:49:38 crc kubenswrapper[4786]: I0313 16:49:38.405185 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:49:38 crc kubenswrapper[4786]: I0313 16:49:38.457067 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-745cccdd94-hp7vs"] Mar 13 16:49:38 crc kubenswrapper[4786]: I0313 16:49:38.457293 4786 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/horizon-745cccdd94-hp7vs" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon-log" containerID="cri-o://8c1fb2bbc08ee24a5a123583a026fdde9f6c51c761de56a3acc15ccc34a8d3a3" gracePeriod=30 Mar 13 16:49:38 crc kubenswrapper[4786]: I0313 16:49:38.457350 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-745cccdd94-hp7vs" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" containerID="cri-o://74b79f22d7aea4a3cd956a3aa73064e79120bf38391c6bdcc073913e20621c65" gracePeriod=30 Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.407606 4786 generic.go:334] "Generic (PLEG): container finished" podID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerID="57754a3f0a06d7aeb70a87b5df72b1e27c37ad8e2ca789dc877afeba763a9fb0" exitCode=137 Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.408177 4786 generic.go:334] "Generic (PLEG): container finished" podID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerID="f83e30501cafb5bc87b2ad745afa98f7f98281e257009c4ce91bd3bb235f481a" exitCode=137 Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.407723 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b76b5d9f-744t6" event={"ID":"60d99106-40c9-42cd-8ecc-3d552b0f5c8e","Type":"ContainerDied","Data":"57754a3f0a06d7aeb70a87b5df72b1e27c37ad8e2ca789dc877afeba763a9fb0"} Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.408261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b76b5d9f-744t6" event={"ID":"60d99106-40c9-42cd-8ecc-3d552b0f5c8e","Type":"ContainerDied","Data":"f83e30501cafb5bc87b2ad745afa98f7f98281e257009c4ce91bd3bb235f481a"} Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.410712 4786 generic.go:334] "Generic (PLEG): container finished" podID="76e917bb-519d-4e0f-ba89-2272b7642137" containerID="74b79f22d7aea4a3cd956a3aa73064e79120bf38391c6bdcc073913e20621c65" exitCode=0 Mar 13 16:49:42 crc 
kubenswrapper[4786]: I0313 16:49:42.410771 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745cccdd94-hp7vs" event={"ID":"76e917bb-519d-4e0f-ba89-2272b7642137","Type":"ContainerDied","Data":"74b79f22d7aea4a3cd956a3aa73064e79120bf38391c6bdcc073913e20621c65"} Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.413726 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerID="8f0fe444a1c45cf8e0812c73c9fc2538f2443cc0198b4ca37e0955cec46fabd8" exitCode=137 Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.413747 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerID="41424f54151ea8ee0bf4319ce995282fdb064cde970f67a82ed5e1cc219bcd94" exitCode=137 Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.413764 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db5d78b6f-28x6k" event={"ID":"2a05c8cc-9cde-4ed9-b64d-5da623919974","Type":"ContainerDied","Data":"8f0fe444a1c45cf8e0812c73c9fc2538f2443cc0198b4ca37e0955cec46fabd8"} Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.413784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db5d78b6f-28x6k" event={"ID":"2a05c8cc-9cde-4ed9-b64d-5da623919974","Type":"ContainerDied","Data":"41424f54151ea8ee0bf4319ce995282fdb064cde970f67a82ed5e1cc219bcd94"} Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.553433 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.611505 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-scripts\") pod \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.611693 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-config-data\") pod \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.611783 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-logs\") pod \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.611816 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-horizon-secret-key\") pod \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.611973 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnl9j\" (UniqueName: \"kubernetes.io/projected/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-kube-api-access-fnl9j\") pod \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\" (UID: \"60d99106-40c9-42cd-8ecc-3d552b0f5c8e\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.617706 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-logs" (OuterVolumeSpecName: "logs") pod "60d99106-40c9-42cd-8ecc-3d552b0f5c8e" (UID: "60d99106-40c9-42cd-8ecc-3d552b0f5c8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.620618 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "60d99106-40c9-42cd-8ecc-3d552b0f5c8e" (UID: "60d99106-40c9-42cd-8ecc-3d552b0f5c8e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.626293 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-kube-api-access-fnl9j" (OuterVolumeSpecName: "kube-api-access-fnl9j") pod "60d99106-40c9-42cd-8ecc-3d552b0f5c8e" (UID: "60d99106-40c9-42cd-8ecc-3d552b0f5c8e"). InnerVolumeSpecName "kube-api-access-fnl9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.649304 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-scripts" (OuterVolumeSpecName: "scripts") pod "60d99106-40c9-42cd-8ecc-3d552b0f5c8e" (UID: "60d99106-40c9-42cd-8ecc-3d552b0f5c8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.651304 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-config-data" (OuterVolumeSpecName: "config-data") pod "60d99106-40c9-42cd-8ecc-3d552b0f5c8e" (UID: "60d99106-40c9-42cd-8ecc-3d552b0f5c8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.714471 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.714680 4786 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.714771 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnl9j\" (UniqueName: \"kubernetes.io/projected/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-kube-api-access-fnl9j\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.714850 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.714962 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60d99106-40c9-42cd-8ecc-3d552b0f5c8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.773681 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.919791 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-config-data\") pod \"2a05c8cc-9cde-4ed9-b64d-5da623919974\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.919869 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28drl\" (UniqueName: \"kubernetes.io/projected/2a05c8cc-9cde-4ed9-b64d-5da623919974-kube-api-access-28drl\") pod \"2a05c8cc-9cde-4ed9-b64d-5da623919974\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.919930 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-scripts\") pod \"2a05c8cc-9cde-4ed9-b64d-5da623919974\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.920185 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a05c8cc-9cde-4ed9-b64d-5da623919974-logs\") pod \"2a05c8cc-9cde-4ed9-b64d-5da623919974\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.920242 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a05c8cc-9cde-4ed9-b64d-5da623919974-horizon-secret-key\") pod \"2a05c8cc-9cde-4ed9-b64d-5da623919974\" (UID: \"2a05c8cc-9cde-4ed9-b64d-5da623919974\") " Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.920584 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2a05c8cc-9cde-4ed9-b64d-5da623919974-logs" (OuterVolumeSpecName: "logs") pod "2a05c8cc-9cde-4ed9-b64d-5da623919974" (UID: "2a05c8cc-9cde-4ed9-b64d-5da623919974"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.923968 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05c8cc-9cde-4ed9-b64d-5da623919974-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2a05c8cc-9cde-4ed9-b64d-5da623919974" (UID: "2a05c8cc-9cde-4ed9-b64d-5da623919974"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.924402 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a05c8cc-9cde-4ed9-b64d-5da623919974-kube-api-access-28drl" (OuterVolumeSpecName: "kube-api-access-28drl") pod "2a05c8cc-9cde-4ed9-b64d-5da623919974" (UID: "2a05c8cc-9cde-4ed9-b64d-5da623919974"). InnerVolumeSpecName "kube-api-access-28drl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.950982 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-scripts" (OuterVolumeSpecName: "scripts") pod "2a05c8cc-9cde-4ed9-b64d-5da623919974" (UID: "2a05c8cc-9cde-4ed9-b64d-5da623919974"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:49:42 crc kubenswrapper[4786]: I0313 16:49:42.952655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-config-data" (OuterVolumeSpecName: "config-data") pod "2a05c8cc-9cde-4ed9-b64d-5da623919974" (UID: "2a05c8cc-9cde-4ed9-b64d-5da623919974"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.022674 4786 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a05c8cc-9cde-4ed9-b64d-5da623919974-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.022721 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.022731 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28drl\" (UniqueName: \"kubernetes.io/projected/2a05c8cc-9cde-4ed9-b64d-5da623919974-kube-api-access-28drl\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.022743 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a05c8cc-9cde-4ed9-b64d-5da623919974-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.022774 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a05c8cc-9cde-4ed9-b64d-5da623919974-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.430337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b76b5d9f-744t6" event={"ID":"60d99106-40c9-42cd-8ecc-3d552b0f5c8e","Type":"ContainerDied","Data":"11cce582e3d68162e76b5ba2930ccdfb7529b9488570c9f474356626499074b8"} Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.430434 4786 scope.go:117] "RemoveContainer" containerID="57754a3f0a06d7aeb70a87b5df72b1e27c37ad8e2ca789dc877afeba763a9fb0" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.430707 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b76b5d9f-744t6" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.434413 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-db5d78b6f-28x6k" event={"ID":"2a05c8cc-9cde-4ed9-b64d-5da623919974","Type":"ContainerDied","Data":"49d75e462d7b40582311f5760554209936420a6d9d6ecfa8694b1d43869a19bd"} Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.434555 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-db5d78b6f-28x6k" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.485796 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-db5d78b6f-28x6k"] Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.498195 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-db5d78b6f-28x6k"] Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.508470 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b76b5d9f-744t6"] Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.517573 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75b76b5d9f-744t6"] Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.723218 4786 scope.go:117] "RemoveContainer" containerID="f83e30501cafb5bc87b2ad745afa98f7f98281e257009c4ce91bd3bb235f481a" Mar 13 16:49:43 crc kubenswrapper[4786]: I0313 16:49:43.750912 4786 scope.go:117] "RemoveContainer" containerID="8f0fe444a1c45cf8e0812c73c9fc2538f2443cc0198b4ca37e0955cec46fabd8" Mar 13 16:49:44 crc kubenswrapper[4786]: I0313 16:49:44.004877 4786 scope.go:117] "RemoveContainer" containerID="41424f54151ea8ee0bf4319ce995282fdb064cde970f67a82ed5e1cc219bcd94" Mar 13 16:49:44 crc kubenswrapper[4786]: I0313 16:49:44.580675 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" path="/var/lib/kubelet/pods/2a05c8cc-9cde-4ed9-b64d-5da623919974/volumes" Mar 13 16:49:44 
crc kubenswrapper[4786]: I0313 16:49:44.582345 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" path="/var/lib/kubelet/pods/60d99106-40c9-42cd-8ecc-3d552b0f5c8e/volumes" Mar 13 16:49:44 crc kubenswrapper[4786]: I0313 16:49:44.626830 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-745cccdd94-hp7vs" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 13 16:49:50 crc kubenswrapper[4786]: I0313 16:49:50.042669 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tdp5t"] Mar 13 16:49:50 crc kubenswrapper[4786]: I0313 16:49:50.064656 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-63da-account-create-update-l2b6z"] Mar 13 16:49:50 crc kubenswrapper[4786]: I0313 16:49:50.073327 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-63da-account-create-update-l2b6z"] Mar 13 16:49:50 crc kubenswrapper[4786]: I0313 16:49:50.080579 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tdp5t"] Mar 13 16:49:50 crc kubenswrapper[4786]: I0313 16:49:50.574010 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c25059-cab8-4a39-9e13-4cf776e9177b" path="/var/lib/kubelet/pods/06c25059-cab8-4a39-9e13-4cf776e9177b/volumes" Mar 13 16:49:50 crc kubenswrapper[4786]: I0313 16:49:50.575767 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f53a8f-18fc-493d-99ad-66de081d79ca" path="/var/lib/kubelet/pods/67f53a8f-18fc-493d-99ad-66de081d79ca/volumes" Mar 13 16:49:54 crc kubenswrapper[4786]: I0313 16:49:54.626947 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-745cccdd94-hp7vs" 
podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.170034 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557010-wtt5x"] Mar 13 16:50:00 crc kubenswrapper[4786]: E0313 16:50:00.171488 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon-log" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.171516 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon-log" Mar 13 16:50:00 crc kubenswrapper[4786]: E0313 16:50:00.171539 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.171552 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon" Mar 13 16:50:00 crc kubenswrapper[4786]: E0313 16:50:00.171582 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon-log" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.171595 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon-log" Mar 13 16:50:00 crc kubenswrapper[4786]: E0313 16:50:00.171612 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.171625 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 
16:50:00.171984 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.172017 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.172057 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05c8cc-9cde-4ed9-b64d-5da623919974" containerName="horizon-log" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.172072 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d99106-40c9-42cd-8ecc-3d552b0f5c8e" containerName="horizon-log" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.175674 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.178693 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.178818 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.178996 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.185593 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557010-wtt5x"] Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.238672 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4frkm\" (UniqueName: \"kubernetes.io/projected/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112-kube-api-access-4frkm\") pod \"auto-csr-approver-29557010-wtt5x\" (UID: 
\"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112\") " pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.340209 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4frkm\" (UniqueName: \"kubernetes.io/projected/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112-kube-api-access-4frkm\") pod \"auto-csr-approver-29557010-wtt5x\" (UID: \"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112\") " pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.360001 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4frkm\" (UniqueName: \"kubernetes.io/projected/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112-kube-api-access-4frkm\") pod \"auto-csr-approver-29557010-wtt5x\" (UID: \"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112\") " pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:00 crc kubenswrapper[4786]: I0313 16:50:00.518172 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:01 crc kubenswrapper[4786]: I0313 16:50:01.067549 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557010-wtt5x"] Mar 13 16:50:01 crc kubenswrapper[4786]: I0313 16:50:01.711820 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" event={"ID":"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112","Type":"ContainerStarted","Data":"49077d77efad19666dfb1da256311b34fe5a12968ddb95d8964da43df3c2318a"} Mar 13 16:50:02 crc kubenswrapper[4786]: I0313 16:50:02.728664 4786 generic.go:334] "Generic (PLEG): container finished" podID="c20fb1ba-8c7f-4ca9-a73f-43f515ee9112" containerID="14fd9b0a6a02aa1636d59b4a03ad8882b3192a8d047388e85a0163a3f1e4a77a" exitCode=0 Mar 13 16:50:02 crc kubenswrapper[4786]: I0313 16:50:02.728774 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" event={"ID":"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112","Type":"ContainerDied","Data":"14fd9b0a6a02aa1636d59b4a03ad8882b3192a8d047388e85a0163a3f1e4a77a"} Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.143657 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.335658 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4frkm\" (UniqueName: \"kubernetes.io/projected/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112-kube-api-access-4frkm\") pod \"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112\" (UID: \"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112\") " Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.344753 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112-kube-api-access-4frkm" (OuterVolumeSpecName: "kube-api-access-4frkm") pod "c20fb1ba-8c7f-4ca9-a73f-43f515ee9112" (UID: "c20fb1ba-8c7f-4ca9-a73f-43f515ee9112"). InnerVolumeSpecName "kube-api-access-4frkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.438615 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4frkm\" (UniqueName: \"kubernetes.io/projected/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112-kube-api-access-4frkm\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.626683 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-745cccdd94-hp7vs" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.159:8443: connect: connection refused" Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.627254 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.756663 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" 
event={"ID":"c20fb1ba-8c7f-4ca9-a73f-43f515ee9112","Type":"ContainerDied","Data":"49077d77efad19666dfb1da256311b34fe5a12968ddb95d8964da43df3c2318a"} Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.756717 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49077d77efad19666dfb1da256311b34fe5a12968ddb95d8964da43df3c2318a" Mar 13 16:50:04 crc kubenswrapper[4786]: I0313 16:50:04.756736 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557010-wtt5x" Mar 13 16:50:05 crc kubenswrapper[4786]: I0313 16:50:05.220579 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557004-7wrf2"] Mar 13 16:50:05 crc kubenswrapper[4786]: I0313 16:50:05.231286 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557004-7wrf2"] Mar 13 16:50:06 crc kubenswrapper[4786]: I0313 16:50:06.058648 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bqxwn"] Mar 13 16:50:06 crc kubenswrapper[4786]: I0313 16:50:06.073850 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bqxwn"] Mar 13 16:50:06 crc kubenswrapper[4786]: I0313 16:50:06.564033 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd47d33-4d46-482b-b900-84f35aba9716" path="/var/lib/kubelet/pods/6dd47d33-4d46-482b-b900-84f35aba9716/volumes" Mar 13 16:50:06 crc kubenswrapper[4786]: I0313 16:50:06.564914 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d" path="/var/lib/kubelet/pods/e2bfb8cd-ba17-4a2a-b9dc-e17912c8ae4d/volumes" Mar 13 16:50:08 crc kubenswrapper[4786]: I0313 16:50:08.822532 4786 generic.go:334] "Generic (PLEG): container finished" podID="76e917bb-519d-4e0f-ba89-2272b7642137" containerID="8c1fb2bbc08ee24a5a123583a026fdde9f6c51c761de56a3acc15ccc34a8d3a3" exitCode=137 Mar 13 
16:50:08 crc kubenswrapper[4786]: I0313 16:50:08.822605 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745cccdd94-hp7vs" event={"ID":"76e917bb-519d-4e0f-ba89-2272b7642137","Type":"ContainerDied","Data":"8c1fb2bbc08ee24a5a123583a026fdde9f6c51c761de56a3acc15ccc34a8d3a3"} Mar 13 16:50:08 crc kubenswrapper[4786]: I0313 16:50:08.995982 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.145513 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8hjp\" (UniqueName: \"kubernetes.io/projected/76e917bb-519d-4e0f-ba89-2272b7642137-kube-api-access-b8hjp\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.145717 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e917bb-519d-4e0f-ba89-2272b7642137-logs\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.145896 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-combined-ca-bundle\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.146034 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-secret-key\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.146164 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-config-data\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.146221 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-tls-certs\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.146295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-scripts\") pod \"76e917bb-519d-4e0f-ba89-2272b7642137\" (UID: \"76e917bb-519d-4e0f-ba89-2272b7642137\") " Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.146547 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e917bb-519d-4e0f-ba89-2272b7642137-logs" (OuterVolumeSpecName: "logs") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.147126 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e917bb-519d-4e0f-ba89-2272b7642137-logs\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.151550 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.152384 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e917bb-519d-4e0f-ba89-2272b7642137-kube-api-access-b8hjp" (OuterVolumeSpecName: "kube-api-access-b8hjp") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). InnerVolumeSpecName "kube-api-access-b8hjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.191551 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-scripts" (OuterVolumeSpecName: "scripts") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.192983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.200551 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-config-data" (OuterVolumeSpecName: "config-data") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.221489 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "76e917bb-519d-4e0f-ba89-2272b7642137" (UID: "76e917bb-519d-4e0f-ba89-2272b7642137"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.249269 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8hjp\" (UniqueName: \"kubernetes.io/projected/76e917bb-519d-4e0f-ba89-2272b7642137-kube-api-access-b8hjp\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.249304 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.249313 4786 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.249322 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.249330 4786 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e917bb-519d-4e0f-ba89-2272b7642137-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.249340 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/76e917bb-519d-4e0f-ba89-2272b7642137-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.842931 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-745cccdd94-hp7vs" event={"ID":"76e917bb-519d-4e0f-ba89-2272b7642137","Type":"ContainerDied","Data":"6be3b47e39471a8f63dba9eb90c0adfb3668e55c31403a1dc7b795a19b403da3"} Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.843947 4786 scope.go:117] "RemoveContainer" containerID="74b79f22d7aea4a3cd956a3aa73064e79120bf38391c6bdcc073913e20621c65" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.843071 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-745cccdd94-hp7vs" Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.900254 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-745cccdd94-hp7vs"] Mar 13 16:50:09 crc kubenswrapper[4786]: I0313 16:50:09.910591 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-745cccdd94-hp7vs"] Mar 13 16:50:10 crc kubenswrapper[4786]: I0313 16:50:10.067382 4786 scope.go:117] "RemoveContainer" containerID="8c1fb2bbc08ee24a5a123583a026fdde9f6c51c761de56a3acc15ccc34a8d3a3" Mar 13 16:50:10 crc kubenswrapper[4786]: I0313 16:50:10.574433 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" path="/var/lib/kubelet/pods/76e917bb-519d-4e0f-ba89-2272b7642137/volumes" Mar 13 16:50:16 crc kubenswrapper[4786]: I0313 16:50:16.455225 4786 scope.go:117] "RemoveContainer" containerID="65715ca4bd8466cd622ab3526dc772011016a62406ce4f7f6492d2ff0262a82a" Mar 13 16:50:16 crc kubenswrapper[4786]: I0313 16:50:16.525219 4786 scope.go:117] "RemoveContainer" containerID="0dc6495f7cca92af457f8bb053b6071890b93a8e43773c4e9177ba4e3ad02528" Mar 13 16:50:16 crc kubenswrapper[4786]: I0313 16:50:16.601189 4786 scope.go:117] "RemoveContainer" 
containerID="6d97179f85c12ce1d7e844a49536e50883e86d0c91f87272482b92aa3ee37afa" Mar 13 16:50:16 crc kubenswrapper[4786]: I0313 16:50:16.663153 4786 scope.go:117] "RemoveContainer" containerID="9f6d6f94fbc1513d679e0254b54d529ececcf48eb2b324723474242458ee2a6b" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.468213 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b979877c4-zsjfz"] Mar 13 16:50:19 crc kubenswrapper[4786]: E0313 16:50:19.470007 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20fb1ba-8c7f-4ca9-a73f-43f515ee9112" containerName="oc" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.470093 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20fb1ba-8c7f-4ca9-a73f-43f515ee9112" containerName="oc" Mar 13 16:50:19 crc kubenswrapper[4786]: E0313 16:50:19.470200 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.470262 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" Mar 13 16:50:19 crc kubenswrapper[4786]: E0313 16:50:19.470324 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon-log" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.470389 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon-log" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.470606 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon-log" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.470678 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e917bb-519d-4e0f-ba89-2272b7642137" containerName="horizon" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 
16:50:19.473949 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20fb1ba-8c7f-4ca9-a73f-43f515ee9112" containerName="oc" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.475201 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.489652 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b979877c4-zsjfz"] Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq8x9\" (UniqueName: \"kubernetes.io/projected/78f5805d-cef6-45e1-bbac-e4edb49b8273-kube-api-access-mq8x9\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583577 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-horizon-secret-key\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-horizon-tls-certs\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f5805d-cef6-45e1-bbac-e4edb49b8273-config-data\") pod 
\"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583693 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f5805d-cef6-45e1-bbac-e4edb49b8273-logs\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-combined-ca-bundle\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.583986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78f5805d-cef6-45e1-bbac-e4edb49b8273-scripts\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.685748 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-horizon-secret-key\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.686125 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-horizon-tls-certs\") pod \"horizon-6b979877c4-zsjfz\" (UID: 
\"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.686178 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f5805d-cef6-45e1-bbac-e4edb49b8273-config-data\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.686207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f5805d-cef6-45e1-bbac-e4edb49b8273-logs\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.686325 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-combined-ca-bundle\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.686348 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78f5805d-cef6-45e1-bbac-e4edb49b8273-scripts\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.686449 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq8x9\" (UniqueName: \"kubernetes.io/projected/78f5805d-cef6-45e1-bbac-e4edb49b8273-kube-api-access-mq8x9\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc 
kubenswrapper[4786]: I0313 16:50:19.687202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f5805d-cef6-45e1-bbac-e4edb49b8273-logs\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.687274 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78f5805d-cef6-45e1-bbac-e4edb49b8273-scripts\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.687739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f5805d-cef6-45e1-bbac-e4edb49b8273-config-data\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.692003 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-horizon-secret-key\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.692231 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-combined-ca-bundle\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.692402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/78f5805d-cef6-45e1-bbac-e4edb49b8273-horizon-tls-certs\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.705360 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq8x9\" (UniqueName: \"kubernetes.io/projected/78f5805d-cef6-45e1-bbac-e4edb49b8273-kube-api-access-mq8x9\") pod \"horizon-6b979877c4-zsjfz\" (UID: \"78f5805d-cef6-45e1-bbac-e4edb49b8273\") " pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:19 crc kubenswrapper[4786]: I0313 16:50:19.795411 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b979877c4-zsjfz" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.265375 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b979877c4-zsjfz"] Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.792604 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-qxr2j"] Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.794546 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.805613 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-qxr2j"] Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.891645 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-c03e-account-create-update-qjtm7"] Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.892779 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.894839 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.909712 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-c03e-account-create-update-qjtm7"] Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.912033 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vjnm\" (UniqueName: \"kubernetes.io/projected/7af1308e-8fe9-42d7-b748-c9bf713499d3-kube-api-access-9vjnm\") pod \"heat-c03e-account-create-update-qjtm7\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") " pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.912083 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-operator-scripts\") pod \"heat-db-create-qxr2j\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") " pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.912114 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7af1308e-8fe9-42d7-b748-c9bf713499d3-operator-scripts\") pod \"heat-c03e-account-create-update-qjtm7\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") " pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:20 crc kubenswrapper[4786]: I0313 16:50:20.912261 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgpkn\" (UniqueName: \"kubernetes.io/projected/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-kube-api-access-fgpkn\") pod \"heat-db-create-qxr2j\" 
(UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") " pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:20.998316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b979877c4-zsjfz" event={"ID":"78f5805d-cef6-45e1-bbac-e4edb49b8273","Type":"ContainerStarted","Data":"09b408f1468fe4951b79c40b10711780f10b7a0afef53c14d6005ba4294fff40"} Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:20.998360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b979877c4-zsjfz" event={"ID":"78f5805d-cef6-45e1-bbac-e4edb49b8273","Type":"ContainerStarted","Data":"3dc630715b34716815d0bf5f194e6fa17443b50fea7bd0fa97233dc3a78e0c95"} Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:20.998370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b979877c4-zsjfz" event={"ID":"78f5805d-cef6-45e1-bbac-e4edb49b8273","Type":"ContainerStarted","Data":"7b52eca4c643c9293c3910b4e3e85dc9697163e1166fcbddea47b599751e4463"} Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.013793 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-operator-scripts\") pod \"heat-db-create-qxr2j\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") " pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.014134 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7af1308e-8fe9-42d7-b748-c9bf713499d3-operator-scripts\") pod \"heat-c03e-account-create-update-qjtm7\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") " pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.014252 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgpkn\" (UniqueName: 
\"kubernetes.io/projected/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-kube-api-access-fgpkn\") pod \"heat-db-create-qxr2j\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") " pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.014301 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vjnm\" (UniqueName: \"kubernetes.io/projected/7af1308e-8fe9-42d7-b748-c9bf713499d3-kube-api-access-9vjnm\") pod \"heat-c03e-account-create-update-qjtm7\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") " pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.015157 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7af1308e-8fe9-42d7-b748-c9bf713499d3-operator-scripts\") pod \"heat-c03e-account-create-update-qjtm7\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") " pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.015429 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-operator-scripts\") pod \"heat-db-create-qxr2j\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") " pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.022324 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6b979877c4-zsjfz" podStartSLOduration=2.022307882 podStartE2EDuration="2.022307882s" podCreationTimestamp="2026-03-13 16:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:21.0190578 +0000 UTC m=+6451.182269631" watchObservedRunningTime="2026-03-13 16:50:21.022307882 +0000 UTC m=+6451.185519693" Mar 13 16:50:21 crc 
kubenswrapper[4786]: I0313 16:50:21.029638 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vjnm\" (UniqueName: \"kubernetes.io/projected/7af1308e-8fe9-42d7-b748-c9bf713499d3-kube-api-access-9vjnm\") pod \"heat-c03e-account-create-update-qjtm7\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") " pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.034812 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgpkn\" (UniqueName: \"kubernetes.io/projected/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-kube-api-access-fgpkn\") pod \"heat-db-create-qxr2j\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") " pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.117032 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-qxr2j" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.213732 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-c03e-account-create-update-qjtm7" Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.606698 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-qxr2j"] Mar 13 16:50:21 crc kubenswrapper[4786]: W0313 16:50:21.614291 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fbf1fd0_6bcf_47bf_9567_f3abe55a4c37.slice/crio-9b0c6637cb29661094d3207b208a99abad63c7fac40ada04d987bb2c527e7058 WatchSource:0}: Error finding container 9b0c6637cb29661094d3207b208a99abad63c7fac40ada04d987bb2c527e7058: Status 404 returned error can't find the container with id 9b0c6637cb29661094d3207b208a99abad63c7fac40ada04d987bb2c527e7058 Mar 13 16:50:21 crc kubenswrapper[4786]: I0313 16:50:21.762341 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-c03e-account-create-update-qjtm7"] Mar 13 16:50:22 crc kubenswrapper[4786]: I0313 16:50:22.008374 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-qxr2j" event={"ID":"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37","Type":"ContainerStarted","Data":"b5315a5c7ed9c91e293fe0a82d1fb1abb2b6be4f2eec20e61782fb82880c11fe"} Mar 13 16:50:22 crc kubenswrapper[4786]: I0313 16:50:22.008624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-qxr2j" event={"ID":"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37","Type":"ContainerStarted","Data":"9b0c6637cb29661094d3207b208a99abad63c7fac40ada04d987bb2c527e7058"} Mar 13 16:50:22 crc kubenswrapper[4786]: I0313 16:50:22.010993 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-c03e-account-create-update-qjtm7" event={"ID":"7af1308e-8fe9-42d7-b748-c9bf713499d3","Type":"ContainerStarted","Data":"9f80eb357d84177c0768363e0503ccf4857f9188d1bc18b525c4429d4dd4fb90"} Mar 13 16:50:22 crc kubenswrapper[4786]: I0313 16:50:22.011070 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-c03e-account-create-update-qjtm7" event={"ID":"7af1308e-8fe9-42d7-b748-c9bf713499d3","Type":"ContainerStarted","Data":"6458a387ede7e0ba2fb4392b7ef681cbeb151f080d68874351a36b5bd2fcc567"}
Mar 13 16:50:22 crc kubenswrapper[4786]: I0313 16:50:22.029379 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-qxr2j" podStartSLOduration=2.029352413 podStartE2EDuration="2.029352413s" podCreationTimestamp="2026-03-13 16:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:22.019839834 +0000 UTC m=+6452.183051655" watchObservedRunningTime="2026-03-13 16:50:22.029352413 +0000 UTC m=+6452.192564254"
Mar 13 16:50:22 crc kubenswrapper[4786]: I0313 16:50:22.049194 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-c03e-account-create-update-qjtm7" podStartSLOduration=2.049166341 podStartE2EDuration="2.049166341s" podCreationTimestamp="2026-03-13 16:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:22.038518493 +0000 UTC m=+6452.201730344" watchObservedRunningTime="2026-03-13 16:50:22.049166341 +0000 UTC m=+6452.212378162"
Mar 13 16:50:23 crc kubenswrapper[4786]: I0313 16:50:23.032874 4786 generic.go:334] "Generic (PLEG): container finished" podID="7af1308e-8fe9-42d7-b748-c9bf713499d3" containerID="9f80eb357d84177c0768363e0503ccf4857f9188d1bc18b525c4429d4dd4fb90" exitCode=0
Mar 13 16:50:23 crc kubenswrapper[4786]: I0313 16:50:23.032979 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-c03e-account-create-update-qjtm7" event={"ID":"7af1308e-8fe9-42d7-b748-c9bf713499d3","Type":"ContainerDied","Data":"9f80eb357d84177c0768363e0503ccf4857f9188d1bc18b525c4429d4dd4fb90"}
Mar 13 16:50:23 crc kubenswrapper[4786]: I0313 16:50:23.039719 4786 generic.go:334] "Generic (PLEG): container finished" podID="3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" containerID="b5315a5c7ed9c91e293fe0a82d1fb1abb2b6be4f2eec20e61782fb82880c11fe" exitCode=0
Mar 13 16:50:23 crc kubenswrapper[4786]: I0313 16:50:23.039785 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-qxr2j" event={"ID":"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37","Type":"ContainerDied","Data":"b5315a5c7ed9c91e293fe0a82d1fb1abb2b6be4f2eec20e61782fb82880c11fe"}
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.626757 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-c03e-account-create-update-qjtm7"
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.635844 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-qxr2j"
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.690093 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgpkn\" (UniqueName: \"kubernetes.io/projected/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-kube-api-access-fgpkn\") pod \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") "
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.690152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7af1308e-8fe9-42d7-b748-c9bf713499d3-operator-scripts\") pod \"7af1308e-8fe9-42d7-b748-c9bf713499d3\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") "
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.690212 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vjnm\" (UniqueName: \"kubernetes.io/projected/7af1308e-8fe9-42d7-b748-c9bf713499d3-kube-api-access-9vjnm\") pod \"7af1308e-8fe9-42d7-b748-c9bf713499d3\" (UID: \"7af1308e-8fe9-42d7-b748-c9bf713499d3\") "
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.690412 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-operator-scripts\") pod \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\" (UID: \"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37\") "
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.691643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" (UID: "3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.691798 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af1308e-8fe9-42d7-b748-c9bf713499d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7af1308e-8fe9-42d7-b748-c9bf713499d3" (UID: "7af1308e-8fe9-42d7-b748-c9bf713499d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.696264 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af1308e-8fe9-42d7-b748-c9bf713499d3-kube-api-access-9vjnm" (OuterVolumeSpecName: "kube-api-access-9vjnm") pod "7af1308e-8fe9-42d7-b748-c9bf713499d3" (UID: "7af1308e-8fe9-42d7-b748-c9bf713499d3"). InnerVolumeSpecName "kube-api-access-9vjnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.696917 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-kube-api-access-fgpkn" (OuterVolumeSpecName: "kube-api-access-fgpkn") pod "3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" (UID: "3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37"). InnerVolumeSpecName "kube-api-access-fgpkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.792922 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgpkn\" (UniqueName: \"kubernetes.io/projected/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-kube-api-access-fgpkn\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.792952 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7af1308e-8fe9-42d7-b748-c9bf713499d3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.792963 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vjnm\" (UniqueName: \"kubernetes.io/projected/7af1308e-8fe9-42d7-b748-c9bf713499d3-kube-api-access-9vjnm\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:24 crc kubenswrapper[4786]: I0313 16:50:24.792971 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:25 crc kubenswrapper[4786]: I0313 16:50:25.065536 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-c03e-account-create-update-qjtm7" event={"ID":"7af1308e-8fe9-42d7-b748-c9bf713499d3","Type":"ContainerDied","Data":"6458a387ede7e0ba2fb4392b7ef681cbeb151f080d68874351a36b5bd2fcc567"}
Mar 13 16:50:25 crc kubenswrapper[4786]: I0313 16:50:25.065584 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-c03e-account-create-update-qjtm7"
Mar 13 16:50:25 crc kubenswrapper[4786]: I0313 16:50:25.065589 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6458a387ede7e0ba2fb4392b7ef681cbeb151f080d68874351a36b5bd2fcc567"
Mar 13 16:50:25 crc kubenswrapper[4786]: I0313 16:50:25.068756 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-qxr2j" event={"ID":"3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37","Type":"ContainerDied","Data":"9b0c6637cb29661094d3207b208a99abad63c7fac40ada04d987bb2c527e7058"}
Mar 13 16:50:25 crc kubenswrapper[4786]: I0313 16:50:25.068783 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b0c6637cb29661094d3207b208a99abad63c7fac40ada04d987bb2c527e7058"
Mar 13 16:50:25 crc kubenswrapper[4786]: I0313 16:50:25.068849 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-qxr2j"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.013303 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-b952z"]
Mar 13 16:50:26 crc kubenswrapper[4786]: E0313 16:50:26.014265 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" containerName="mariadb-database-create"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.014284 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" containerName="mariadb-database-create"
Mar 13 16:50:26 crc kubenswrapper[4786]: E0313 16:50:26.014310 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af1308e-8fe9-42d7-b748-c9bf713499d3" containerName="mariadb-account-create-update"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.014318 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af1308e-8fe9-42d7-b748-c9bf713499d3" containerName="mariadb-account-create-update"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.014556 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af1308e-8fe9-42d7-b748-c9bf713499d3" containerName="mariadb-account-create-update"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.014581 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" containerName="mariadb-database-create"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.015372 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.018418 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-kbxzm"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.018691 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.021764 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-combined-ca-bundle\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.021921 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-config-data\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.022235 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqbqr\" (UniqueName: \"kubernetes.io/projected/0a523430-946d-4f79-b557-d784192b2e95-kube-api-access-tqbqr\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.033480 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-b952z"]
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.124118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-combined-ca-bundle\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.124167 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-config-data\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.124289 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqbqr\" (UniqueName: \"kubernetes.io/projected/0a523430-946d-4f79-b557-d784192b2e95-kube-api-access-tqbqr\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.129927 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-config-data\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.137900 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-combined-ca-bundle\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.146909 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqbqr\" (UniqueName: \"kubernetes.io/projected/0a523430-946d-4f79-b557-d784192b2e95-kube-api-access-tqbqr\") pod \"heat-db-sync-b952z\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") " pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.346125 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:26 crc kubenswrapper[4786]: I0313 16:50:26.927010 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-b952z"]
Mar 13 16:50:27 crc kubenswrapper[4786]: I0313 16:50:27.087935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-b952z" event={"ID":"0a523430-946d-4f79-b557-d784192b2e95","Type":"ContainerStarted","Data":"63ab25ba25360edea773a3d48b0cf85efc427ec04fc249799a60ff76ea9e8d73"}
Mar 13 16:50:29 crc kubenswrapper[4786]: I0313 16:50:29.795814 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6b979877c4-zsjfz"
Mar 13 16:50:29 crc kubenswrapper[4786]: I0313 16:50:29.796345 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6b979877c4-zsjfz"
Mar 13 16:50:37 crc kubenswrapper[4786]: I0313 16:50:37.195234 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-b952z" event={"ID":"0a523430-946d-4f79-b557-d784192b2e95","Type":"ContainerStarted","Data":"de0519d3ae4697eff464b53e6e5539a2e5efb2a11285388fa46cca8212c2fb93"}
Mar 13 16:50:37 crc kubenswrapper[4786]: I0313 16:50:37.211034 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-b952z" podStartSLOduration=2.3796481529999998 podStartE2EDuration="12.211009062s" podCreationTimestamp="2026-03-13 16:50:25 +0000 UTC" firstStartedPulling="2026-03-13 16:50:26.927883423 +0000 UTC m=+6457.091095234" lastFinishedPulling="2026-03-13 16:50:36.759244322 +0000 UTC m=+6466.922456143" observedRunningTime="2026-03-13 16:50:37.208361946 +0000 UTC m=+6467.371573757" watchObservedRunningTime="2026-03-13 16:50:37.211009062 +0000 UTC m=+6467.374220873"
Mar 13 16:50:39 crc kubenswrapper[4786]: I0313 16:50:39.220996 4786 generic.go:334] "Generic (PLEG): container finished" podID="0a523430-946d-4f79-b557-d784192b2e95" containerID="de0519d3ae4697eff464b53e6e5539a2e5efb2a11285388fa46cca8212c2fb93" exitCode=0
Mar 13 16:50:39 crc kubenswrapper[4786]: I0313 16:50:39.221191 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-b952z" event={"ID":"0a523430-946d-4f79-b557-d784192b2e95","Type":"ContainerDied","Data":"de0519d3ae4697eff464b53e6e5539a2e5efb2a11285388fa46cca8212c2fb93"}
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.689317 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.795969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-config-data\") pod \"0a523430-946d-4f79-b557-d784192b2e95\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") "
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.796102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqbqr\" (UniqueName: \"kubernetes.io/projected/0a523430-946d-4f79-b557-d784192b2e95-kube-api-access-tqbqr\") pod \"0a523430-946d-4f79-b557-d784192b2e95\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") "
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.796265 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-combined-ca-bundle\") pod \"0a523430-946d-4f79-b557-d784192b2e95\" (UID: \"0a523430-946d-4f79-b557-d784192b2e95\") "
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.812286 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a523430-946d-4f79-b557-d784192b2e95-kube-api-access-tqbqr" (OuterVolumeSpecName: "kube-api-access-tqbqr") pod "0a523430-946d-4f79-b557-d784192b2e95" (UID: "0a523430-946d-4f79-b557-d784192b2e95"). InnerVolumeSpecName "kube-api-access-tqbqr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.846818 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a523430-946d-4f79-b557-d784192b2e95" (UID: "0a523430-946d-4f79-b557-d784192b2e95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.899579 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqbqr\" (UniqueName: \"kubernetes.io/projected/0a523430-946d-4f79-b557-d784192b2e95-kube-api-access-tqbqr\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.899608 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:40 crc kubenswrapper[4786]: I0313 16:50:40.907313 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-config-data" (OuterVolumeSpecName: "config-data") pod "0a523430-946d-4f79-b557-d784192b2e95" (UID: "0a523430-946d-4f79-b557-d784192b2e95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:50:41 crc kubenswrapper[4786]: I0313 16:50:41.001309 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a523430-946d-4f79-b557-d784192b2e95-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:41 crc kubenswrapper[4786]: I0313 16:50:41.243468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-b952z" event={"ID":"0a523430-946d-4f79-b557-d784192b2e95","Type":"ContainerDied","Data":"63ab25ba25360edea773a3d48b0cf85efc427ec04fc249799a60ff76ea9e8d73"}
Mar 13 16:50:41 crc kubenswrapper[4786]: I0313 16:50:41.243547 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ab25ba25360edea773a3d48b0cf85efc427ec04fc249799a60ff76ea9e8d73"
Mar 13 16:50:41 crc kubenswrapper[4786]: I0313 16:50:41.243693 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-b952z"
Mar 13 16:50:41 crc kubenswrapper[4786]: I0313 16:50:41.797314 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6b979877c4-zsjfz"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.612124 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-664c6d546f-hgh75"]
Mar 13 16:50:42 crc kubenswrapper[4786]: E0313 16:50:42.613023 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a523430-946d-4f79-b557-d784192b2e95" containerName="heat-db-sync"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.613047 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a523430-946d-4f79-b557-d784192b2e95" containerName="heat-db-sync"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.613296 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a523430-946d-4f79-b557-d784192b2e95" containerName="heat-db-sync"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.614321 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.617365 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.617577 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-kbxzm"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.617657 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.629199 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-664c6d546f-hgh75"]
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.643304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data-custom\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.643443 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmd8\" (UniqueName: \"kubernetes.io/projected/60d28482-12bb-4f4b-870e-92425c5aedc4-kube-api-access-6rmd8\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.643471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.643495 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-combined-ca-bundle\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.746807 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rmd8\" (UniqueName: \"kubernetes.io/projected/60d28482-12bb-4f4b-870e-92425c5aedc4-kube-api-access-6rmd8\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.746866 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.746888 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-combined-ca-bundle\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.747021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data-custom\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.774897 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.775729 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data-custom\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.780493 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-combined-ca-bundle\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.813144 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6c75c58b4b-jk68l"]
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.814447 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.818946 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmd8\" (UniqueName: \"kubernetes.io/projected/60d28482-12bb-4f4b-870e-92425c5aedc4-kube-api-access-6rmd8\") pod \"heat-engine-664c6d546f-hgh75\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") " pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.821187 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.834926 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c75c58b4b-jk68l"]
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.848938 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.848985 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-combined-ca-bundle\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.849080 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data-custom\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.849109 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhkb\" (UniqueName: \"kubernetes.io/projected/b8f3c97c-cc24-4506-97a3-d04bd1864f24-kube-api-access-8lhkb\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.853617 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-56675d887f-rgjvs"]
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.854940 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.856685 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.879835 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56675d887f-rgjvs"]
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.943083 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951194 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951427 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data-custom\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951465 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-combined-ca-bundle\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951502 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-combined-ca-bundle\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951522 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqsv\" (UniqueName: \"kubernetes.io/projected/faacc636-1527-4dae-bacb-e634a06765c3-kube-api-access-fjqsv\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951597 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data-custom\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.951628 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhkb\" (UniqueName: \"kubernetes.io/projected/b8f3c97c-cc24-4506-97a3-d04bd1864f24-kube-api-access-8lhkb\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.962722 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-combined-ca-bundle\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.963341 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.966552 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data-custom\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:42 crc kubenswrapper[4786]: I0313 16:50:42.970978 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhkb\" (UniqueName: \"kubernetes.io/projected/b8f3c97c-cc24-4506-97a3-d04bd1864f24-kube-api-access-8lhkb\") pod \"heat-cfnapi-6c75c58b4b-jk68l\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") " pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.053972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.054029 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data-custom\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.054059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-combined-ca-bundle\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.054085 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqsv\" (UniqueName: \"kubernetes.io/projected/faacc636-1527-4dae-bacb-e634a06765c3-kube-api-access-fjqsv\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.064626 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-combined-ca-bundle\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.068491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data-custom\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.073219 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.074922 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqsv\" (UniqueName: \"kubernetes.io/projected/faacc636-1527-4dae-bacb-e634a06765c3-kube-api-access-fjqsv\") pod \"heat-api-56675d887f-rgjvs\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") " pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.207353 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.214880 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.401264 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-664c6d546f-hgh75"]
Mar 13 16:50:43 crc kubenswrapper[4786]: W0313 16:50:43.401976 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60d28482_12bb_4f4b_870e_92425c5aedc4.slice/crio-1026d66db5d92d4a7695196735ae11da8da3207cd0b87d90f590f1e8c2a31c88 WatchSource:0}: Error finding container 1026d66db5d92d4a7695196735ae11da8da3207cd0b87d90f590f1e8c2a31c88: Status 404 returned error can't find the container with id 1026d66db5d92d4a7695196735ae11da8da3207cd0b87d90f590f1e8c2a31c88
Mar 13 16:50:43 crc kubenswrapper[4786]: W0313 16:50:43.708509 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaacc636_1527_4dae_bacb_e634a06765c3.slice/crio-83effa430ec0647f08b9ac16dfa9d60b7826e94e5c64a7e15ab5a6a0d6c77d8e WatchSource:0}: Error finding container 83effa430ec0647f08b9ac16dfa9d60b7826e94e5c64a7e15ab5a6a0d6c77d8e: Status 404 returned error can't find the container with id 83effa430ec0647f08b9ac16dfa9d60b7826e94e5c64a7e15ab5a6a0d6c77d8e
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.717550 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56675d887f-rgjvs"]
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.783629 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6b979877c4-zsjfz"
Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.790016 4786 kubelet.go:2428]
"SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c75c58b4b-jk68l"] Mar 13 16:50:43 crc kubenswrapper[4786]: W0313 16:50:43.791364 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f3c97c_cc24_4506_97a3_d04bd1864f24.slice/crio-07cb7d2bc8c3acb15a8727d14bc93850cd1960b00783101b217ecf08e8e1b201 WatchSource:0}: Error finding container 07cb7d2bc8c3acb15a8727d14bc93850cd1960b00783101b217ecf08e8e1b201: Status 404 returned error can't find the container with id 07cb7d2bc8c3acb15a8727d14bc93850cd1960b00783101b217ecf08e8e1b201 Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.850948 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6667bbdf64-hcxzc"] Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.852043 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon-log" containerID="cri-o://028ac99db632f4107ca629da7a995de2180c047ada30e587a00d63b7fabc1ae2" gracePeriod=30 Mar 13 16:50:43 crc kubenswrapper[4786]: I0313 16:50:43.852626 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon" containerID="cri-o://586157e2794d58c3593b5f8272f7d0fc627737484ca31706ba7d96225f6404e0" gracePeriod=30 Mar 13 16:50:44 crc kubenswrapper[4786]: I0313 16:50:44.278935 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-664c6d546f-hgh75" event={"ID":"60d28482-12bb-4f4b-870e-92425c5aedc4","Type":"ContainerStarted","Data":"441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36"} Mar 13 16:50:44 crc kubenswrapper[4786]: I0313 16:50:44.279401 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-664c6d546f-hgh75" 
event={"ID":"60d28482-12bb-4f4b-870e-92425c5aedc4","Type":"ContainerStarted","Data":"1026d66db5d92d4a7695196735ae11da8da3207cd0b87d90f590f1e8c2a31c88"} Mar 13 16:50:44 crc kubenswrapper[4786]: I0313 16:50:44.279448 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-664c6d546f-hgh75" Mar 13 16:50:44 crc kubenswrapper[4786]: I0313 16:50:44.280119 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56675d887f-rgjvs" event={"ID":"faacc636-1527-4dae-bacb-e634a06765c3","Type":"ContainerStarted","Data":"83effa430ec0647f08b9ac16dfa9d60b7826e94e5c64a7e15ab5a6a0d6c77d8e"} Mar 13 16:50:44 crc kubenswrapper[4786]: I0313 16:50:44.281686 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" event={"ID":"b8f3c97c-cc24-4506-97a3-d04bd1864f24","Type":"ContainerStarted","Data":"07cb7d2bc8c3acb15a8727d14bc93850cd1960b00783101b217ecf08e8e1b201"} Mar 13 16:50:44 crc kubenswrapper[4786]: I0313 16:50:44.300365 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-664c6d546f-hgh75" podStartSLOduration=2.300349983 podStartE2EDuration="2.300349983s" podCreationTimestamp="2026-03-13 16:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:44.296198489 +0000 UTC m=+6474.459410300" watchObservedRunningTime="2026-03-13 16:50:44.300349983 +0000 UTC m=+6474.463561794" Mar 13 16:50:46 crc kubenswrapper[4786]: I0313 16:50:46.316333 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" event={"ID":"b8f3c97c-cc24-4506-97a3-d04bd1864f24","Type":"ContainerStarted","Data":"b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28"} Mar 13 16:50:46 crc kubenswrapper[4786]: I0313 16:50:46.316636 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" Mar 13 16:50:46 crc kubenswrapper[4786]: I0313 16:50:46.317985 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56675d887f-rgjvs" event={"ID":"faacc636-1527-4dae-bacb-e634a06765c3","Type":"ContainerStarted","Data":"5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224"} Mar 13 16:50:46 crc kubenswrapper[4786]: I0313 16:50:46.318425 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-56675d887f-rgjvs" Mar 13 16:50:46 crc kubenswrapper[4786]: I0313 16:50:46.381949 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-56675d887f-rgjvs" podStartSLOduration=2.390696632 podStartE2EDuration="4.381929999s" podCreationTimestamp="2026-03-13 16:50:42 +0000 UTC" firstStartedPulling="2026-03-13 16:50:43.713410847 +0000 UTC m=+6473.876622658" lastFinishedPulling="2026-03-13 16:50:45.704644214 +0000 UTC m=+6475.867856025" observedRunningTime="2026-03-13 16:50:46.37517153 +0000 UTC m=+6476.538383341" watchObservedRunningTime="2026-03-13 16:50:46.381929999 +0000 UTC m=+6476.545141810" Mar 13 16:50:46 crc kubenswrapper[4786]: I0313 16:50:46.382286 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" podStartSLOduration=2.486113558 podStartE2EDuration="4.382281368s" podCreationTimestamp="2026-03-13 16:50:42 +0000 UTC" firstStartedPulling="2026-03-13 16:50:43.808961127 +0000 UTC m=+6473.972172938" lastFinishedPulling="2026-03-13 16:50:45.705128937 +0000 UTC m=+6475.868340748" observedRunningTime="2026-03-13 16:50:46.346565581 +0000 UTC m=+6476.509777392" watchObservedRunningTime="2026-03-13 16:50:46.382281368 +0000 UTC m=+6476.545493179" Mar 13 16:50:47 crc kubenswrapper[4786]: I0313 16:50:47.023342 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.160:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:35986->10.217.1.160:8443: read: connection reset by peer" Mar 13 16:50:47 crc kubenswrapper[4786]: I0313 16:50:47.332978 4786 generic.go:334] "Generic (PLEG): container finished" podID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerID="586157e2794d58c3593b5f8272f7d0fc627737484ca31706ba7d96225f6404e0" exitCode=0 Mar 13 16:50:47 crc kubenswrapper[4786]: I0313 16:50:47.333076 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667bbdf64-hcxzc" event={"ID":"a2213440-3009-4db0-a1c4-7d5d4a12481b","Type":"ContainerDied","Data":"586157e2794d58c3593b5f8272f7d0fc627737484ca31706ba7d96225f6404e0"} Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.780486 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6f955857ff-qd498"] Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.782083 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.801991 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6f955857ff-qd498"] Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.812280 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqj2\" (UniqueName: \"kubernetes.io/projected/1d5cf355-7214-417e-85dc-cc9a451d1649-kube-api-access-5qqj2\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.812338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-config-data-custom\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.812757 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-combined-ca-bundle\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.812801 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-config-data\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.868817 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-api-58974d5667-gmm9k"] Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.870108 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.885117 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7868997dbd-vwz94"] Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.886401 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.900754 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58974d5667-gmm9k"] Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914271 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cnwm\" (UniqueName: \"kubernetes.io/projected/f22627eb-d27d-46de-bf5e-d473f566d51a-kube-api-access-5cnwm\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914331 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data-custom\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914410 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-combined-ca-bundle\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:49 crc 
kubenswrapper[4786]: I0313 16:50:49.914456 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-config-data\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914501 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-combined-ca-bundle\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914544 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data-custom\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914594 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 
16:50:49.914624 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qqj2\" (UniqueName: \"kubernetes.io/projected/1d5cf355-7214-417e-85dc-cc9a451d1649-kube-api-access-5qqj2\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914643 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-config-data-custom\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914674 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-combined-ca-bundle\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.914711 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29twd\" (UniqueName: \"kubernetes.io/projected/701dcccb-b7f5-4545-9442-a417185439f5-kube-api-access-29twd\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.915220 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7868997dbd-vwz94"] Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.919802 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-config-data-custom\") 
pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.921696 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-combined-ca-bundle\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.922417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d5cf355-7214-417e-85dc-cc9a451d1649-config-data\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:49 crc kubenswrapper[4786]: I0313 16:50:49.928922 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqj2\" (UniqueName: \"kubernetes.io/projected/1d5cf355-7214-417e-85dc-cc9a451d1649-kube-api-access-5qqj2\") pod \"heat-engine-6f955857ff-qd498\" (UID: \"1d5cf355-7214-417e-85dc-cc9a451d1649\") " pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.016957 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29twd\" (UniqueName: \"kubernetes.io/projected/701dcccb-b7f5-4545-9442-a417185439f5-kube-api-access-29twd\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017356 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cnwm\" (UniqueName: \"kubernetes.io/projected/f22627eb-d27d-46de-bf5e-d473f566d51a-kube-api-access-5cnwm\") pod \"heat-api-58974d5667-gmm9k\" (UID: 
\"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017391 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data-custom\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017430 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-combined-ca-bundle\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017479 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017538 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data-custom\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017585 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " 
pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.017647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-combined-ca-bundle\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.021872 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data-custom\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.022156 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.025978 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-combined-ca-bundle\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.030248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-combined-ca-bundle\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 
16:50:50.032645 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.032734 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29twd\" (UniqueName: \"kubernetes.io/projected/701dcccb-b7f5-4545-9442-a417185439f5-kube-api-access-29twd\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.033519 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data-custom\") pod \"heat-cfnapi-7868997dbd-vwz94\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.059610 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cnwm\" (UniqueName: \"kubernetes.io/projected/f22627eb-d27d-46de-bf5e-d473f566d51a-kube-api-access-5cnwm\") pod \"heat-api-58974d5667-gmm9k\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.108610 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.188734 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.217078 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.578099 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6f955857ff-qd498"] Mar 13 16:50:50 crc kubenswrapper[4786]: W0313 16:50:50.710471 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22627eb_d27d_46de_bf5e_d473f566d51a.slice/crio-89877328954f909e5cecff16e205bdac67e4156205bf5e62048f9a56e96041f0 WatchSource:0}: Error finding container 89877328954f909e5cecff16e205bdac67e4156205bf5e62048f9a56e96041f0: Status 404 returned error can't find the container with id 89877328954f909e5cecff16e205bdac67e4156205bf5e62048f9a56e96041f0 Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.710497 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58974d5667-gmm9k"] Mar 13 16:50:50 crc kubenswrapper[4786]: W0313 16:50:50.770200 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701dcccb_b7f5_4545_9442_a417185439f5.slice/crio-b8e6086928f230cd142486361b3f37f84960fb104579edbffb00bd1d0382982e WatchSource:0}: Error finding container b8e6086928f230cd142486361b3f37f84960fb104579edbffb00bd1d0382982e: Status 404 returned error can't find the container with id b8e6086928f230cd142486361b3f37f84960fb104579edbffb00bd1d0382982e Mar 13 16:50:50 crc kubenswrapper[4786]: I0313 16:50:50.773850 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7868997dbd-vwz94"] Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.096566 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56675d887f-rgjvs"] Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.096827 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-56675d887f-rgjvs" 
podUID="faacc636-1527-4dae-bacb-e634a06765c3" containerName="heat-api" containerID="cri-o://5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224" gracePeriod=60
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.122040 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5b44c95f89-pdjc6"]
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.125109 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.130344 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.130663 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.131218 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-56675d887f-rgjvs" podUID="faacc636-1527-4dae-bacb-e634a06765c3" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.170:8004/healthcheck\": EOF"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.141275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-combined-ca-bundle\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.141532 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74p6d\" (UniqueName: \"kubernetes.io/projected/c8a16c52-18c2-4d08-a87e-c32303cd89e9-kube-api-access-74p6d\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.141604 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-internal-tls-certs\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.141727 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-config-data-custom\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.141748 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-config-data\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.141768 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-public-tls-certs\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.142073 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c75c58b4b-jk68l"]
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.142461 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerName="heat-cfnapi" containerID="cri-o://b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28" gracePeriod=60
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.183345 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b44c95f89-pdjc6"]
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.188773 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.169:8000/healthcheck\": EOF"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.235691 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-fd85f647c-9ndph"]
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.242392 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.249245 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.249465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.257166 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-config-data-custom\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.257214 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-config-data\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.257240 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-public-tls-certs\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.257362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-combined-ca-bundle\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.257397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74p6d\" (UniqueName: \"kubernetes.io/projected/c8a16c52-18c2-4d08-a87e-c32303cd89e9-kube-api-access-74p6d\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.257491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-internal-tls-certs\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.258052 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fd85f647c-9ndph"]
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.266061 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-combined-ca-bundle\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.266604 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-internal-tls-certs\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.267073 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-public-tls-certs\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.272221 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-config-data-custom\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.272403 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a16c52-18c2-4d08-a87e-c32303cd89e9-config-data\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.281569 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74p6d\" (UniqueName: \"kubernetes.io/projected/c8a16c52-18c2-4d08-a87e-c32303cd89e9-kube-api-access-74p6d\") pod \"heat-api-5b44c95f89-pdjc6\" (UID: \"c8a16c52-18c2-4d08-a87e-c32303cd89e9\") " pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.358941 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-config-data-custom\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.359033 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-public-tls-certs\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.359113 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-internal-tls-certs\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.359138 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn782\" (UniqueName: \"kubernetes.io/projected/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-kube-api-access-zn782\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.359171 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-config-data\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.359208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-combined-ca-bundle\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.378638 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f955857ff-qd498" event={"ID":"1d5cf355-7214-417e-85dc-cc9a451d1649","Type":"ContainerStarted","Data":"d723011160e51fd6754f6a1e0c6d708e6219c680829160be8f032532534a3f60"}
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.378682 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6f955857ff-qd498" event={"ID":"1d5cf355-7214-417e-85dc-cc9a451d1649","Type":"ContainerStarted","Data":"7108ddc0aca4bc61e62a40920abaca588fdc1b040486a0825bf65deb97e193dc"}
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.379780 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6f955857ff-qd498"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.381077 4786 generic.go:334] "Generic (PLEG): container finished" podID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerID="21241c87f883d82549e66187bb197cb300449f3ee2c06f315a96552bc6173fc5" exitCode=1
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.381116 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58974d5667-gmm9k" event={"ID":"f22627eb-d27d-46de-bf5e-d473f566d51a","Type":"ContainerDied","Data":"21241c87f883d82549e66187bb197cb300449f3ee2c06f315a96552bc6173fc5"}
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.381132 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58974d5667-gmm9k" event={"ID":"f22627eb-d27d-46de-bf5e-d473f566d51a","Type":"ContainerStarted","Data":"89877328954f909e5cecff16e205bdac67e4156205bf5e62048f9a56e96041f0"}
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.381374 4786 scope.go:117] "RemoveContainer" containerID="21241c87f883d82549e66187bb197cb300449f3ee2c06f315a96552bc6173fc5"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.383685 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7868997dbd-vwz94" event={"ID":"701dcccb-b7f5-4545-9442-a417185439f5","Type":"ContainerStarted","Data":"5863dd6c0a9f296a132493206db59e130216c1bd33e9a3c70d86eaa5246b1ce2"}
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.383706 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7868997dbd-vwz94" event={"ID":"701dcccb-b7f5-4545-9442-a417185439f5","Type":"ContainerStarted","Data":"b8e6086928f230cd142486361b3f37f84960fb104579edbffb00bd1d0382982e"}
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.383943 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7868997dbd-vwz94"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.428383 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6f955857ff-qd498" podStartSLOduration=2.428367125 podStartE2EDuration="2.428367125s" podCreationTimestamp="2026-03-13 16:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:51.400114895 +0000 UTC m=+6481.563326706" watchObservedRunningTime="2026-03-13 16:50:51.428367125 +0000 UTC m=+6481.591578926"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.456562 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7868997dbd-vwz94" podStartSLOduration=2.456545113 podStartE2EDuration="2.456545113s" podCreationTimestamp="2026-03-13 16:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:51.452090361 +0000 UTC m=+6481.615302182" watchObservedRunningTime="2026-03-13 16:50:51.456545113 +0000 UTC m=+6481.619756924"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.460589 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.462004 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-combined-ca-bundle\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.462165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-config-data-custom\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.462302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-public-tls-certs\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.462463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-internal-tls-certs\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.462540 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn782\" (UniqueName: \"kubernetes.io/projected/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-kube-api-access-zn782\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.462633 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-config-data\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.470018 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-internal-tls-certs\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.471475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-public-tls-certs\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.471931 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-config-data-custom\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.472654 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-config-data\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.478101 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-combined-ca-bundle\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.493595 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn782\" (UniqueName: \"kubernetes.io/projected/767cfaf3-e5cc-4fc0-9be0-26e357051ec3-kube-api-access-zn782\") pod \"heat-cfnapi-fd85f647c-9ndph\" (UID: \"767cfaf3-e5cc-4fc0-9be0-26e357051ec3\") " pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.749311 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:51 crc kubenswrapper[4786]: I0313 16:50:51.930511 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b44c95f89-pdjc6"]
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.257550 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fd85f647c-9ndph"]
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.402008 4786 generic.go:334] "Generic (PLEG): container finished" podID="701dcccb-b7f5-4545-9442-a417185439f5" containerID="5863dd6c0a9f296a132493206db59e130216c1bd33e9a3c70d86eaa5246b1ce2" exitCode=1
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.402087 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7868997dbd-vwz94" event={"ID":"701dcccb-b7f5-4545-9442-a417185439f5","Type":"ContainerDied","Data":"5863dd6c0a9f296a132493206db59e130216c1bd33e9a3c70d86eaa5246b1ce2"}
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.408147 4786 scope.go:117] "RemoveContainer" containerID="5863dd6c0a9f296a132493206db59e130216c1bd33e9a3c70d86eaa5246b1ce2"
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.412525 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b44c95f89-pdjc6" event={"ID":"c8a16c52-18c2-4d08-a87e-c32303cd89e9","Type":"ContainerStarted","Data":"083b636d673da5d42034f9af1ce206e764468bd2642cf800892f76022b86f6db"}
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.412575 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b44c95f89-pdjc6" event={"ID":"c8a16c52-18c2-4d08-a87e-c32303cd89e9","Type":"ContainerStarted","Data":"b3c9cbe1247ec5a5b2f4861b376be2ed9d8b3ad5a7cb6ea0732187fc8b89c0c5"}
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.413642 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b44c95f89-pdjc6"
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.428281 4786 generic.go:334] "Generic (PLEG): container finished" podID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerID="1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19" exitCode=1
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.428349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58974d5667-gmm9k" event={"ID":"f22627eb-d27d-46de-bf5e-d473f566d51a","Type":"ContainerDied","Data":"1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19"}
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.428388 4786 scope.go:117] "RemoveContainer" containerID="21241c87f883d82549e66187bb197cb300449f3ee2c06f315a96552bc6173fc5"
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.429180 4786 scope.go:117] "RemoveContainer" containerID="1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19"
Mar 13 16:50:52 crc kubenswrapper[4786]: E0313 16:50:52.429801 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58974d5667-gmm9k_openstack(f22627eb-d27d-46de-bf5e-d473f566d51a)\"" pod="openstack/heat-api-58974d5667-gmm9k" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a"
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.439095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fd85f647c-9ndph" event={"ID":"767cfaf3-e5cc-4fc0-9be0-26e357051ec3","Type":"ContainerStarted","Data":"a6fd0aba3787a507a662e65e4d79347255c644f2e4fa03c4ca32f0bf54643289"}
Mar 13 16:50:52 crc kubenswrapper[4786]: I0313 16:50:52.451547 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5b44c95f89-pdjc6" podStartSLOduration=1.451526141 podStartE2EDuration="1.451526141s" podCreationTimestamp="2026-03-13 16:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:52.439404557 +0000 UTC m=+6482.602616368" watchObservedRunningTime="2026-03-13 16:50:52.451526141 +0000 UTC m=+6482.614737952"
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.450741 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fd85f647c-9ndph" event={"ID":"767cfaf3-e5cc-4fc0-9be0-26e357051ec3","Type":"ContainerStarted","Data":"0b87113ef5dace8c2b78462a75a1dab91f335dc8f54e78dab9c629c2bdb8911c"}
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.451187 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-fd85f647c-9ndph"
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.452934 4786 generic.go:334] "Generic (PLEG): container finished" podID="701dcccb-b7f5-4545-9442-a417185439f5" containerID="3fdb85e75190bb87a21da4fef1033b7578d5afe8d64c3066fb43b041cba5aef2" exitCode=1
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.452972 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7868997dbd-vwz94" event={"ID":"701dcccb-b7f5-4545-9442-a417185439f5","Type":"ContainerDied","Data":"3fdb85e75190bb87a21da4fef1033b7578d5afe8d64c3066fb43b041cba5aef2"}
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.453007 4786 scope.go:117] "RemoveContainer" containerID="5863dd6c0a9f296a132493206db59e130216c1bd33e9a3c70d86eaa5246b1ce2"
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.453573 4786 scope.go:117] "RemoveContainer" containerID="3fdb85e75190bb87a21da4fef1033b7578d5afe8d64c3066fb43b041cba5aef2"
Mar 13 16:50:53 crc kubenswrapper[4786]: E0313 16:50:53.453799 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7868997dbd-vwz94_openstack(701dcccb-b7f5-4545-9442-a417185439f5)\"" pod="openstack/heat-cfnapi-7868997dbd-vwz94" podUID="701dcccb-b7f5-4545-9442-a417185439f5"
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.456918 4786 scope.go:117] "RemoveContainer" containerID="1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19"
Mar 13 16:50:53 crc kubenswrapper[4786]: E0313 16:50:53.457270 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58974d5667-gmm9k_openstack(f22627eb-d27d-46de-bf5e-d473f566d51a)\"" pod="openstack/heat-api-58974d5667-gmm9k" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a"
Mar 13 16:50:53 crc kubenswrapper[4786]: I0313 16:50:53.474382 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-fd85f647c-9ndph" podStartSLOduration=2.474363868 podStartE2EDuration="2.474363868s" podCreationTimestamp="2026-03-13 16:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:50:53.469910676 +0000 UTC m=+6483.633122497" watchObservedRunningTime="2026-03-13 16:50:53.474363868 +0000 UTC m=+6483.637575679"
Mar 13 16:50:54 crc kubenswrapper[4786]: I0313 16:50:54.481670 4786 scope.go:117] "RemoveContainer" containerID="3fdb85e75190bb87a21da4fef1033b7578d5afe8d64c3066fb43b041cba5aef2"
Mar 13 16:50:54 crc kubenswrapper[4786]: E0313 16:50:54.482171 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7868997dbd-vwz94_openstack(701dcccb-b7f5-4545-9442-a417185439f5)\"" pod="openstack/heat-cfnapi-7868997dbd-vwz94" podUID="701dcccb-b7f5-4545-9442-a417185439f5"
Mar 13 16:50:54 crc kubenswrapper[4786]: I0313 16:50:54.751807 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.160:8443: connect: connection refused"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.189488 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-58974d5667-gmm9k"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.190269 4786 scope.go:117] "RemoveContainer" containerID="1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19"
Mar 13 16:50:55 crc kubenswrapper[4786]: E0313 16:50:55.190538 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58974d5667-gmm9k_openstack(f22627eb-d27d-46de-bf5e-d473f566d51a)\"" pod="openstack/heat-api-58974d5667-gmm9k" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.190950 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-58974d5667-gmm9k"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.218114 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-7868997dbd-vwz94"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.218169 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7868997dbd-vwz94"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.490963 4786 scope.go:117] "RemoveContainer" containerID="3fdb85e75190bb87a21da4fef1033b7578d5afe8d64c3066fb43b041cba5aef2"
Mar 13 16:50:55 crc kubenswrapper[4786]: I0313 16:50:55.491283 4786 scope.go:117] "RemoveContainer" containerID="1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19"
Mar 13 16:50:55 crc kubenswrapper[4786]: E0313 16:50:55.491392 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7868997dbd-vwz94_openstack(701dcccb-b7f5-4545-9442-a417185439f5)\"" pod="openstack/heat-cfnapi-7868997dbd-vwz94" podUID="701dcccb-b7f5-4545-9442-a417185439f5"
Mar 13 16:50:55 crc kubenswrapper[4786]: E0313 16:50:55.491735 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58974d5667-gmm9k_openstack(f22627eb-d27d-46de-bf5e-d473f566d51a)\"" pod="openstack/heat-api-58974d5667-gmm9k" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a"
Mar 13 16:50:56 crc kubenswrapper[4786]: I0313 16:50:56.530150 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-56675d887f-rgjvs" podUID="faacc636-1527-4dae-bacb-e634a06765c3" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.170:8004/healthcheck\": read tcp 10.217.0.2:49156->10.217.1.170:8004: read: connection reset by peer"
Mar 13 16:50:56 crc kubenswrapper[4786]: I0313 16:50:56.558231 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.169:8000/healthcheck\": read tcp 10.217.0.2:58692->10.217.1.169:8000: read: connection reset by peer"
Mar 13 16:50:56 crc kubenswrapper[4786]: I0313 16:50:56.996361 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56675d887f-rgjvs"
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.094273 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l"
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.103117 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjqsv\" (UniqueName: \"kubernetes.io/projected/faacc636-1527-4dae-bacb-e634a06765c3-kube-api-access-fjqsv\") pod \"faacc636-1527-4dae-bacb-e634a06765c3\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.103339 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data-custom\") pod \"faacc636-1527-4dae-bacb-e634a06765c3\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.104108 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-combined-ca-bundle\") pod \"faacc636-1527-4dae-bacb-e634a06765c3\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.104336 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data\") pod \"faacc636-1527-4dae-bacb-e634a06765c3\" (UID: \"faacc636-1527-4dae-bacb-e634a06765c3\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.110453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faacc636-1527-4dae-bacb-e634a06765c3-kube-api-access-fjqsv" (OuterVolumeSpecName: "kube-api-access-fjqsv") pod "faacc636-1527-4dae-bacb-e634a06765c3" (UID: "faacc636-1527-4dae-bacb-e634a06765c3"). InnerVolumeSpecName "kube-api-access-fjqsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.114687 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "faacc636-1527-4dae-bacb-e634a06765c3" (UID: "faacc636-1527-4dae-bacb-e634a06765c3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.154029 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faacc636-1527-4dae-bacb-e634a06765c3" (UID: "faacc636-1527-4dae-bacb-e634a06765c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.182153 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data" (OuterVolumeSpecName: "config-data") pod "faacc636-1527-4dae-bacb-e634a06765c3" (UID: "faacc636-1527-4dae-bacb-e634a06765c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.207204 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data\") pod \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.207243 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-combined-ca-bundle\") pod \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.207468 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhkb\" (UniqueName: \"kubernetes.io/projected/b8f3c97c-cc24-4506-97a3-d04bd1864f24-kube-api-access-8lhkb\") pod \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.207486 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data-custom\") pod \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\" (UID: \"b8f3c97c-cc24-4506-97a3-d04bd1864f24\") "
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.208031 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.208050 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-combined-ca-bundle\") on
node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.208060 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faacc636-1527-4dae-bacb-e634a06765c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.208071 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjqsv\" (UniqueName: \"kubernetes.io/projected/faacc636-1527-4dae-bacb-e634a06765c3-kube-api-access-fjqsv\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.210675 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b8f3c97c-cc24-4506-97a3-d04bd1864f24" (UID: "b8f3c97c-cc24-4506-97a3-d04bd1864f24"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.211634 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f3c97c-cc24-4506-97a3-d04bd1864f24-kube-api-access-8lhkb" (OuterVolumeSpecName: "kube-api-access-8lhkb") pod "b8f3c97c-cc24-4506-97a3-d04bd1864f24" (UID: "b8f3c97c-cc24-4506-97a3-d04bd1864f24"). InnerVolumeSpecName "kube-api-access-8lhkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.232200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8f3c97c-cc24-4506-97a3-d04bd1864f24" (UID: "b8f3c97c-cc24-4506-97a3-d04bd1864f24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.276801 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data" (OuterVolumeSpecName: "config-data") pod "b8f3c97c-cc24-4506-97a3-d04bd1864f24" (UID: "b8f3c97c-cc24-4506-97a3-d04bd1864f24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.310312 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhkb\" (UniqueName: \"kubernetes.io/projected/b8f3c97c-cc24-4506-97a3-d04bd1864f24-kube-api-access-8lhkb\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.310342 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.310353 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.310363 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f3c97c-cc24-4506-97a3-d04bd1864f24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.515436 4786 generic.go:334] "Generic (PLEG): container finished" podID="faacc636-1527-4dae-bacb-e634a06765c3" containerID="5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224" exitCode=0 Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.515484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56675d887f-rgjvs" 
event={"ID":"faacc636-1527-4dae-bacb-e634a06765c3","Type":"ContainerDied","Data":"5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224"} Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.515550 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56675d887f-rgjvs" event={"ID":"faacc636-1527-4dae-bacb-e634a06765c3","Type":"ContainerDied","Data":"83effa430ec0647f08b9ac16dfa9d60b7826e94e5c64a7e15ab5a6a0d6c77d8e"} Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.515552 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56675d887f-rgjvs" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.515581 4786 scope.go:117] "RemoveContainer" containerID="5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.519530 4786 generic.go:334] "Generic (PLEG): container finished" podID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerID="b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28" exitCode=0 Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.519598 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" event={"ID":"b8f3c97c-cc24-4506-97a3-d04bd1864f24","Type":"ContainerDied","Data":"b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28"} Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.519627 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.519722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c75c58b4b-jk68l" event={"ID":"b8f3c97c-cc24-4506-97a3-d04bd1864f24","Type":"ContainerDied","Data":"07cb7d2bc8c3acb15a8727d14bc93850cd1960b00783101b217ecf08e8e1b201"} Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.567844 4786 scope.go:117] "RemoveContainer" containerID="5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224" Mar 13 16:50:57 crc kubenswrapper[4786]: E0313 16:50:57.568495 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224\": container with ID starting with 5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224 not found: ID does not exist" containerID="5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.568545 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224"} err="failed to get container status \"5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224\": rpc error: code = NotFound desc = could not find container \"5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224\": container with ID starting with 5aec1197d84181e361e041a7f7e7f981f13c89ed1258c88b57759196b52a3224 not found: ID does not exist" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.568584 4786 scope.go:117] "RemoveContainer" containerID="b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.589616 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56675d887f-rgjvs"] Mar 13 16:50:57 crc kubenswrapper[4786]: 
I0313 16:50:57.604414 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-56675d887f-rgjvs"] Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.622641 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c75c58b4b-jk68l"] Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.622680 4786 scope.go:117] "RemoveContainer" containerID="b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28" Mar 13 16:50:57 crc kubenswrapper[4786]: E0313 16:50:57.623559 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28\": container with ID starting with b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28 not found: ID does not exist" containerID="b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.623656 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28"} err="failed to get container status \"b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28\": rpc error: code = NotFound desc = could not find container \"b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28\": container with ID starting with b660b128a356f77453708c8a7e91804ef3e47efb41bd13ae014ac00ff1adbd28 not found: ID does not exist" Mar 13 16:50:57 crc kubenswrapper[4786]: I0313 16:50:57.639611 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6c75c58b4b-jk68l"] Mar 13 16:50:58 crc kubenswrapper[4786]: I0313 16:50:58.572731 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" path="/var/lib/kubelet/pods/b8f3c97c-cc24-4506-97a3-d04bd1864f24/volumes" Mar 13 16:50:58 crc kubenswrapper[4786]: I0313 16:50:58.574282 
4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faacc636-1527-4dae-bacb-e634a06765c3" path="/var/lib/kubelet/pods/faacc636-1527-4dae-bacb-e634a06765c3/volumes" Mar 13 16:51:00 crc kubenswrapper[4786]: I0313 16:51:00.167674 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6f955857ff-qd498" Mar 13 16:51:00 crc kubenswrapper[4786]: I0313 16:51:00.256988 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-664c6d546f-hgh75"] Mar 13 16:51:00 crc kubenswrapper[4786]: I0313 16:51:00.257310 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-664c6d546f-hgh75" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerName="heat-engine" containerID="cri-o://441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" gracePeriod=60 Mar 13 16:51:00 crc kubenswrapper[4786]: E0313 16:51:00.259966 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 16:51:00 crc kubenswrapper[4786]: E0313 16:51:00.262569 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 16:51:00 crc kubenswrapper[4786]: E0313 16:51:00.265794 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 16:51:00 crc kubenswrapper[4786]: E0313 16:51:00.265842 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-664c6d546f-hgh75" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerName="heat-engine" Mar 13 16:51:02 crc kubenswrapper[4786]: I0313 16:51:02.790504 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5b44c95f89-pdjc6" Mar 13 16:51:02 crc kubenswrapper[4786]: I0313 16:51:02.881927 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-58974d5667-gmm9k"] Mar 13 16:51:02 crc kubenswrapper[4786]: E0313 16:51:02.953928 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 16:51:02 crc kubenswrapper[4786]: E0313 16:51:02.955234 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 16:51:02 crc kubenswrapper[4786]: E0313 16:51:02.959272 4786 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" 
cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 16:51:02 crc kubenswrapper[4786]: E0313 16:51:02.959310 4786 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-664c6d546f-hgh75" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerName="heat-engine" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.038295 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-fd85f647c-9ndph" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.116050 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7868997dbd-vwz94"] Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.314579 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.379470 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data\") pod \"f22627eb-d27d-46de-bf5e-d473f566d51a\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.379540 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-combined-ca-bundle\") pod \"f22627eb-d27d-46de-bf5e-d473f566d51a\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.379608 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cnwm\" (UniqueName: \"kubernetes.io/projected/f22627eb-d27d-46de-bf5e-d473f566d51a-kube-api-access-5cnwm\") pod \"f22627eb-d27d-46de-bf5e-d473f566d51a\" (UID: 
\"f22627eb-d27d-46de-bf5e-d473f566d51a\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.379693 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data-custom\") pod \"f22627eb-d27d-46de-bf5e-d473f566d51a\" (UID: \"f22627eb-d27d-46de-bf5e-d473f566d51a\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.385006 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f22627eb-d27d-46de-bf5e-d473f566d51a" (UID: "f22627eb-d27d-46de-bf5e-d473f566d51a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.385834 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22627eb-d27d-46de-bf5e-d473f566d51a-kube-api-access-5cnwm" (OuterVolumeSpecName: "kube-api-access-5cnwm") pod "f22627eb-d27d-46de-bf5e-d473f566d51a" (UID: "f22627eb-d27d-46de-bf5e-d473f566d51a"). InnerVolumeSpecName "kube-api-access-5cnwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.449046 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f22627eb-d27d-46de-bf5e-d473f566d51a" (UID: "f22627eb-d27d-46de-bf5e-d473f566d51a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.454100 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data" (OuterVolumeSpecName: "config-data") pod "f22627eb-d27d-46de-bf5e-d473f566d51a" (UID: "f22627eb-d27d-46de-bf5e-d473f566d51a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.483385 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cnwm\" (UniqueName: \"kubernetes.io/projected/f22627eb-d27d-46de-bf5e-d473f566d51a-kube-api-access-5cnwm\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.483419 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.483429 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.483437 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22627eb-d27d-46de-bf5e-d473f566d51a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.514386 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.584418 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data-custom\") pod \"701dcccb-b7f5-4545-9442-a417185439f5\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.584736 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29twd\" (UniqueName: \"kubernetes.io/projected/701dcccb-b7f5-4545-9442-a417185439f5-kube-api-access-29twd\") pod \"701dcccb-b7f5-4545-9442-a417185439f5\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.584840 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-combined-ca-bundle\") pod \"701dcccb-b7f5-4545-9442-a417185439f5\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.584909 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data\") pod \"701dcccb-b7f5-4545-9442-a417185439f5\" (UID: \"701dcccb-b7f5-4545-9442-a417185439f5\") " Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.588083 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701dcccb-b7f5-4545-9442-a417185439f5-kube-api-access-29twd" (OuterVolumeSpecName: "kube-api-access-29twd") pod "701dcccb-b7f5-4545-9442-a417185439f5" (UID: "701dcccb-b7f5-4545-9442-a417185439f5"). InnerVolumeSpecName "kube-api-access-29twd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.590096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "701dcccb-b7f5-4545-9442-a417185439f5" (UID: "701dcccb-b7f5-4545-9442-a417185439f5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.596541 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7868997dbd-vwz94" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.596976 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7868997dbd-vwz94" event={"ID":"701dcccb-b7f5-4545-9442-a417185439f5","Type":"ContainerDied","Data":"b8e6086928f230cd142486361b3f37f84960fb104579edbffb00bd1d0382982e"} Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.597057 4786 scope.go:117] "RemoveContainer" containerID="3fdb85e75190bb87a21da4fef1033b7578d5afe8d64c3066fb43b041cba5aef2" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.599315 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58974d5667-gmm9k" event={"ID":"f22627eb-d27d-46de-bf5e-d473f566d51a","Type":"ContainerDied","Data":"89877328954f909e5cecff16e205bdac67e4156205bf5e62048f9a56e96041f0"} Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.599433 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-58974d5667-gmm9k" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.616525 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701dcccb-b7f5-4545-9442-a417185439f5" (UID: "701dcccb-b7f5-4545-9442-a417185439f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.653682 4786 scope.go:117] "RemoveContainer" containerID="1b636a01ced5ecc331b19067cea41710a4b6da58d7c88f8a7b4e7082b3879c19" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.657342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data" (OuterVolumeSpecName: "config-data") pod "701dcccb-b7f5-4545-9442-a417185439f5" (UID: "701dcccb-b7f5-4545-9442-a417185439f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.661640 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-58974d5667-gmm9k"] Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.673093 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-58974d5667-gmm9k"] Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.688103 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.688140 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29twd\" (UniqueName: \"kubernetes.io/projected/701dcccb-b7f5-4545-9442-a417185439f5-kube-api-access-29twd\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.688158 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.688175 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dcccb-b7f5-4545-9442-a417185439f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.942886 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7868997dbd-vwz94"] Mar 13 16:51:03 crc kubenswrapper[4786]: I0313 16:51:03.954607 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7868997dbd-vwz94"] Mar 13 16:51:04 crc kubenswrapper[4786]: I0313 16:51:04.568497 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701dcccb-b7f5-4545-9442-a417185439f5" 
path="/var/lib/kubelet/pods/701dcccb-b7f5-4545-9442-a417185439f5/volumes" Mar 13 16:51:04 crc kubenswrapper[4786]: I0313 16:51:04.569852 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" path="/var/lib/kubelet/pods/f22627eb-d27d-46de-bf5e-d473f566d51a/volumes" Mar 13 16:51:04 crc kubenswrapper[4786]: I0313 16:51:04.752603 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.160:8443: connect: connection refused" Mar 13 16:51:04 crc kubenswrapper[4786]: I0313 16:51:04.753030 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6667bbdf64-hcxzc" Mar 13 16:51:10 crc kubenswrapper[4786]: I0313 16:51:10.705933 4786 generic.go:334] "Generic (PLEG): container finished" podID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36" exitCode=0 Mar 13 16:51:10 crc kubenswrapper[4786]: I0313 16:51:10.706030 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-664c6d546f-hgh75" event={"ID":"60d28482-12bb-4f4b-870e-92425c5aedc4","Type":"ContainerDied","Data":"441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36"} Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.189405 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.257656 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data-custom\") pod \"60d28482-12bb-4f4b-870e-92425c5aedc4\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") "
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.257802 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-combined-ca-bundle\") pod \"60d28482-12bb-4f4b-870e-92425c5aedc4\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") "
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.258043 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmd8\" (UniqueName: \"kubernetes.io/projected/60d28482-12bb-4f4b-870e-92425c5aedc4-kube-api-access-6rmd8\") pod \"60d28482-12bb-4f4b-870e-92425c5aedc4\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") "
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.258081 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data\") pod \"60d28482-12bb-4f4b-870e-92425c5aedc4\" (UID: \"60d28482-12bb-4f4b-870e-92425c5aedc4\") "
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.266996 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60d28482-12bb-4f4b-870e-92425c5aedc4" (UID: "60d28482-12bb-4f4b-870e-92425c5aedc4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.268951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d28482-12bb-4f4b-870e-92425c5aedc4-kube-api-access-6rmd8" (OuterVolumeSpecName: "kube-api-access-6rmd8") pod "60d28482-12bb-4f4b-870e-92425c5aedc4" (UID: "60d28482-12bb-4f4b-870e-92425c5aedc4"). InnerVolumeSpecName "kube-api-access-6rmd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.291463 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60d28482-12bb-4f4b-870e-92425c5aedc4" (UID: "60d28482-12bb-4f4b-870e-92425c5aedc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.335328 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data" (OuterVolumeSpecName: "config-data") pod "60d28482-12bb-4f4b-870e-92425c5aedc4" (UID: "60d28482-12bb-4f4b-870e-92425c5aedc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.365028 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.365059 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rmd8\" (UniqueName: \"kubernetes.io/projected/60d28482-12bb-4f4b-870e-92425c5aedc4-kube-api-access-6rmd8\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.365069 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.365078 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d28482-12bb-4f4b-870e-92425c5aedc4-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.719235 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-664c6d546f-hgh75" event={"ID":"60d28482-12bb-4f4b-870e-92425c5aedc4","Type":"ContainerDied","Data":"1026d66db5d92d4a7695196735ae11da8da3207cd0b87d90f590f1e8c2a31c88"}
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.719647 4786 scope.go:117] "RemoveContainer" containerID="441ee9573b6b7e9220d39a98af31b55026e767799652712d1a295229b7155b36"
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.719900 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-664c6d546f-hgh75"
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.783084 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-664c6d546f-hgh75"]
Mar 13 16:51:11 crc kubenswrapper[4786]: I0313 16:51:11.791380 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-664c6d546f-hgh75"]
Mar 13 16:51:12 crc kubenswrapper[4786]: I0313 16:51:12.570134 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" path="/var/lib/kubelet/pods/60d28482-12bb-4f4b-870e-92425c5aedc4/volumes"
Mar 13 16:51:14 crc kubenswrapper[4786]: I0313 16:51:14.781452 4786 generic.go:334] "Generic (PLEG): container finished" podID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerID="028ac99db632f4107ca629da7a995de2180c047ada30e587a00d63b7fabc1ae2" exitCode=137
Mar 13 16:51:14 crc kubenswrapper[4786]: I0313 16:51:14.781515 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667bbdf64-hcxzc" event={"ID":"a2213440-3009-4db0-a1c4-7d5d4a12481b","Type":"ContainerDied","Data":"028ac99db632f4107ca629da7a995de2180c047ada30e587a00d63b7fabc1ae2"}
Mar 13 16:51:14 crc kubenswrapper[4786]: I0313 16:51:14.961168 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtz95\" (UniqueName: \"kubernetes.io/projected/a2213440-3009-4db0-a1c4-7d5d4a12481b-kube-api-access-wtz95\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053183 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2213440-3009-4db0-a1c4-7d5d4a12481b-logs\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053224 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-config-data\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053274 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-secret-key\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053328 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-scripts\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-combined-ca-bundle\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.053498 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-tls-certs\") pod \"a2213440-3009-4db0-a1c4-7d5d4a12481b\" (UID: \"a2213440-3009-4db0-a1c4-7d5d4a12481b\") "
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.054125 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2213440-3009-4db0-a1c4-7d5d4a12481b-logs" (OuterVolumeSpecName: "logs") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.054931 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2213440-3009-4db0-a1c4-7d5d4a12481b-logs\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.060625 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.068167 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2213440-3009-4db0-a1c4-7d5d4a12481b-kube-api-access-wtz95" (OuterVolumeSpecName: "kube-api-access-wtz95") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "kube-api-access-wtz95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.084376 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.087691 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-scripts" (OuterVolumeSpecName: "scripts") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.087820 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-config-data" (OuterVolumeSpecName: "config-data") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.116106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a2213440-3009-4db0-a1c4-7d5d4a12481b" (UID: "a2213440-3009-4db0-a1c4-7d5d4a12481b"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.156726 4786 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.156763 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.156776 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.156789 4786 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2213440-3009-4db0-a1c4-7d5d4a12481b-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.156803 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtz95\" (UniqueName: \"kubernetes.io/projected/a2213440-3009-4db0-a1c4-7d5d4a12481b-kube-api-access-wtz95\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.156818 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2213440-3009-4db0-a1c4-7d5d4a12481b-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.799539 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667bbdf64-hcxzc" event={"ID":"a2213440-3009-4db0-a1c4-7d5d4a12481b","Type":"ContainerDied","Data":"f32e7fdc3358b37e56e73572ebd753139df3204e7a4da38e21f1a41e80c234cd"}
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.800094 4786 scope.go:117] "RemoveContainer" containerID="586157e2794d58c3593b5f8272f7d0fc627737484ca31706ba7d96225f6404e0"
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.799717 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6667bbdf64-hcxzc"
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.869491 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6667bbdf64-hcxzc"]
Mar 13 16:51:15 crc kubenswrapper[4786]: I0313 16:51:15.882556 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6667bbdf64-hcxzc"]
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.037340 4786 scope.go:117] "RemoveContainer" containerID="028ac99db632f4107ca629da7a995de2180c047ada30e587a00d63b7fabc1ae2"
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.073249 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7f99v"]
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.114306 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7f99v"]
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.576167 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="155b42b2-bff4-4522-88b5-79a356cf2c9b" path="/var/lib/kubelet/pods/155b42b2-bff4-4522-88b5-79a356cf2c9b/volumes"
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.577692 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" path="/var/lib/kubelet/pods/a2213440-3009-4db0-a1c4-7d5d4a12481b/volumes"
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.844935 4786 scope.go:117] "RemoveContainer" containerID="413dd7bd2a10d71ebc2dd6ec5d8b04cd0623e6839a169533acf70d39c63844db"
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.891040 4786 scope.go:117] "RemoveContainer" containerID="ee691082331590c03e45be6cc563420fdbecd6894a6ff59f77020466a4d10b5f"
Mar 13 16:51:16 crc kubenswrapper[4786]: I0313 16:51:16.942331 4786 scope.go:117] "RemoveContainer" containerID="15bb4362b12a5a94f398ac340672a96253d6cf023c83afd7ec81124664d2a4db"
Mar 13 16:51:17 crc kubenswrapper[4786]: I0313 16:51:17.058914 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a45a-account-create-update-fc84c"]
Mar 13 16:51:17 crc kubenswrapper[4786]: I0313 16:51:17.071173 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a45a-account-create-update-fc84c"]
Mar 13 16:51:18 crc kubenswrapper[4786]: I0313 16:51:18.568188 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81bcc1d5-7c5e-4848-a120-fd1438a9eefa" path="/var/lib/kubelet/pods/81bcc1d5-7c5e-4848-a120-fd1438a9eefa/volumes"
Mar 13 16:51:19 crc kubenswrapper[4786]: I0313 16:51:19.752797 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6667bbdf64-hcxzc" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.160:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 16:51:25 crc kubenswrapper[4786]: I0313 16:51:25.066505 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jvlvx"]
Mar 13 16:51:25 crc kubenswrapper[4786]: I0313 16:51:25.084913 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jvlvx"]
Mar 13 16:51:26 crc kubenswrapper[4786]: I0313 16:51:26.565075 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbd3a91-0884-41f5-804a-6407a7521819" path="/var/lib/kubelet/pods/4dbd3a91-0884-41f5-804a-6407a7521819/volumes"
Mar 13 16:51:37 crc kubenswrapper[4786]: I0313 16:51:37.868763 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 16:51:37 crc kubenswrapper[4786]: I0313 16:51:37.869434 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.701123 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"]
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702062 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerName="heat-engine"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702077 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerName="heat-engine"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702094 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701dcccb-b7f5-4545-9442-a417185439f5" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702102 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="701dcccb-b7f5-4545-9442-a417185439f5" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702114 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702122 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702139 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702148 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702161 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faacc636-1527-4dae-bacb-e634a06765c3" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702169 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="faacc636-1527-4dae-bacb-e634a06765c3" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702181 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702189 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702215 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon-log"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702223 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon-log"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702235 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701dcccb-b7f5-4545-9442-a417185439f5" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702244 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="701dcccb-b7f5-4545-9442-a417185439f5" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: E0313 16:51:39.702263 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702271 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702501 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f3c97c-cc24-4506-97a3-d04bd1864f24" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702517 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702541 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="701dcccb-b7f5-4545-9442-a417185439f5" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702560 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702584 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="701dcccb-b7f5-4545-9442-a417185439f5" containerName="heat-cfnapi"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702598 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2213440-3009-4db0-a1c4-7d5d4a12481b" containerName="horizon-log"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702616 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d28482-12bb-4f4b-870e-92425c5aedc4" containerName="heat-engine"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702630 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="faacc636-1527-4dae-bacb-e634a06765c3" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.702643 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22627eb-d27d-46de-bf5e-d473f566d51a" containerName="heat-api"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.704359 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.707169 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.713624 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"]
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.814710 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.814898 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvkzm\" (UniqueName: \"kubernetes.io/projected/7eef90e3-9167-4e4c-941e-943965d480a3-kube-api-access-bvkzm\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.814966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.917429 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.917548 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvkzm\" (UniqueName: \"kubernetes.io/projected/7eef90e3-9167-4e4c-941e-943965d480a3-kube-api-access-bvkzm\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.917624 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.918062 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.918145 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:39 crc kubenswrapper[4786]: I0313 16:51:39.937511 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvkzm\" (UniqueName: \"kubernetes.io/projected/7eef90e3-9167-4e4c-941e-943965d480a3-kube-api-access-bvkzm\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:40 crc kubenswrapper[4786]: I0313 16:51:40.036738 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:40 crc kubenswrapper[4786]: I0313 16:51:40.379006 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"]
Mar 13 16:51:41 crc kubenswrapper[4786]: I0313 16:51:41.079069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr" event={"ID":"7eef90e3-9167-4e4c-941e-943965d480a3","Type":"ContainerStarted","Data":"2ec2f05d46f4871ebdbd54ff2c26874dea0e00548ed2110a1b4b9a0a4a85d48d"}
Mar 13 16:51:41 crc kubenswrapper[4786]: I0313 16:51:41.079492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr" event={"ID":"7eef90e3-9167-4e4c-941e-943965d480a3","Type":"ContainerStarted","Data":"b9cd1df05bdfb43abde0ed60eb58262b1945a0f0b3fb6ccd93c14d9e5224370c"}
Mar 13 16:51:41 crc kubenswrapper[4786]: E0313 16:51:41.222193 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eef90e3_9167_4e4c_941e_943965d480a3.slice/crio-conmon-2ec2f05d46f4871ebdbd54ff2c26874dea0e00548ed2110a1b4b9a0a4a85d48d.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 16:51:42 crc kubenswrapper[4786]: I0313 16:51:42.095898 4786 generic.go:334] "Generic (PLEG): container finished" podID="7eef90e3-9167-4e4c-941e-943965d480a3" containerID="2ec2f05d46f4871ebdbd54ff2c26874dea0e00548ed2110a1b4b9a0a4a85d48d" exitCode=0
Mar 13 16:51:42 crc kubenswrapper[4786]: I0313 16:51:42.095988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr" event={"ID":"7eef90e3-9167-4e4c-941e-943965d480a3","Type":"ContainerDied","Data":"2ec2f05d46f4871ebdbd54ff2c26874dea0e00548ed2110a1b4b9a0a4a85d48d"}
Mar 13 16:51:42 crc kubenswrapper[4786]: I0313 16:51:42.099208 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 16:51:45 crc kubenswrapper[4786]: I0313 16:51:45.144900 4786 generic.go:334] "Generic (PLEG): container finished" podID="7eef90e3-9167-4e4c-941e-943965d480a3" containerID="8fa641d24aba4dbc142aa07b3df4c5636a5ac2cb94ee223627f5503337fc4855" exitCode=0
Mar 13 16:51:45 crc kubenswrapper[4786]: I0313 16:51:45.145534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr" event={"ID":"7eef90e3-9167-4e4c-941e-943965d480a3","Type":"ContainerDied","Data":"8fa641d24aba4dbc142aa07b3df4c5636a5ac2cb94ee223627f5503337fc4855"}
Mar 13 16:51:46 crc kubenswrapper[4786]: I0313 16:51:46.155026 4786 generic.go:334] "Generic (PLEG): container finished" podID="7eef90e3-9167-4e4c-941e-943965d480a3" containerID="807074d542b03bd79c39b64173c1387a55451c675c5cc4d2673368a1ec277cc4" exitCode=0
Mar 13 16:51:46 crc kubenswrapper[4786]: I0313 16:51:46.155072 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr" event={"ID":"7eef90e3-9167-4e4c-941e-943965d480a3","Type":"ContainerDied","Data":"807074d542b03bd79c39b64173c1387a55451c675c5cc4d2673368a1ec277cc4"}
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.554725 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.697165 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvkzm\" (UniqueName: \"kubernetes.io/projected/7eef90e3-9167-4e4c-941e-943965d480a3-kube-api-access-bvkzm\") pod \"7eef90e3-9167-4e4c-941e-943965d480a3\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") "
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.697639 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-util\") pod \"7eef90e3-9167-4e4c-941e-943965d480a3\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") "
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.697675 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-bundle\") pod \"7eef90e3-9167-4e4c-941e-943965d480a3\" (UID: \"7eef90e3-9167-4e4c-941e-943965d480a3\") "
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.699791 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-bundle" (OuterVolumeSpecName: "bundle") pod "7eef90e3-9167-4e4c-941e-943965d480a3" (UID: "7eef90e3-9167-4e4c-941e-943965d480a3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.704147 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eef90e3-9167-4e4c-941e-943965d480a3-kube-api-access-bvkzm" (OuterVolumeSpecName: "kube-api-access-bvkzm") pod "7eef90e3-9167-4e4c-941e-943965d480a3" (UID: "7eef90e3-9167-4e4c-941e-943965d480a3"). InnerVolumeSpecName "kube-api-access-bvkzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.707345 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-util" (OuterVolumeSpecName: "util") pod "7eef90e3-9167-4e4c-941e-943965d480a3" (UID: "7eef90e3-9167-4e4c-941e-943965d480a3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.799996 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-util\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.800026 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7eef90e3-9167-4e4c-941e-943965d480a3-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:47 crc kubenswrapper[4786]: I0313 16:51:47.800035 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvkzm\" (UniqueName: \"kubernetes.io/projected/7eef90e3-9167-4e4c-941e-943965d480a3-kube-api-access-bvkzm\") on node \"crc\" DevicePath \"\""
Mar 13 16:51:48 crc kubenswrapper[4786]: I0313 16:51:48.178422 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr" event={"ID":"7eef90e3-9167-4e4c-941e-943965d480a3","Type":"ContainerDied","Data":"b9cd1df05bdfb43abde0ed60eb58262b1945a0f0b3fb6ccd93c14d9e5224370c"}
Mar 13 16:51:48 crc kubenswrapper[4786]: I0313 16:51:48.178487 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9cd1df05bdfb43abde0ed60eb58262b1945a0f0b3fb6ccd93c14d9e5224370c"
Mar 13 16:51:48 crc kubenswrapper[4786]: I0313 16:51:48.178512 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.529149 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn"]
Mar 13 16:51:58 crc kubenswrapper[4786]: E0313 16:51:58.530116 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="pull"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.530129 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="pull"
Mar 13 16:51:58 crc kubenswrapper[4786]: E0313 16:51:58.530145 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="util"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.530151 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="util"
Mar 13 16:51:58 crc kubenswrapper[4786]: E0313 16:51:58.530178 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="extract"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.530185 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="extract"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.530359 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eef90e3-9167-4e4c-941e-943965d480a3" containerName="extract"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.531024 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.532839 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.533309 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-fdf6f"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.533469 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.546511 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn"]
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.618017 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwb2\" (UniqueName: \"kubernetes.io/projected/d2886207-16dc-47f5-bc4d-dba0a8a55ed1-kube-api-access-fwwb2\") pod \"obo-prometheus-operator-68bc856cb9-2rkbn\" (UID: \"d2886207-16dc-47f5-bc4d-dba0a8a55ed1\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn"
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.660869 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd"]
Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.662208 4786 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.665807 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-whv79" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.666023 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.672182 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp"] Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.676363 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.687328 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd"] Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.699386 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp"] Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.720309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwb2\" (UniqueName: \"kubernetes.io/projected/d2886207-16dc-47f5-bc4d-dba0a8a55ed1-kube-api-access-fwwb2\") pod \"obo-prometheus-operator-68bc856cb9-2rkbn\" (UID: \"d2886207-16dc-47f5-bc4d-dba0a8a55ed1\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.720377 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cdb71ef5-96d3-42b6-86d9-05be37f6961a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd\" (UID: \"cdb71ef5-96d3-42b6-86d9-05be37f6961a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.720411 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe82eb-830c-4fc5-b855-421e358620d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-c7skp\" (UID: \"23fe82eb-830c-4fc5-b855-421e358620d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.720479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe82eb-830c-4fc5-b855-421e358620d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-c7skp\" (UID: \"23fe82eb-830c-4fc5-b855-421e358620d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.720545 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cdb71ef5-96d3-42b6-86d9-05be37f6961a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd\" (UID: \"cdb71ef5-96d3-42b6-86d9-05be37f6961a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.745709 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwb2\" (UniqueName: \"kubernetes.io/projected/d2886207-16dc-47f5-bc4d-dba0a8a55ed1-kube-api-access-fwwb2\") pod 
\"obo-prometheus-operator-68bc856cb9-2rkbn\" (UID: \"d2886207-16dc-47f5-bc4d-dba0a8a55ed1\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.822485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe82eb-830c-4fc5-b855-421e358620d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-c7skp\" (UID: \"23fe82eb-830c-4fc5-b855-421e358620d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.822558 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cdb71ef5-96d3-42b6-86d9-05be37f6961a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd\" (UID: \"cdb71ef5-96d3-42b6-86d9-05be37f6961a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.822668 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cdb71ef5-96d3-42b6-86d9-05be37f6961a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd\" (UID: \"cdb71ef5-96d3-42b6-86d9-05be37f6961a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.822690 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe82eb-830c-4fc5-b855-421e358620d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-c7skp\" (UID: \"23fe82eb-830c-4fc5-b855-421e358620d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 
crc kubenswrapper[4786]: I0313 16:51:58.826267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/23fe82eb-830c-4fc5-b855-421e358620d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-c7skp\" (UID: \"23fe82eb-830c-4fc5-b855-421e358620d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.831094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/23fe82eb-830c-4fc5-b855-421e358620d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-c7skp\" (UID: \"23fe82eb-830c-4fc5-b855-421e358620d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.832749 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cdb71ef5-96d3-42b6-86d9-05be37f6961a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd\" (UID: \"cdb71ef5-96d3-42b6-86d9-05be37f6961a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.838430 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cdb71ef5-96d3-42b6-86d9-05be37f6961a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd\" (UID: \"cdb71ef5-96d3-42b6-86d9-05be37f6961a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.861645 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.873802 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-n4977"] Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.875018 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.878838 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.883691 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-wdl5x" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.892739 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-n4977"] Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.923993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6wc\" (UniqueName: \"kubernetes.io/projected/e11733a0-03f3-4862-b5f8-92b7cc51ae99-kube-api-access-4z6wc\") pod \"observability-operator-59bdc8b94-n4977\" (UID: \"e11733a0-03f3-4862-b5f8-92b7cc51ae99\") " pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.924288 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e11733a0-03f3-4862-b5f8-92b7cc51ae99-observability-operator-tls\") pod \"observability-operator-59bdc8b94-n4977\" (UID: \"e11733a0-03f3-4862-b5f8-92b7cc51ae99\") " pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 
16:51:58.982613 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" Mar 13 16:51:58 crc kubenswrapper[4786]: I0313 16:51:58.994120 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.031760 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e11733a0-03f3-4862-b5f8-92b7cc51ae99-observability-operator-tls\") pod \"observability-operator-59bdc8b94-n4977\" (UID: \"e11733a0-03f3-4862-b5f8-92b7cc51ae99\") " pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.031912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6wc\" (UniqueName: \"kubernetes.io/projected/e11733a0-03f3-4862-b5f8-92b7cc51ae99-kube-api-access-4z6wc\") pod \"observability-operator-59bdc8b94-n4977\" (UID: \"e11733a0-03f3-4862-b5f8-92b7cc51ae99\") " pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.065735 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e11733a0-03f3-4862-b5f8-92b7cc51ae99-observability-operator-tls\") pod \"observability-operator-59bdc8b94-n4977\" (UID: \"e11733a0-03f3-4862-b5f8-92b7cc51ae99\") " pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.092971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6wc\" (UniqueName: \"kubernetes.io/projected/e11733a0-03f3-4862-b5f8-92b7cc51ae99-kube-api-access-4z6wc\") pod 
\"observability-operator-59bdc8b94-n4977\" (UID: \"e11733a0-03f3-4862-b5f8-92b7cc51ae99\") " pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.156977 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jx4bl"] Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.173243 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.184367 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-xdh9j" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.203382 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jx4bl"] Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.242086 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhtv\" (UniqueName: \"kubernetes.io/projected/bdcbef08-46bc-46ef-8342-d8128a2d4de1-kube-api-access-hkhtv\") pod \"perses-operator-5bf474d74f-jx4bl\" (UID: \"bdcbef08-46bc-46ef-8342-d8128a2d4de1\") " pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.242207 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdcbef08-46bc-46ef-8342-d8128a2d4de1-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jx4bl\" (UID: \"bdcbef08-46bc-46ef-8342-d8128a2d4de1\") " pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.305335 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.350410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhtv\" (UniqueName: \"kubernetes.io/projected/bdcbef08-46bc-46ef-8342-d8128a2d4de1-kube-api-access-hkhtv\") pod \"perses-operator-5bf474d74f-jx4bl\" (UID: \"bdcbef08-46bc-46ef-8342-d8128a2d4de1\") " pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.350508 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdcbef08-46bc-46ef-8342-d8128a2d4de1-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jx4bl\" (UID: \"bdcbef08-46bc-46ef-8342-d8128a2d4de1\") " pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.351401 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdcbef08-46bc-46ef-8342-d8128a2d4de1-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jx4bl\" (UID: \"bdcbef08-46bc-46ef-8342-d8128a2d4de1\") " pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.415043 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhtv\" (UniqueName: \"kubernetes.io/projected/bdcbef08-46bc-46ef-8342-d8128a2d4de1-kube-api-access-hkhtv\") pod \"perses-operator-5bf474d74f-jx4bl\" (UID: \"bdcbef08-46bc-46ef-8342-d8128a2d4de1\") " pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:51:59 crc kubenswrapper[4786]: I0313 16:51:59.510423 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.131897 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn"] Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.148270 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557012-j6lqx"] Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.149911 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.160371 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.160486 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.160595 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.164250 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557012-j6lqx"] Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.176152 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5ql\" (UniqueName: \"kubernetes.io/projected/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd-kube-api-access-bs5ql\") pod \"auto-csr-approver-29557012-j6lqx\" (UID: \"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd\") " pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.269140 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-n4977"] Mar 13 16:52:00 crc kubenswrapper[4786]: 
I0313 16:52:00.279259 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp"] Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.280326 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5ql\" (UniqueName: \"kubernetes.io/projected/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd-kube-api-access-bs5ql\") pod \"auto-csr-approver-29557012-j6lqx\" (UID: \"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd\") " pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.319272 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5ql\" (UniqueName: \"kubernetes.io/projected/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd-kube-api-access-bs5ql\") pod \"auto-csr-approver-29557012-j6lqx\" (UID: \"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd\") " pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.323923 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd"] Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.351207 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" event={"ID":"23fe82eb-830c-4fc5-b855-421e358620d5","Type":"ContainerStarted","Data":"03b0eeef5b3eef957e78265462a46d5e4663bbb4c163eefbda3015fbda06d783"} Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.352305 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jx4bl"] Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.352768 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-n4977" 
event={"ID":"e11733a0-03f3-4862-b5f8-92b7cc51ae99","Type":"ContainerStarted","Data":"16392c84c7eac5c8e2ac8506a9f67038bb4aefdcb2fdbb675006a4962a6b4560"} Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.354492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" event={"ID":"bdcbef08-46bc-46ef-8342-d8128a2d4de1","Type":"ContainerStarted","Data":"56d01bbc9d3d55bbddd1adf4044c2e2201332253710bd0c7918bfd791f817695"} Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.355940 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" event={"ID":"d2886207-16dc-47f5-bc4d-dba0a8a55ed1","Type":"ContainerStarted","Data":"26e177615f94273f79f963ef0523c6721e1d062244263b27889ccc938d122ff1"} Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.357215 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" event={"ID":"cdb71ef5-96d3-42b6-86d9-05be37f6961a","Type":"ContainerStarted","Data":"39b276c5ce8a0f24ca30107899b06cf3b18fcca11caddbb4431d0b8859d9630b"} Mar 13 16:52:00 crc kubenswrapper[4786]: I0313 16:52:00.546367 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:01 crc kubenswrapper[4786]: I0313 16:52:01.119602 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557012-j6lqx"] Mar 13 16:52:01 crc kubenswrapper[4786]: I0313 16:52:01.367510 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" event={"ID":"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd","Type":"ContainerStarted","Data":"5279764e39df9036ab3def5b8eb72e2b9262f6d6cff7d6593312a46d94a913c6"} Mar 13 16:52:03 crc kubenswrapper[4786]: I0313 16:52:03.416789 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" event={"ID":"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd","Type":"ContainerStarted","Data":"84b74aa0ae04474301efb2fe7fdce8900d9b2cb70e891b950ee3d372511b996f"} Mar 13 16:52:03 crc kubenswrapper[4786]: I0313 16:52:03.439235 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" podStartSLOduration=2.548289233 podStartE2EDuration="3.439219977s" podCreationTimestamp="2026-03-13 16:52:00 +0000 UTC" firstStartedPulling="2026-03-13 16:52:01.149065549 +0000 UTC m=+6551.312277360" lastFinishedPulling="2026-03-13 16:52:02.039996293 +0000 UTC m=+6552.203208104" observedRunningTime="2026-03-13 16:52:03.429824491 +0000 UTC m=+6553.593036302" watchObservedRunningTime="2026-03-13 16:52:03.439219977 +0000 UTC m=+6553.602431788" Mar 13 16:52:05 crc kubenswrapper[4786]: I0313 16:52:05.440637 4786 generic.go:334] "Generic (PLEG): container finished" podID="5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd" containerID="84b74aa0ae04474301efb2fe7fdce8900d9b2cb70e891b950ee3d372511b996f" exitCode=0 Mar 13 16:52:05 crc kubenswrapper[4786]: I0313 16:52:05.440744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" 
event={"ID":"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd","Type":"ContainerDied","Data":"84b74aa0ae04474301efb2fe7fdce8900d9b2cb70e891b950ee3d372511b996f"} Mar 13 16:52:07 crc kubenswrapper[4786]: I0313 16:52:07.869225 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:52:07 crc kubenswrapper[4786]: I0313 16:52:07.869812 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.160949 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.324749 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5ql\" (UniqueName: \"kubernetes.io/projected/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd-kube-api-access-bs5ql\") pod \"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd\" (UID: \"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd\") " Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.346109 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd-kube-api-access-bs5ql" (OuterVolumeSpecName: "kube-api-access-bs5ql") pod "5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd" (UID: "5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd"). InnerVolumeSpecName "kube-api-access-bs5ql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.427293 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5ql\" (UniqueName: \"kubernetes.io/projected/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd-kube-api-access-bs5ql\") on node \"crc\" DevicePath \"\"" Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.498741 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" event={"ID":"5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd","Type":"ContainerDied","Data":"5279764e39df9036ab3def5b8eb72e2b9262f6d6cff7d6593312a46d94a913c6"} Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.498778 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5279764e39df9036ab3def5b8eb72e2b9262f6d6cff7d6593312a46d94a913c6" Mar 13 16:52:10 crc kubenswrapper[4786]: I0313 16:52:10.498806 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557012-j6lqx" Mar 13 16:52:11 crc kubenswrapper[4786]: I0313 16:52:11.260374 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557006-4q2sc"] Mar 13 16:52:11 crc kubenswrapper[4786]: I0313 16:52:11.271318 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557006-4q2sc"] Mar 13 16:52:12 crc kubenswrapper[4786]: I0313 16:52:12.565586 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40aa8c47-ea06-43fe-ab42-ed8406764e84" path="/var/lib/kubelet/pods/40aa8c47-ea06-43fe-ab42-ed8406764e84/volumes" Mar 13 16:52:13 crc kubenswrapper[4786]: E0313 16:52:13.861042 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 13 16:52:13 crc kubenswrapper[4786]: E0313 16:52:13.861562 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-6d866698b6-c7skp_openshift-operators(23fe82eb-830c-4fc5-b855-421e358620d5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 16:52:13 crc kubenswrapper[4786]: E0313 16:52:13.861756 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a" Mar 13 16:52:13 crc kubenswrapper[4786]: E0313 16:52:13.861881 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true --disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fwwb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-68bc856cb9-2rkbn_openshift-operators(d2886207-16dc-47f5-bc4d-dba0a8a55ed1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 16:52:13 crc kubenswrapper[4786]: E0313 16:52:13.863058 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" podUID="23fe82eb-830c-4fc5-b855-421e358620d5" Mar 13 16:52:13 crc kubenswrapper[4786]: E0313 16:52:13.863090 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" 
podUID="d2886207-16dc-47f5-bc4d-dba0a8a55ed1" Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.544137 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-n4977" event={"ID":"e11733a0-03f3-4862-b5f8-92b7cc51ae99","Type":"ContainerStarted","Data":"d157f63b9bc09a8a94fd00406adcbb59675fde0d3711d4602725bcb13091cc0a"} Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.544474 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.550194 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" event={"ID":"bdcbef08-46bc-46ef-8342-d8128a2d4de1","Type":"ContainerStarted","Data":"3105ef745e52f29a6e6018ff67216d5410fc20da2baadab6e9b235ee956515d7"} Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.550723 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:52:14 crc kubenswrapper[4786]: E0313 16:52:14.581898 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a\\\"\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" podUID="d2886207-16dc-47f5-bc4d-dba0a8a55ed1" Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.590292 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-n4977" podStartSLOduration=2.961960999 podStartE2EDuration="16.590270853s" podCreationTimestamp="2026-03-13 16:51:58 +0000 UTC" firstStartedPulling="2026-03-13 16:52:00.290758116 +0000 UTC 
m=+6550.453969927" lastFinishedPulling="2026-03-13 16:52:13.91906797 +0000 UTC m=+6564.082279781" observedRunningTime="2026-03-13 16:52:14.587514354 +0000 UTC m=+6564.750726165" watchObservedRunningTime="2026-03-13 16:52:14.590270853 +0000 UTC m=+6564.753482664" Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.595959 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" event={"ID":"cdb71ef5-96d3-42b6-86d9-05be37f6961a","Type":"ContainerStarted","Data":"cc013cd286ff3643674b61754ebe6a72b65d2314086b26c6f414594c2f412670"} Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.596034 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-n4977" Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.662575 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" podStartSLOduration=2.127012096 podStartE2EDuration="15.662551549s" podCreationTimestamp="2026-03-13 16:51:59 +0000 UTC" firstStartedPulling="2026-03-13 16:52:00.32157615 +0000 UTC m=+6550.484787951" lastFinishedPulling="2026-03-13 16:52:13.857115593 +0000 UTC m=+6564.020327404" observedRunningTime="2026-03-13 16:52:14.656733913 +0000 UTC m=+6564.819945724" watchObservedRunningTime="2026-03-13 16:52:14.662551549 +0000 UTC m=+6564.825763360" Mar 13 16:52:14 crc kubenswrapper[4786]: I0313 16:52:14.687921 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd" podStartSLOduration=3.144429404 podStartE2EDuration="16.687900636s" podCreationTimestamp="2026-03-13 16:51:58 +0000 UTC" firstStartedPulling="2026-03-13 16:52:00.339263085 +0000 UTC m=+6550.502474896" lastFinishedPulling="2026-03-13 16:52:13.882734317 +0000 UTC m=+6564.045946128" observedRunningTime="2026-03-13 
16:52:14.671462023 +0000 UTC m=+6564.834673834" watchObservedRunningTime="2026-03-13 16:52:14.687900636 +0000 UTC m=+6564.851112467" Mar 13 16:52:15 crc kubenswrapper[4786]: I0313 16:52:15.586319 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" event={"ID":"23fe82eb-830c-4fc5-b855-421e358620d5","Type":"ContainerStarted","Data":"2d1e318e4965ed3316c71a25ffdd1d50b41b90958e2bf7fdc97ab6681507ca84"} Mar 13 16:52:15 crc kubenswrapper[4786]: I0313 16:52:15.609705 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d866698b6-c7skp" podStartSLOduration=-9223372019.245087 podStartE2EDuration="17.609688484s" podCreationTimestamp="2026-03-13 16:51:58 +0000 UTC" firstStartedPulling="2026-03-13 16:52:00.291067014 +0000 UTC m=+6550.454278825" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:52:15.604376501 +0000 UTC m=+6565.767588332" watchObservedRunningTime="2026-03-13 16:52:15.609688484 +0000 UTC m=+6565.772900295" Mar 13 16:52:17 crc kubenswrapper[4786]: I0313 16:52:17.147025 4786 scope.go:117] "RemoveContainer" containerID="3b1c82d2be967ff6faec2c05e8a6292934791fc3115b2186a5905385625879eb" Mar 13 16:52:17 crc kubenswrapper[4786]: I0313 16:52:17.200794 4786 scope.go:117] "RemoveContainer" containerID="f3bfcf3d00095b6b6b237110f166707860784a0b3f833d97f60db20aedea3a4e" Mar 13 16:52:17 crc kubenswrapper[4786]: I0313 16:52:17.265175 4786 scope.go:117] "RemoveContainer" containerID="f79b97c3ccaefcdb9b4d9b14d4b9f3c87a7a7ec9dbe88fcd0bbed4d1fd25eed6" Mar 13 16:52:19 crc kubenswrapper[4786]: I0313 16:52:19.514188 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-jx4bl" Mar 13 16:52:27 crc kubenswrapper[4786]: I0313 16:52:27.737312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" event={"ID":"d2886207-16dc-47f5-bc4d-dba0a8a55ed1","Type":"ContainerStarted","Data":"ff85a9be535bee55a5fe82e0b032d172cb2c8aa60b92eb856448b74c7ec5d7e2"} Mar 13 16:52:27 crc kubenswrapper[4786]: I0313 16:52:27.765805 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rkbn" podStartSLOduration=3.308141187 podStartE2EDuration="29.765785031s" podCreationTimestamp="2026-03-13 16:51:58 +0000 UTC" firstStartedPulling="2026-03-13 16:52:00.19617914 +0000 UTC m=+6550.359390941" lastFinishedPulling="2026-03-13 16:52:26.653822974 +0000 UTC m=+6576.817034785" observedRunningTime="2026-03-13 16:52:27.756403155 +0000 UTC m=+6577.919614976" watchObservedRunningTime="2026-03-13 16:52:27.765785031 +0000 UTC m=+6577.928996852" Mar 13 16:52:29 crc kubenswrapper[4786]: I0313 16:52:29.042314 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qsk42"] Mar 13 16:52:29 crc kubenswrapper[4786]: I0313 16:52:29.056614 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qsk42"] Mar 13 16:52:30 crc kubenswrapper[4786]: I0313 16:52:30.026483 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9589-account-create-update-tfxvl"] Mar 13 16:52:30 crc kubenswrapper[4786]: I0313 16:52:30.040359 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9589-account-create-update-tfxvl"] Mar 13 16:52:30 crc kubenswrapper[4786]: I0313 16:52:30.564779 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0856c17b-ac5d-4f95-b0d1-d0b725c11454" path="/var/lib/kubelet/pods/0856c17b-ac5d-4f95-b0d1-d0b725c11454/volumes" Mar 13 16:52:30 crc kubenswrapper[4786]: I0313 16:52:30.565381 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a98a232-c926-4aba-85c2-a628b9c6d16d" 
path="/var/lib/kubelet/pods/8a98a232-c926-4aba-85c2-a628b9c6d16d/volumes" Mar 13 16:52:33 crc kubenswrapper[4786]: I0313 16:52:33.978582 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:33 crc kubenswrapper[4786]: I0313 16:52:33.979465 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" containerName="openstackclient" containerID="cri-o://73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560" gracePeriod=2 Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.026520 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.050622 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: E0313 16:52:34.052236 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" containerName="openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.052284 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" containerName="openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: E0313 16:52:34.052319 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd" containerName="oc" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.052328 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd" containerName="oc" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.052606 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd" containerName="oc" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.052644 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" containerName="openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.053546 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.061567 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.065467 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9ffbb2a9-64ca-4f9f-be80-d7424553cb69" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.082953 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: E0313 16:52:34.083714 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-f2psl openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-f2psl openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="9ffbb2a9-64ca-4f9f-be80-d7424553cb69" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.091473 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.099973 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.101598 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.108441 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.128430 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.198481 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.199793 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.203302 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n2lfr" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.224268 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.247932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.247991 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.248091 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.248147 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwt2\" (UniqueName: \"kubernetes.io/projected/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-kube-api-access-dfwt2\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.350265 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.350355 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fklns\" (UniqueName: \"kubernetes.io/projected/d1dfd277-77e2-41fb-ab90-1548be6194d9-kube-api-access-fklns\") pod \"kube-state-metrics-0\" (UID: \"d1dfd277-77e2-41fb-ab90-1548be6194d9\") " pod="openstack/kube-state-metrics-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.350392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwt2\" (UniqueName: \"kubernetes.io/projected/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-kube-api-access-dfwt2\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.350434 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.350466 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.351416 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.356067 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.379744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.386071 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwt2\" (UniqueName: \"kubernetes.io/projected/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-kube-api-access-dfwt2\") pod \"openstackclient\" 
(UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.419770 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.452542 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fklns\" (UniqueName: \"kubernetes.io/projected/d1dfd277-77e2-41fb-ab90-1548be6194d9-kube-api-access-fklns\") pod \"kube-state-metrics-0\" (UID: \"d1dfd277-77e2-41fb-ab90-1548be6194d9\") " pod="openstack/kube-state-metrics-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.473248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fklns\" (UniqueName: \"kubernetes.io/projected/d1dfd277-77e2-41fb-ab90-1548be6194d9-kube-api-access-fklns\") pod \"kube-state-metrics-0\" (UID: \"d1dfd277-77e2-41fb-ab90-1548be6194d9\") " pod="openstack/kube-state-metrics-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.517282 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.571723 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffbb2a9-64ca-4f9f-be80-d7424553cb69" path="/var/lib/kubelet/pods/9ffbb2a9-64ca-4f9f-be80-d7424553cb69/volumes" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.821601 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.824474 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9ffbb2a9-64ca-4f9f-be80-d7424553cb69" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.834756 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.838270 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9ffbb2a9-64ca-4f9f-be80-d7424553cb69" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.872095 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.874232 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.877616 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.877748 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.878019 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.878185 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.878543 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-j8bjt" Mar 13 16:52:34 crc kubenswrapper[4786]: I0313 16:52:34.888503 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.061569 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.069689 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.073651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98f1c455-5023-4f1f-989c-4c63b48a696b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.073686 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.073718 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98f1c455-5023-4f1f-989c-4c63b48a696b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.074349 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npbrz\" (UniqueName: \"kubernetes.io/projected/98f1c455-5023-4f1f-989c-4c63b48a696b-kube-api-access-npbrz\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.074428 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/98f1c455-5023-4f1f-989c-4c63b48a696b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.074476 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 
16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.074701 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.176677 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/98f1c455-5023-4f1f-989c-4c63b48a696b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177110 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177148 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98f1c455-5023-4f1f-989c-4c63b48a696b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 
16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177168 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177191 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98f1c455-5023-4f1f-989c-4c63b48a696b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177221 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npbrz\" (UniqueName: \"kubernetes.io/projected/98f1c455-5023-4f1f-989c-4c63b48a696b-kube-api-access-npbrz\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.177886 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/98f1c455-5023-4f1f-989c-4c63b48a696b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.186489 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/98f1c455-5023-4f1f-989c-4c63b48a696b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 
16:52:35.186825 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.187105 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/98f1c455-5023-4f1f-989c-4c63b48a696b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.187290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.191694 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/98f1c455-5023-4f1f-989c-4c63b48a696b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.199261 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npbrz\" (UniqueName: \"kubernetes.io/projected/98f1c455-5023-4f1f-989c-4c63b48a696b-kube-api-access-npbrz\") pod \"alertmanager-metric-storage-0\" (UID: \"98f1c455-5023-4f1f-989c-4c63b48a696b\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.459781 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.462392 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466347 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466380 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466397 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466405 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466520 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9dzwq" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466709 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.466812 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.479039 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.482589 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.496096 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584203 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584532 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd3541d1-dd5d-4779-ad92-25491f443513-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584559 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584600 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rncqr\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-kube-api-access-rncqr\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584704 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584725 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.584754 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-config\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc 
kubenswrapper[4786]: I0313 16:52:35.584787 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687185 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rncqr\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-kube-api-access-rncqr\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687295 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687335 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687421 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687458 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-config\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687496 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687620 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687672 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd3541d1-dd5d-4779-ad92-25491f443513-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.687720 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.688591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.689147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.691436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.694421 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.702768 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.703638 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd3541d1-dd5d-4779-ad92-25491f443513-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.704282 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.704358 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/22de98f187aafd7c57286e5c682d3283fcc085a3cc830afd132a53eec618e4ac/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.704502 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rncqr\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-kube-api-access-rncqr\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.705214 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-config\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.708537 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.765756 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.786208 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.838305 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4968f9bc-8e95-4fb8-bcbc-acc27742a76a","Type":"ContainerStarted","Data":"2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b"} Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.838338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4968f9bc-8e95-4fb8-bcbc-acc27742a76a","Type":"ContainerStarted","Data":"0e9825bb34c4f083f5ea8eab287f400172a3e2271c9adaa7d081577915c7a0f4"} Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.851989 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.852273 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1dfd277-77e2-41fb-ab90-1548be6194d9","Type":"ContainerStarted","Data":"973a6ab88f7b1d302acda236ba0069d5845763aa3c520b0320a4ac64ac667f35"} Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.877323 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9ffbb2a9-64ca-4f9f-be80-d7424553cb69" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" Mar 13 16:52:35 crc kubenswrapper[4786]: I0313 16:52:35.893383 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.893357945 podStartE2EDuration="1.893357945s" podCreationTimestamp="2026-03-13 16:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:52:35.866657044 +0000 UTC m=+6586.029868855" watchObservedRunningTime="2026-03-13 16:52:35.893357945 +0000 UTC m=+6586.056569756" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.017821 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.292167 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:52:36 crc kubenswrapper[4786]: W0313 16:52:36.302698 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3541d1_dd5d_4779_ad92_25491f443513.slice/crio-399a9d567dd2f4e55befb7133ca1409faaf8fea5474e1315268d03714ec92f56 WatchSource:0}: Error finding container 399a9d567dd2f4e55befb7133ca1409faaf8fea5474e1315268d03714ec92f56: Status 404 returned error can't find the container with 
id 399a9d567dd2f4e55befb7133ca1409faaf8fea5474e1315268d03714ec92f56 Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.471739 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.616152 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-combined-ca-bundle\") pod \"4771f9ed-a556-43de-b0b2-5f91c6b02768\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.616196 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-596mc\" (UniqueName: \"kubernetes.io/projected/4771f9ed-a556-43de-b0b2-5f91c6b02768-kube-api-access-596mc\") pod \"4771f9ed-a556-43de-b0b2-5f91c6b02768\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.616270 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config-secret\") pod \"4771f9ed-a556-43de-b0b2-5f91c6b02768\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.617005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config\") pod \"4771f9ed-a556-43de-b0b2-5f91c6b02768\" (UID: \"4771f9ed-a556-43de-b0b2-5f91c6b02768\") " Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.621643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4771f9ed-a556-43de-b0b2-5f91c6b02768-kube-api-access-596mc" (OuterVolumeSpecName: "kube-api-access-596mc") pod 
"4771f9ed-a556-43de-b0b2-5f91c6b02768" (UID: "4771f9ed-a556-43de-b0b2-5f91c6b02768"). InnerVolumeSpecName "kube-api-access-596mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.645403 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4771f9ed-a556-43de-b0b2-5f91c6b02768" (UID: "4771f9ed-a556-43de-b0b2-5f91c6b02768"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.655406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4771f9ed-a556-43de-b0b2-5f91c6b02768" (UID: "4771f9ed-a556-43de-b0b2-5f91c6b02768"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.678566 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4771f9ed-a556-43de-b0b2-5f91c6b02768" (UID: "4771f9ed-a556-43de-b0b2-5f91c6b02768"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.720371 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.720622 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-596mc\" (UniqueName: \"kubernetes.io/projected/4771f9ed-a556-43de-b0b2-5f91c6b02768-kube-api-access-596mc\") on node \"crc\" DevicePath \"\"" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.720747 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.720878 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4771f9ed-a556-43de-b0b2-5f91c6b02768-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.860291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerStarted","Data":"399a9d567dd2f4e55befb7133ca1409faaf8fea5474e1315268d03714ec92f56"} Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.861846 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1dfd277-77e2-41fb-ab90-1548be6194d9","Type":"ContainerStarted","Data":"4174001cd6897bd2f49364208aadfa47aab9074ae6e54e9e8c0954335a03a299"} Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.862014 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 
16:52:36.863588 4786 generic.go:334] "Generic (PLEG): container finished" podID="4771f9ed-a556-43de-b0b2-5f91c6b02768" containerID="73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560" exitCode=137 Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.863646 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.863660 4786 scope.go:117] "RemoveContainer" containerID="73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.864877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"98f1c455-5023-4f1f-989c-4c63b48a696b","Type":"ContainerStarted","Data":"7af4d8c985db112c34d3ad5bd7f13cf7eeede35fc6f40118e2d70be84183b880"} Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.885185 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.426660734 podStartE2EDuration="2.885166583s" podCreationTimestamp="2026-03-13 16:52:34 +0000 UTC" firstStartedPulling="2026-03-13 16:52:35.086951825 +0000 UTC m=+6585.250163636" lastFinishedPulling="2026-03-13 16:52:35.545457674 +0000 UTC m=+6585.708669485" observedRunningTime="2026-03-13 16:52:36.87590418 +0000 UTC m=+6587.039116011" watchObservedRunningTime="2026-03-13 16:52:36.885166583 +0000 UTC m=+6587.048378394" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.891006 4786 scope.go:117] "RemoveContainer" containerID="73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560" Mar 13 16:52:36 crc kubenswrapper[4786]: E0313 16:52:36.894809 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560\": container with ID starting with 
73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560 not found: ID does not exist" containerID="73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560" Mar 13 16:52:36 crc kubenswrapper[4786]: I0313 16:52:36.894905 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560"} err="failed to get container status \"73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560\": rpc error: code = NotFound desc = could not find container \"73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560\": container with ID starting with 73cb9a3ad7360228efa9c00fd84987b911eed685ca1b959edf2eb39ef5d56560 not found: ID does not exist" Mar 13 16:52:37 crc kubenswrapper[4786]: I0313 16:52:37.868873 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:52:37 crc kubenswrapper[4786]: I0313 16:52:37.869128 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:52:37 crc kubenswrapper[4786]: I0313 16:52:37.869160 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:52:37 crc kubenswrapper[4786]: I0313 16:52:37.869822 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b49392c99cf5e104c67fff6a8b879c097bd7fea9986e8b283b323621dcd6d857"} 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:52:37 crc kubenswrapper[4786]: I0313 16:52:37.869875 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://b49392c99cf5e104c67fff6a8b879c097bd7fea9986e8b283b323621dcd6d857" gracePeriod=600 Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.037924 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4c84r"] Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.048488 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-4c84r"] Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.573517 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4771f9ed-a556-43de-b0b2-5f91c6b02768" path="/var/lib/kubelet/pods/4771f9ed-a556-43de-b0b2-5f91c6b02768/volumes" Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.575705 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa484794-3f56-40f8-b139-b1c8ed536c2c" path="/var/lib/kubelet/pods/aa484794-3f56-40f8-b139-b1c8ed536c2c/volumes" Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.895321 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="b49392c99cf5e104c67fff6a8b879c097bd7fea9986e8b283b323621dcd6d857" exitCode=0 Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.895380 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"b49392c99cf5e104c67fff6a8b879c097bd7fea9986e8b283b323621dcd6d857"} Mar 13 16:52:38 crc kubenswrapper[4786]: 
I0313 16:52:38.895426 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"} Mar 13 16:52:38 crc kubenswrapper[4786]: I0313 16:52:38.895453 4786 scope.go:117] "RemoveContainer" containerID="8cf71e4e3060515662693fd067a9e53e8ce31ccba415f84674a8ec6c27b7ab5d" Mar 13 16:52:42 crc kubenswrapper[4786]: I0313 16:52:42.945105 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerStarted","Data":"ae3b2a9aa4aaf219f07db829d43b1c5f6bf08268a7a8fba507524d067da0a8d8"} Mar 13 16:52:42 crc kubenswrapper[4786]: I0313 16:52:42.948522 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"98f1c455-5023-4f1f-989c-4c63b48a696b","Type":"ContainerStarted","Data":"eed14bfe3ca7046eceb358f4edaf9b520b1650d42b441e4624a459b8d05960f7"} Mar 13 16:52:44 crc kubenswrapper[4786]: I0313 16:52:44.550985 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 16:52:51 crc kubenswrapper[4786]: I0313 16:52:51.051375 4786 generic.go:334] "Generic (PLEG): container finished" podID="dd3541d1-dd5d-4779-ad92-25491f443513" containerID="ae3b2a9aa4aaf219f07db829d43b1c5f6bf08268a7a8fba507524d067da0a8d8" exitCode=0 Mar 13 16:52:51 crc kubenswrapper[4786]: I0313 16:52:51.051495 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerDied","Data":"ae3b2a9aa4aaf219f07db829d43b1c5f6bf08268a7a8fba507524d067da0a8d8"} Mar 13 16:52:52 crc kubenswrapper[4786]: I0313 16:52:52.062919 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="98f1c455-5023-4f1f-989c-4c63b48a696b" containerID="eed14bfe3ca7046eceb358f4edaf9b520b1650d42b441e4624a459b8d05960f7" exitCode=0 Mar 13 16:52:52 crc kubenswrapper[4786]: I0313 16:52:52.063062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"98f1c455-5023-4f1f-989c-4c63b48a696b","Type":"ContainerDied","Data":"eed14bfe3ca7046eceb358f4edaf9b520b1650d42b441e4624a459b8d05960f7"} Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.144469 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"98f1c455-5023-4f1f-989c-4c63b48a696b","Type":"ContainerStarted","Data":"28eb9d5de057221b5dbe60f282dea224925a8b0dd67ad18b5c49b3a3f925d7dd"} Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.147396 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerStarted","Data":"64cc75100b6a49620a14a655bda22a7648214b394bb56bf487dbcffc42789783"} Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.707850 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn8kp"] Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.717515 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.722944 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn8kp"] Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.795231 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-catalog-content\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.795440 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-utilities\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.795506 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r24x\" (UniqueName: \"kubernetes.io/projected/c438e563-3338-4dd3-bfda-0ea9ec97214b-kube-api-access-7r24x\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.896757 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-utilities\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.896826 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7r24x\" (UniqueName: \"kubernetes.io/projected/c438e563-3338-4dd3-bfda-0ea9ec97214b-kube-api-access-7r24x\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.896877 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-catalog-content\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.897262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-utilities\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.897337 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-catalog-content\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:52:59 crc kubenswrapper[4786]: I0313 16:52:59.919987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r24x\" (UniqueName: \"kubernetes.io/projected/c438e563-3338-4dd3-bfda-0ea9ec97214b-kube-api-access-7r24x\") pod \"certified-operators-gn8kp\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:00 crc kubenswrapper[4786]: I0313 16:53:00.045700 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:00 crc kubenswrapper[4786]: I0313 16:53:00.549750 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn8kp"] Mar 13 16:53:01 crc kubenswrapper[4786]: I0313 16:53:01.181546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerStarted","Data":"aa2f9318bbabdeab243223b7e0bab737da7b09e19b02d494cfd3e2f6a7a1905a"} Mar 13 16:53:02 crc kubenswrapper[4786]: I0313 16:53:02.193780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"98f1c455-5023-4f1f-989c-4c63b48a696b","Type":"ContainerStarted","Data":"a7fad91d7c6b7632c26f5a75158b3fa046950a0f87bed350b79dabb60968f29c"} Mar 13 16:53:02 crc kubenswrapper[4786]: I0313 16:53:02.194037 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 13 16:53:02 crc kubenswrapper[4786]: I0313 16:53:02.196785 4786 generic.go:334] "Generic (PLEG): container finished" podID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerID="82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98" exitCode=0 Mar 13 16:53:02 crc kubenswrapper[4786]: I0313 16:53:02.196833 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerDied","Data":"82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98"} Mar 13 16:53:02 crc kubenswrapper[4786]: I0313 16:53:02.198440 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 13 16:53:02 crc kubenswrapper[4786]: I0313 16:53:02.218970 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.361199702 podStartE2EDuration="28.218941201s" podCreationTimestamp="2026-03-13 16:52:34 +0000 UTC" firstStartedPulling="2026-03-13 16:52:36.08741526 +0000 UTC m=+6586.250627071" lastFinishedPulling="2026-03-13 16:52:57.945156749 +0000 UTC m=+6608.108368570" observedRunningTime="2026-03-13 16:53:02.215886255 +0000 UTC m=+6612.379098096" watchObservedRunningTime="2026-03-13 16:53:02.218941201 +0000 UTC m=+6612.382153022" Mar 13 16:53:03 crc kubenswrapper[4786]: I0313 16:53:03.207621 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerStarted","Data":"bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678"} Mar 13 16:53:03 crc kubenswrapper[4786]: I0313 16:53:03.212339 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerStarted","Data":"24c124b8e43caaad61676e66327baa78f3796309d19e0c81362863d689946b8d"} Mar 13 16:53:06 crc kubenswrapper[4786]: I0313 16:53:06.243456 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerStarted","Data":"89a3101c406a5ea15d8f7f976f2b6e7bc1cabe84a63c4500ffc1f953c9965062"} Mar 13 16:53:06 crc kubenswrapper[4786]: I0313 16:53:06.245504 4786 generic.go:334] "Generic (PLEG): container finished" podID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerID="bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678" exitCode=0 Mar 13 16:53:06 crc kubenswrapper[4786]: I0313 16:53:06.245532 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" 
event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerDied","Data":"bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678"} Mar 13 16:53:06 crc kubenswrapper[4786]: I0313 16:53:06.276169 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.224311203 podStartE2EDuration="32.276154494s" podCreationTimestamp="2026-03-13 16:52:34 +0000 UTC" firstStartedPulling="2026-03-13 16:52:36.305206672 +0000 UTC m=+6586.468418483" lastFinishedPulling="2026-03-13 16:53:05.357049963 +0000 UTC m=+6615.520261774" observedRunningTime="2026-03-13 16:53:06.27122311 +0000 UTC m=+6616.434434921" watchObservedRunningTime="2026-03-13 16:53:06.276154494 +0000 UTC m=+6616.439366305" Mar 13 16:53:07 crc kubenswrapper[4786]: I0313 16:53:07.258925 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerStarted","Data":"cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d"} Mar 13 16:53:07 crc kubenswrapper[4786]: I0313 16:53:07.284754 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn8kp" podStartSLOduration=3.8067897 podStartE2EDuration="8.284728593s" podCreationTimestamp="2026-03-13 16:52:59 +0000 UTC" firstStartedPulling="2026-03-13 16:53:02.198368634 +0000 UTC m=+6612.361580445" lastFinishedPulling="2026-03-13 16:53:06.676307487 +0000 UTC m=+6616.839519338" observedRunningTime="2026-03-13 16:53:07.283503903 +0000 UTC m=+6617.446715724" watchObservedRunningTime="2026-03-13 16:53:07.284728593 +0000 UTC m=+6617.447940424" Mar 13 16:53:08 crc kubenswrapper[4786]: I0313 16:53:08.065266 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7e2d-account-create-update-sxwss"] Mar 13 16:53:08 crc kubenswrapper[4786]: I0313 16:53:08.082284 4786 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2chdt"] Mar 13 16:53:08 crc kubenswrapper[4786]: I0313 16:53:08.097890 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2chdt"] Mar 13 16:53:08 crc kubenswrapper[4786]: I0313 16:53:08.111904 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7e2d-account-create-update-sxwss"] Mar 13 16:53:08 crc kubenswrapper[4786]: I0313 16:53:08.568681 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ac72a8-e2ed-4402-b828-e930f5a90177" path="/var/lib/kubelet/pods/32ac72a8-e2ed-4402-b828-e930f5a90177/volumes" Mar 13 16:53:08 crc kubenswrapper[4786]: I0313 16:53:08.570104 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8222e878-a64d-4afb-910c-7772b7226a4a" path="/var/lib/kubelet/pods/8222e878-a64d-4afb-910c-7772b7226a4a/volumes" Mar 13 16:53:09 crc kubenswrapper[4786]: I0313 16:53:09.987014 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:09 crc kubenswrapper[4786]: I0313 16:53:09.990183 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:09 crc kubenswrapper[4786]: I0313 16:53:09.993877 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 16:53:09 crc kubenswrapper[4786]: I0313 16:53:09.994094 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.001909 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.032217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-log-httpd\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.032364 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-scripts\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.032409 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.032844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-config-data\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " 
pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.032954 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-run-httpd\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.033003 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srtbr\" (UniqueName: \"kubernetes.io/projected/11342837-8700-476a-9e91-f22aa073e82e-kube-api-access-srtbr\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.033112 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.046043 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.046088 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.135698 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.135775 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-log-httpd\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.135840 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-scripts\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.135891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.136055 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-config-data\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.136109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-run-httpd\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.136145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srtbr\" (UniqueName: \"kubernetes.io/projected/11342837-8700-476a-9e91-f22aa073e82e-kube-api-access-srtbr\") pod \"ceilometer-0\" (UID: 
\"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.136723 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-run-httpd\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.136994 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-log-httpd\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.143130 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-config-data\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.143837 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.147563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.155426 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-scripts\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.157108 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srtbr\" (UniqueName: \"kubernetes.io/projected/11342837-8700-476a-9e91-f22aa073e82e-kube-api-access-srtbr\") pod \"ceilometer-0\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.383193 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.786774 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:10 crc kubenswrapper[4786]: I0313 16:53:10.906763 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:11 crc kubenswrapper[4786]: I0313 16:53:11.102185 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gn8kp" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="registry-server" probeResult="failure" output=< Mar 13 16:53:11 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:53:11 crc kubenswrapper[4786]: > Mar 13 16:53:11 crc kubenswrapper[4786]: I0313 16:53:11.306737 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerStarted","Data":"46c8b43719bc9f4d0299ca999d2475e70815b325aeee12355dcb14d4b1d0afd2"} Mar 13 16:53:12 crc kubenswrapper[4786]: I0313 16:53:12.317480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerStarted","Data":"bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad"} Mar 13 16:53:13 crc kubenswrapper[4786]: I0313 16:53:13.031946 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gv2vc"] Mar 13 16:53:13 crc kubenswrapper[4786]: I0313 16:53:13.044192 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gv2vc"] Mar 13 16:53:13 crc kubenswrapper[4786]: I0313 16:53:13.331950 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerStarted","Data":"14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132"} Mar 13 16:53:14 crc kubenswrapper[4786]: I0313 16:53:14.345317 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerStarted","Data":"ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a"} Mar 13 16:53:14 crc kubenswrapper[4786]: I0313 16:53:14.563848 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42796ed0-ed45-4786-b1e9-544ac1a13d7d" path="/var/lib/kubelet/pods/42796ed0-ed45-4786-b1e9-544ac1a13d7d/volumes" Mar 13 16:53:16 crc kubenswrapper[4786]: I0313 16:53:16.368097 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerStarted","Data":"5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa"} Mar 13 16:53:16 crc kubenswrapper[4786]: I0313 16:53:16.368743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 16:53:16 crc kubenswrapper[4786]: I0313 16:53:16.393477 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.366518962 
podStartE2EDuration="7.393457928s" podCreationTimestamp="2026-03-13 16:53:09 +0000 UTC" firstStartedPulling="2026-03-13 16:53:10.921756909 +0000 UTC m=+6621.084968720" lastFinishedPulling="2026-03-13 16:53:15.948695865 +0000 UTC m=+6626.111907686" observedRunningTime="2026-03-13 16:53:16.386712989 +0000 UTC m=+6626.549924800" watchObservedRunningTime="2026-03-13 16:53:16.393457928 +0000 UTC m=+6626.556669739" Mar 13 16:53:16 crc kubenswrapper[4786]: I0313 16:53:16.942690 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rhmrw"] Mar 13 16:53:16 crc kubenswrapper[4786]: I0313 16:53:16.945354 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:16 crc kubenswrapper[4786]: I0313 16:53:16.973400 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhmrw"] Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.078663 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-catalog-content\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.078776 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-utilities\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.079091 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg6w\" (UniqueName: 
\"kubernetes.io/projected/8be8e7b4-f7d5-42e8-b95c-520088563e29-kube-api-access-cwg6w\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.181255 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-catalog-content\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.181461 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-utilities\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.181599 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg6w\" (UniqueName: \"kubernetes.io/projected/8be8e7b4-f7d5-42e8-b95c-520088563e29-kube-api-access-cwg6w\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.182065 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-utilities\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.182127 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-catalog-content\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.205842 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwg6w\" (UniqueName: \"kubernetes.io/projected/8be8e7b4-f7d5-42e8-b95c-520088563e29-kube-api-access-cwg6w\") pod \"redhat-marketplace-rhmrw\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") " pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.292561 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.364800 4786 scope.go:117] "RemoveContainer" containerID="7ebe02a3bf38c5b519df664d7024ae116da02c953617fef97b7557774ea9aa42" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.451276 4786 scope.go:117] "RemoveContainer" containerID="b7d902e76f51b7964af1cd42a40023e68e91f010a7413dade62022cc54f82a1c" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.510549 4786 scope.go:117] "RemoveContainer" containerID="f970d382746d79cc405a4fe8036dfc3ac12f1d9ffb9e5f79325ac5ed4256ccd3" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.618442 4786 scope.go:117] "RemoveContainer" containerID="022abf23a4cbcb59bf29df003e8cfe42028f0066832674e4ac8300b89c2c2d4a" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.676393 4786 scope.go:117] "RemoveContainer" containerID="8089a9933b42cfbda9422dc906b9d45c0b5b96ae1f07194d156593394e18489e" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.715055 4786 scope.go:117] "RemoveContainer" containerID="3f0355fa5a3e553fa139e964751be8306a12464b6081f5e860114efbb812d6cf" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.737849 4786 scope.go:117] "RemoveContainer" 
containerID="9a787eba2620d73508e6833afa61d022be18ec4a4216112f7254e8381d174ead" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.766066 4786 scope.go:117] "RemoveContainer" containerID="7adf84386f2c7e8fc5d33877be0b4bcc40c815b4c4847700145610e4b83a278f" Mar 13 16:53:17 crc kubenswrapper[4786]: I0313 16:53:17.854481 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhmrw"] Mar 13 16:53:18 crc kubenswrapper[4786]: I0313 16:53:18.391743 4786 generic.go:334] "Generic (PLEG): container finished" podID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerID="89a7f1c60a027ac982ad540e6ffeed10c36253307923c59d022cddd7d85a5300" exitCode=0 Mar 13 16:53:18 crc kubenswrapper[4786]: I0313 16:53:18.391788 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerDied","Data":"89a7f1c60a027ac982ad540e6ffeed10c36253307923c59d022cddd7d85a5300"} Mar 13 16:53:18 crc kubenswrapper[4786]: I0313 16:53:18.391811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerStarted","Data":"1a741188e06f504cc009476860e3051ab0f0948fe402608afb6fcc2671f477db"} Mar 13 16:53:19 crc kubenswrapper[4786]: I0313 16:53:19.402373 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerStarted","Data":"29aa0c357ea86d002c8f169140e0a563d63dbb18f130a345f8e85a84afb0435e"} Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.105298 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.172465 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.480516 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-8k747"] Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.482414 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.490585 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-8k747"] Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.655687 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhj5d\" (UniqueName: \"kubernetes.io/projected/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-kube-api-access-rhj5d\") pod \"aodh-db-create-8k747\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.655861 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-operator-scripts\") pod \"aodh-db-create-8k747\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.708076 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-ff67-account-create-update-nrhwr"] Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.709464 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.716289 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.721671 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-ff67-account-create-update-nrhwr"] Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.770544 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-operator-scripts\") pod \"aodh-db-create-8k747\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.771274 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhj5d\" (UniqueName: \"kubernetes.io/projected/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-kube-api-access-rhj5d\") pod \"aodh-db-create-8k747\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.772873 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-operator-scripts\") pod \"aodh-db-create-8k747\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.787584 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.789333 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhj5d\" (UniqueName: \"kubernetes.io/projected/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-kube-api-access-rhj5d\") pod 
\"aodh-db-create-8k747\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.790999 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.811587 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-8k747" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.873587 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f276c87-a633-4e23-b3c8-61353255c1e0-operator-scripts\") pod \"aodh-ff67-account-create-update-nrhwr\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.874290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq6x2\" (UniqueName: \"kubernetes.io/projected/7f276c87-a633-4e23-b3c8-61353255c1e0-kube-api-access-fq6x2\") pod \"aodh-ff67-account-create-update-nrhwr\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.976491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq6x2\" (UniqueName: \"kubernetes.io/projected/7f276c87-a633-4e23-b3c8-61353255c1e0-kube-api-access-fq6x2\") pod \"aodh-ff67-account-create-update-nrhwr\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.976801 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7f276c87-a633-4e23-b3c8-61353255c1e0-operator-scripts\") pod \"aodh-ff67-account-create-update-nrhwr\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:20 crc kubenswrapper[4786]: I0313 16:53:20.978039 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f276c87-a633-4e23-b3c8-61353255c1e0-operator-scripts\") pod \"aodh-ff67-account-create-update-nrhwr\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.011297 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq6x2\" (UniqueName: \"kubernetes.io/projected/7f276c87-a633-4e23-b3c8-61353255c1e0-kube-api-access-fq6x2\") pod \"aodh-ff67-account-create-update-nrhwr\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.039752 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.431305 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-8k747"] Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.450885 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-8k747" event={"ID":"50f1da9f-ee38-45cb-bc5d-21db3eda07f4","Type":"ContainerStarted","Data":"7add3cf284cfe38048715daccecbd45e79a729b1252e286526c15e96e3d37abb"} Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.452869 4786 generic.go:334] "Generic (PLEG): container finished" podID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerID="29aa0c357ea86d002c8f169140e0a563d63dbb18f130a345f8e85a84afb0435e" exitCode=0 Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.452954 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerDied","Data":"29aa0c357ea86d002c8f169140e0a563d63dbb18f130a345f8e85a84afb0435e"} Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.454419 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:21 crc kubenswrapper[4786]: I0313 16:53:21.561573 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-ff67-account-create-update-nrhwr"] Mar 13 16:53:21 crc kubenswrapper[4786]: W0313 16:53:21.567626 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f276c87_a633_4e23_b3c8_61353255c1e0.slice/crio-a8fbcbdd968c8491d202d49dc97cfc519d01fc3828ec94820ca7a9ba1e66206d WatchSource:0}: Error finding container a8fbcbdd968c8491d202d49dc97cfc519d01fc3828ec94820ca7a9ba1e66206d: Status 404 returned error can't find the container with id 
a8fbcbdd968c8491d202d49dc97cfc519d01fc3828ec94820ca7a9ba1e66206d Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.465560 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerStarted","Data":"31adaf0ab822f60975f98e5a058ec08977f5dc642ca8797cbba38ec0960c8741"} Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.467794 4786 generic.go:334] "Generic (PLEG): container finished" podID="7f276c87-a633-4e23-b3c8-61353255c1e0" containerID="6bbf0945c8e6d155591de1ebe333f97f53e822af88059fedd5b9deb943e5ef70" exitCode=0 Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.467846 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ff67-account-create-update-nrhwr" event={"ID":"7f276c87-a633-4e23-b3c8-61353255c1e0","Type":"ContainerDied","Data":"6bbf0945c8e6d155591de1ebe333f97f53e822af88059fedd5b9deb943e5ef70"} Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.467880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ff67-account-create-update-nrhwr" event={"ID":"7f276c87-a633-4e23-b3c8-61353255c1e0","Type":"ContainerStarted","Data":"a8fbcbdd968c8491d202d49dc97cfc519d01fc3828ec94820ca7a9ba1e66206d"} Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.472211 4786 generic.go:334] "Generic (PLEG): container finished" podID="50f1da9f-ee38-45cb-bc5d-21db3eda07f4" containerID="b7d3219d584ba0552452654e9739dfc29ee21cfa32dd7c0128c4e70dae6f9f63" exitCode=0 Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.472283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-8k747" event={"ID":"50f1da9f-ee38-45cb-bc5d-21db3eda07f4","Type":"ContainerDied","Data":"b7d3219d584ba0552452654e9739dfc29ee21cfa32dd7c0128c4e70dae6f9f63"} Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.484210 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-rhmrw" podStartSLOduration=2.972135434 podStartE2EDuration="6.484193311s" podCreationTimestamp="2026-03-13 16:53:16 +0000 UTC" firstStartedPulling="2026-03-13 16:53:18.394258446 +0000 UTC m=+6628.557470257" lastFinishedPulling="2026-03-13 16:53:21.906316323 +0000 UTC m=+6632.069528134" observedRunningTime="2026-03-13 16:53:22.479155315 +0000 UTC m=+6632.642367126" watchObservedRunningTime="2026-03-13 16:53:22.484193311 +0000 UTC m=+6632.647405122" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.578135 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn8kp"] Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.578365 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gn8kp" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="registry-server" containerID="cri-o://cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d" gracePeriod=2 Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.787761 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.788330 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" containerName="openstackclient" containerID="cri-o://2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b" gracePeriod=2 Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.795629 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.825112 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 16:53:22 crc kubenswrapper[4786]: E0313 16:53:22.825507 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" containerName="openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.825524 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" containerName="openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.825716 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" containerName="openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.826384 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.837482 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.851066 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" podUID="46843f0a-6e35-49f9-b304-31a452d756ed" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.918491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46843f0a-6e35-49f9-b304-31a452d756ed-openstack-config-secret\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.918550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46843f0a-6e35-49f9-b304-31a452d756ed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.918795 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dlrq\" (UniqueName: \"kubernetes.io/projected/46843f0a-6e35-49f9-b304-31a452d756ed-kube-api-access-4dlrq\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:22 crc kubenswrapper[4786]: I0313 16:53:22.918903 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46843f0a-6e35-49f9-b304-31a452d756ed-openstack-config\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.020456 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dlrq\" (UniqueName: \"kubernetes.io/projected/46843f0a-6e35-49f9-b304-31a452d756ed-kube-api-access-4dlrq\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.020606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46843f0a-6e35-49f9-b304-31a452d756ed-openstack-config\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.020631 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46843f0a-6e35-49f9-b304-31a452d756ed-openstack-config-secret\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.020647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46843f0a-6e35-49f9-b304-31a452d756ed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.021915 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/46843f0a-6e35-49f9-b304-31a452d756ed-openstack-config\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.033011 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/46843f0a-6e35-49f9-b304-31a452d756ed-openstack-config-secret\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.044364 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dlrq\" (UniqueName: \"kubernetes.io/projected/46843f0a-6e35-49f9-b304-31a452d756ed-kube-api-access-4dlrq\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.046538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46843f0a-6e35-49f9-b304-31a452d756ed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"46843f0a-6e35-49f9-b304-31a452d756ed\") " pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.147724 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.239720 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.325284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r24x\" (UniqueName: \"kubernetes.io/projected/c438e563-3338-4dd3-bfda-0ea9ec97214b-kube-api-access-7r24x\") pod \"c438e563-3338-4dd3-bfda-0ea9ec97214b\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.325327 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-catalog-content\") pod \"c438e563-3338-4dd3-bfda-0ea9ec97214b\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.325425 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-utilities\") pod \"c438e563-3338-4dd3-bfda-0ea9ec97214b\" (UID: \"c438e563-3338-4dd3-bfda-0ea9ec97214b\") " Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.326248 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-utilities" (OuterVolumeSpecName: "utilities") pod "c438e563-3338-4dd3-bfda-0ea9ec97214b" (UID: "c438e563-3338-4dd3-bfda-0ea9ec97214b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.329791 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c438e563-3338-4dd3-bfda-0ea9ec97214b-kube-api-access-7r24x" (OuterVolumeSpecName: "kube-api-access-7r24x") pod "c438e563-3338-4dd3-bfda-0ea9ec97214b" (UID: "c438e563-3338-4dd3-bfda-0ea9ec97214b"). InnerVolumeSpecName "kube-api-access-7r24x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.404501 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c438e563-3338-4dd3-bfda-0ea9ec97214b" (UID: "c438e563-3338-4dd3-bfda-0ea9ec97214b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.428187 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.428225 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r24x\" (UniqueName: \"kubernetes.io/projected/c438e563-3338-4dd3-bfda-0ea9ec97214b-kube-api-access-7r24x\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.428237 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438e563-3338-4dd3-bfda-0ea9ec97214b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.482150 4786 generic.go:334] "Generic (PLEG): container finished" podID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerID="cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d" exitCode=0 Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.482211 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn8kp" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.482255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerDied","Data":"cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d"} Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.482307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8kp" event={"ID":"c438e563-3338-4dd3-bfda-0ea9ec97214b","Type":"ContainerDied","Data":"aa2f9318bbabdeab243223b7e0bab737da7b09e19b02d494cfd3e2f6a7a1905a"} Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.482327 4786 scope.go:117] "RemoveContainer" containerID="cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.521395 4786 scope.go:117] "RemoveContainer" containerID="bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.529853 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn8kp"] Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.548919 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gn8kp"] Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.557017 4786 scope.go:117] "RemoveContainer" containerID="82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.616323 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.616606 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" 
containerName="prometheus" containerID="cri-o://64cc75100b6a49620a14a655bda22a7648214b394bb56bf487dbcffc42789783" gracePeriod=600 Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.616729 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="thanos-sidecar" containerID="cri-o://89a3101c406a5ea15d8f7f976f2b6e7bc1cabe84a63c4500ffc1f953c9965062" gracePeriod=600 Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.616784 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="config-reloader" containerID="cri-o://24c124b8e43caaad61676e66327baa78f3796309d19e0c81362863d689946b8d" gracePeriod=600 Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.625302 4786 scope.go:117] "RemoveContainer" containerID="cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d" Mar 13 16:53:23 crc kubenswrapper[4786]: E0313 16:53:23.627821 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d\": container with ID starting with cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d not found: ID does not exist" containerID="cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.627890 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d"} err="failed to get container status \"cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d\": rpc error: code = NotFound desc = could not find container \"cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d\": container with ID starting with 
cefee1213f79f6b7b81c312e29304727f578879e0aa6b7014f6e65d8f1443b4d not found: ID does not exist" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.627920 4786 scope.go:117] "RemoveContainer" containerID="bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678" Mar 13 16:53:23 crc kubenswrapper[4786]: E0313 16:53:23.628293 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678\": container with ID starting with bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678 not found: ID does not exist" containerID="bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.628331 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678"} err="failed to get container status \"bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678\": rpc error: code = NotFound desc = could not find container \"bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678\": container with ID starting with bf4ebf458109221e25e44456628a100d158127b1105212145db48e3a0048d678 not found: ID does not exist" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.628361 4786 scope.go:117] "RemoveContainer" containerID="82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98" Mar 13 16:53:23 crc kubenswrapper[4786]: E0313 16:53:23.630807 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98\": container with ID starting with 82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98 not found: ID does not exist" containerID="82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98" Mar 13 16:53:23 crc 
kubenswrapper[4786]: I0313 16:53:23.630849 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98"} err="failed to get container status \"82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98\": rpc error: code = NotFound desc = could not find container \"82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98\": container with ID starting with 82c934b26e5c357a3a3b88a05f6a464ff87852caa5c6e461431d05420c08de98 not found: ID does not exist" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.667221 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 16:53:23 crc kubenswrapper[4786]: W0313 16:53:23.677663 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46843f0a_6e35_49f9_b304_31a452d756ed.slice/crio-ac2a3910a878d054b917b872229bd820245cf299e4834b9ba93ead97da8d938e WatchSource:0}: Error finding container ac2a3910a878d054b917b872229bd820245cf299e4834b9ba93ead97da8d938e: Status 404 returned error can't find the container with id ac2a3910a878d054b917b872229bd820245cf299e4834b9ba93ead97da8d938e Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.989641 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-8k747" Mar 13 16:53:23 crc kubenswrapper[4786]: I0313 16:53:23.998225 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.054987 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhj5d\" (UniqueName: \"kubernetes.io/projected/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-kube-api-access-rhj5d\") pod \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.055140 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-operator-scripts\") pod \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\" (UID: \"50f1da9f-ee38-45cb-bc5d-21db3eda07f4\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.056602 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50f1da9f-ee38-45cb-bc5d-21db3eda07f4" (UID: "50f1da9f-ee38-45cb-bc5d-21db3eda07f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.081101 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-kube-api-access-rhj5d" (OuterVolumeSpecName: "kube-api-access-rhj5d") pod "50f1da9f-ee38-45cb-bc5d-21db3eda07f4" (UID: "50f1da9f-ee38-45cb-bc5d-21db3eda07f4"). InnerVolumeSpecName "kube-api-access-rhj5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.157364 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f276c87-a633-4e23-b3c8-61353255c1e0-operator-scripts\") pod \"7f276c87-a633-4e23-b3c8-61353255c1e0\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.157561 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq6x2\" (UniqueName: \"kubernetes.io/projected/7f276c87-a633-4e23-b3c8-61353255c1e0-kube-api-access-fq6x2\") pod \"7f276c87-a633-4e23-b3c8-61353255c1e0\" (UID: \"7f276c87-a633-4e23-b3c8-61353255c1e0\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.158021 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhj5d\" (UniqueName: \"kubernetes.io/projected/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-kube-api-access-rhj5d\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.158041 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50f1da9f-ee38-45cb-bc5d-21db3eda07f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.158849 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f276c87-a633-4e23-b3c8-61353255c1e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f276c87-a633-4e23-b3c8-61353255c1e0" (UID: "7f276c87-a633-4e23-b3c8-61353255c1e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.163951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f276c87-a633-4e23-b3c8-61353255c1e0-kube-api-access-fq6x2" (OuterVolumeSpecName: "kube-api-access-fq6x2") pod "7f276c87-a633-4e23-b3c8-61353255c1e0" (UID: "7f276c87-a633-4e23-b3c8-61353255c1e0"). InnerVolumeSpecName "kube-api-access-fq6x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.259719 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq6x2\" (UniqueName: \"kubernetes.io/projected/7f276c87-a633-4e23-b3c8-61353255c1e0-kube-api-access-fq6x2\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.259754 4786 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f276c87-a633-4e23-b3c8-61353255c1e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.498390 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-8k747" event={"ID":"50f1da9f-ee38-45cb-bc5d-21db3eda07f4","Type":"ContainerDied","Data":"7add3cf284cfe38048715daccecbd45e79a729b1252e286526c15e96e3d37abb"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.498594 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7add3cf284cfe38048715daccecbd45e79a729b1252e286526c15e96e3d37abb" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.498403 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-8k747" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.505452 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"46843f0a-6e35-49f9-b304-31a452d756ed","Type":"ContainerStarted","Data":"96803b2248ca987d90d8daaba345f8ac636f7a8bf8a3b79f49dbb86e3b7c2800"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.505511 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"46843f0a-6e35-49f9-b304-31a452d756ed","Type":"ContainerStarted","Data":"ac2a3910a878d054b917b872229bd820245cf299e4834b9ba93ead97da8d938e"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.511159 4786 generic.go:334] "Generic (PLEG): container finished" podID="dd3541d1-dd5d-4779-ad92-25491f443513" containerID="89a3101c406a5ea15d8f7f976f2b6e7bc1cabe84a63c4500ffc1f953c9965062" exitCode=0 Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.511191 4786 generic.go:334] "Generic (PLEG): container finished" podID="dd3541d1-dd5d-4779-ad92-25491f443513" containerID="24c124b8e43caaad61676e66327baa78f3796309d19e0c81362863d689946b8d" exitCode=0 Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.511201 4786 generic.go:334] "Generic (PLEG): container finished" podID="dd3541d1-dd5d-4779-ad92-25491f443513" containerID="64cc75100b6a49620a14a655bda22a7648214b394bb56bf487dbcffc42789783" exitCode=0 Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.511256 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerDied","Data":"89a3101c406a5ea15d8f7f976f2b6e7bc1cabe84a63c4500ffc1f953c9965062"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.511283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerDied","Data":"24c124b8e43caaad61676e66327baa78f3796309d19e0c81362863d689946b8d"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.511293 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerDied","Data":"64cc75100b6a49620a14a655bda22a7648214b394bb56bf487dbcffc42789783"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.530769 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-ff67-account-create-update-nrhwr" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.549823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ff67-account-create-update-nrhwr" event={"ID":"7f276c87-a633-4e23-b3c8-61353255c1e0","Type":"ContainerDied","Data":"a8fbcbdd968c8491d202d49dc97cfc519d01fc3828ec94820ca7a9ba1e66206d"} Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.549910 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8fbcbdd968c8491d202d49dc97cfc519d01fc3828ec94820ca7a9ba1e66206d" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.602491 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.60246834 podStartE2EDuration="2.60246834s" podCreationTimestamp="2026-03-13 16:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:53:24.553068669 +0000 UTC m=+6634.716280480" watchObservedRunningTime="2026-03-13 16:53:24.60246834 +0000 UTC m=+6634.765680151" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.614069 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" 
path="/var/lib/kubelet/pods/c438e563-3338-4dd3-bfda-0ea9ec97214b/volumes" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.723621 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.889632 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rncqr\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-kube-api-access-rncqr\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890047 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890092 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-thanos-prometheus-http-client-file\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890148 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-tls-assets\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890201 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-config\") pod 
\"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890256 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-2\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd3541d1-dd5d-4779-ad92-25491f443513-config-out\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890328 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-1\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890349 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-0\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.890427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-web-config\") pod \"dd3541d1-dd5d-4779-ad92-25491f443513\" (UID: \"dd3541d1-dd5d-4779-ad92-25491f443513\") " Mar 13 16:53:24 crc 
kubenswrapper[4786]: I0313 16:53:24.894729 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.896269 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3541d1-dd5d-4779-ad92-25491f443513-config-out" (OuterVolumeSpecName: "config-out") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.896677 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.897065 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.898334 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.901024 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-config" (OuterVolumeSpecName: "config") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.901060 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-kube-api-access-rncqr" (OuterVolumeSpecName: "kube-api-access-rncqr") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "kube-api-access-rncqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.904796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.932626 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-web-config" (OuterVolumeSpecName: "web-config") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992231 4786 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992432 4786 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992518 4786 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992575 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rncqr\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-kube-api-access-rncqr\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992627 4786 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 
16:53:24.992684 4786 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dd3541d1-dd5d-4779-ad92-25491f443513-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992736 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd3541d1-dd5d-4779-ad92-25491f443513-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992797 4786 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dd3541d1-dd5d-4779-ad92-25491f443513-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:24 crc kubenswrapper[4786]: I0313 16:53:24.992849 4786 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dd3541d1-dd5d-4779-ad92-25491f443513-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.202247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "dd3541d1-dd5d-4779-ad92-25491f443513" (UID: "dd3541d1-dd5d-4779-ad92-25491f443513"). InnerVolumeSpecName "pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.302362 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") on node \"crc\" " Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.358376 4786 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.358737 4786 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4") on node "crc" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.363827 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.404434 4786 reconciler_common.go:293] "Volume detached for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.505442 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfwt2\" (UniqueName: \"kubernetes.io/projected/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-kube-api-access-dfwt2\") pod \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.505514 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-combined-ca-bundle\") pod 
\"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.505658 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config-secret\") pod \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.505686 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config\") pod \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\" (UID: \"4968f9bc-8e95-4fb8-bcbc-acc27742a76a\") " Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.511088 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-kube-api-access-dfwt2" (OuterVolumeSpecName: "kube-api-access-dfwt2") pod "4968f9bc-8e95-4fb8-bcbc-acc27742a76a" (UID: "4968f9bc-8e95-4fb8-bcbc-acc27742a76a"). InnerVolumeSpecName "kube-api-access-dfwt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.532546 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4968f9bc-8e95-4fb8-bcbc-acc27742a76a" (UID: "4968f9bc-8e95-4fb8-bcbc-acc27742a76a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.540566 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4968f9bc-8e95-4fb8-bcbc-acc27742a76a" (UID: "4968f9bc-8e95-4fb8-bcbc-acc27742a76a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.543006 4786 generic.go:334] "Generic (PLEG): container finished" podID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" containerID="2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b" exitCode=137 Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.543071 4786 scope.go:117] "RemoveContainer" containerID="2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.543173 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.548974 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.551186 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dd3541d1-dd5d-4779-ad92-25491f443513","Type":"ContainerDied","Data":"399a9d567dd2f4e55befb7133ca1409faaf8fea5474e1315268d03714ec92f56"} Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.556448 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4968f9bc-8e95-4fb8-bcbc-acc27742a76a" (UID: "4968f9bc-8e95-4fb8-bcbc-acc27742a76a"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.610158 4786 scope.go:117] "RemoveContainer" containerID="2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.610753 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b\": container with ID starting with 2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b not found: ID does not exist" containerID="2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.610805 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b"} err="failed to get container status \"2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b\": rpc error: code = NotFound desc = could not find container \"2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b\": container with ID starting with 2cee8f5a3d5dfeef742ab1a728d6033aa36bc08d3e2a2d4574280117aa17a63b not found: ID does not exist" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.610832 4786 scope.go:117] "RemoveContainer" containerID="89a3101c406a5ea15d8f7f976f2b6e7bc1cabe84a63c4500ffc1f953c9965062" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.612199 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.612305 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.612379 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfwt2\" (UniqueName: \"kubernetes.io/projected/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-kube-api-access-dfwt2\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.612449 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4968f9bc-8e95-4fb8-bcbc-acc27742a76a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.620743 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.637606 4786 scope.go:117] "RemoveContainer" containerID="24c124b8e43caaad61676e66327baa78f3796309d19e0c81362863d689946b8d" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.639066 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656132 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656585 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="extract-content" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656601 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="extract-content" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656613 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="registry-server" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 
16:53:25.656619 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="registry-server" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656643 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="prometheus" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656649 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="prometheus" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656663 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="thanos-sidecar" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656670 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="thanos-sidecar" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656679 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="extract-utilities" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656685 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="extract-utilities" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656701 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f1da9f-ee38-45cb-bc5d-21db3eda07f4" containerName="mariadb-database-create" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656707 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f1da9f-ee38-45cb-bc5d-21db3eda07f4" containerName="mariadb-database-create" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656722 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="config-reloader" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 
16:53:25.656727 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="config-reloader" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656739 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="init-config-reloader" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656744 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="init-config-reloader" Mar 13 16:53:25 crc kubenswrapper[4786]: E0313 16:53:25.656762 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f276c87-a633-4e23-b3c8-61353255c1e0" containerName="mariadb-account-create-update" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656768 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f276c87-a633-4e23-b3c8-61353255c1e0" containerName="mariadb-account-create-update" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656961 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f1da9f-ee38-45cb-bc5d-21db3eda07f4" containerName="mariadb-database-create" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656979 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c438e563-3338-4dd3-bfda-0ea9ec97214b" containerName="registry-server" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656989 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="config-reloader" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.656998 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="thanos-sidecar" Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.657008 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" containerName="prometheus" Mar 
13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.657019 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f276c87-a633-4e23-b3c8-61353255c1e0" containerName="mariadb-account-create-update"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.658837 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.663228 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.663235 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.663373 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.663249 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9dzwq"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.663689 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.668173 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.668781 4786 scope.go:117] "RemoveContainer" containerID="64cc75100b6a49620a14a655bda22a7648214b394bb56bf487dbcffc42789783"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.669004 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.669115 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.670557 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.671020 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.706278 4786 scope.go:117] "RemoveContainer" containerID="ae3b2a9aa4aaf219f07db829d43b1c5f6bf08268a7a8fba507524d067da0a8d8"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714311 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-config\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714357 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714385 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714456 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42d91a2-2438-453c-a5d0-6aa31991a770-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714479 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714499 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714525 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714557 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmxn\" (UniqueName: \"kubernetes.io/projected/b42d91a2-2438-453c-a5d0-6aa31991a770-kube-api-access-nlmxn\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714607 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714630 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42d91a2-2438-453c-a5d0-6aa31991a770-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.714714 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816482 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42d91a2-2438-453c-a5d0-6aa31991a770-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816535 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816564 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816633 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmxn\" (UniqueName: \"kubernetes.io/projected/b42d91a2-2438-453c-a5d0-6aa31991a770-kube-api-access-nlmxn\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816659 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816685 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816713 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42d91a2-2438-453c-a5d0-6aa31991a770-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816834 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-config\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816890 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.816915 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.818019 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.819659 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.819794 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/22de98f187aafd7c57286e5c682d3283fcc085a3cc830afd132a53eec618e4ac/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.820701 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.820545 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b42d91a2-2438-453c-a5d0-6aa31991a770-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.821729 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.821767 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.822205 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42d91a2-2438-453c-a5d0-6aa31991a770-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.824446 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.827991 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.831257 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-config\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.832216 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42d91a2-2438-453c-a5d0-6aa31991a770-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.835371 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b42d91a2-2438-453c-a5d0-6aa31991a770-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.842117 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmxn\" (UniqueName: \"kubernetes.io/projected/b42d91a2-2438-453c-a5d0-6aa31991a770-kube-api-access-nlmxn\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.865548 4786 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" podUID="46843f0a-6e35-49f9-b304-31a452d756ed"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.896370 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed0c859-dcb2-4d13-97c8-e7d7c58771a4\") pod \"prometheus-metric-storage-0\" (UID: \"b42d91a2-2438-453c-a5d0-6aa31991a770\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:25 crc kubenswrapper[4786]: I0313 16:53:25.993816 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.213357 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-l7l9f"]
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.216483 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.218661 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.218781 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.221837 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9gzvm"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.221999 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.242607 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-l7l9f"]
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.325676 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-combined-ca-bundle\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.325724 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-config-data\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.325745 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-scripts\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.326016 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7hn\" (UniqueName: \"kubernetes.io/projected/92f0ab05-073c-4f6d-a863-d36e4fc32f25-kube-api-access-5m7hn\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.428013 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7hn\" (UniqueName: \"kubernetes.io/projected/92f0ab05-073c-4f6d-a863-d36e4fc32f25-kube-api-access-5m7hn\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.428191 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-combined-ca-bundle\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.428219 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-config-data\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.428906 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-scripts\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.435290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-scripts\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.435475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-combined-ca-bundle\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.445828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-config-data\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.450290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7hn\" (UniqueName: \"kubernetes.io/projected/92f0ab05-073c-4f6d-a863-d36e4fc32f25-kube-api-access-5m7hn\") pod \"aodh-db-sync-l7l9f\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") " pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.541977 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.561855 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4968f9bc-8e95-4fb8-bcbc-acc27742a76a" path="/var/lib/kubelet/pods/4968f9bc-8e95-4fb8-bcbc-acc27742a76a/volumes"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.563185 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3541d1-dd5d-4779-ad92-25491f443513" path="/var/lib/kubelet/pods/dd3541d1-dd5d-4779-ad92-25491f443513/volumes"
Mar 13 16:53:26 crc kubenswrapper[4786]: I0313 16:53:26.597136 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 13 16:53:27 crc kubenswrapper[4786]: I0313 16:53:27.019500 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-l7l9f"]
Mar 13 16:53:27 crc kubenswrapper[4786]: I0313 16:53:27.294019 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rhmrw"
Mar 13 16:53:27 crc kubenswrapper[4786]: I0313 16:53:27.294058 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rhmrw"
Mar 13 16:53:27 crc kubenswrapper[4786]: I0313 16:53:27.572166 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b42d91a2-2438-453c-a5d0-6aa31991a770","Type":"ContainerStarted","Data":"ebceadeb6e78605c7f260964b0fc0d69a73b45ea67af93ad6a5f9cfc513c7253"}
Mar 13 16:53:27 crc kubenswrapper[4786]: I0313 16:53:27.574289 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-l7l9f" event={"ID":"92f0ab05-073c-4f6d-a863-d36e4fc32f25","Type":"ContainerStarted","Data":"1c79074ac968f265ee09c53ec6a9f8e59034a7689e4a051c6d9f444e80481eca"}
Mar 13 16:53:28 crc kubenswrapper[4786]: I0313 16:53:28.351097 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rhmrw" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="registry-server" probeResult="failure" output=<
Mar 13 16:53:28 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s
Mar 13 16:53:28 crc kubenswrapper[4786]: >
Mar 13 16:53:30 crc kubenswrapper[4786]: I0313 16:53:30.616631 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b42d91a2-2438-453c-a5d0-6aa31991a770","Type":"ContainerStarted","Data":"a816e109858888c74c445075cbf0ebf62b0d07bd30dcbb53dce9ebb50e8286b2"}
Mar 13 16:53:32 crc kubenswrapper[4786]: I0313 16:53:32.636697 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-l7l9f" event={"ID":"92f0ab05-073c-4f6d-a863-d36e4fc32f25","Type":"ContainerStarted","Data":"599af0a9e83a294ea9969f43c35e96d2c84c96e308a51de72d0249014762e4fb"}
Mar 13 16:53:32 crc kubenswrapper[4786]: I0313 16:53:32.662956 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-l7l9f" podStartSLOduration=1.759496575 podStartE2EDuration="6.662938019s" podCreationTimestamp="2026-03-13 16:53:26 +0000 UTC" firstStartedPulling="2026-03-13 16:53:27.018348975 +0000 UTC m=+6637.181560786" lastFinishedPulling="2026-03-13 16:53:31.921790409 +0000 UTC m=+6642.085002230" observedRunningTime="2026-03-13 16:53:32.656494677 +0000 UTC m=+6642.819706488" watchObservedRunningTime="2026-03-13 16:53:32.662938019 +0000 UTC m=+6642.826149830"
Mar 13 16:53:34 crc kubenswrapper[4786]: I0313 16:53:34.662708 4786 generic.go:334] "Generic (PLEG): container finished" podID="92f0ab05-073c-4f6d-a863-d36e4fc32f25" containerID="599af0a9e83a294ea9969f43c35e96d2c84c96e308a51de72d0249014762e4fb" exitCode=0
Mar 13 16:53:34 crc kubenswrapper[4786]: I0313 16:53:34.662766 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-l7l9f" event={"ID":"92f0ab05-073c-4f6d-a863-d36e4fc32f25","Type":"ContainerDied","Data":"599af0a9e83a294ea9969f43c35e96d2c84c96e308a51de72d0249014762e4fb"}
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.099934 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.192461 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-combined-ca-bundle\") pod \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") "
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.192699 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m7hn\" (UniqueName: \"kubernetes.io/projected/92f0ab05-073c-4f6d-a863-d36e4fc32f25-kube-api-access-5m7hn\") pod \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") "
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.192779 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-scripts\") pod \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") "
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.192835 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-config-data\") pod \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\" (UID: \"92f0ab05-073c-4f6d-a863-d36e4fc32f25\") "
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.200692 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-scripts" (OuterVolumeSpecName: "scripts") pod "92f0ab05-073c-4f6d-a863-d36e4fc32f25" (UID: "92f0ab05-073c-4f6d-a863-d36e4fc32f25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.200943 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f0ab05-073c-4f6d-a863-d36e4fc32f25-kube-api-access-5m7hn" (OuterVolumeSpecName: "kube-api-access-5m7hn") pod "92f0ab05-073c-4f6d-a863-d36e4fc32f25" (UID: "92f0ab05-073c-4f6d-a863-d36e4fc32f25"). InnerVolumeSpecName "kube-api-access-5m7hn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.227349 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-config-data" (OuterVolumeSpecName: "config-data") pod "92f0ab05-073c-4f6d-a863-d36e4fc32f25" (UID: "92f0ab05-073c-4f6d-a863-d36e4fc32f25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.250765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92f0ab05-073c-4f6d-a863-d36e4fc32f25" (UID: "92f0ab05-073c-4f6d-a863-d36e4fc32f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.296489 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.296835 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.297020 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f0ab05-073c-4f6d-a863-d36e4fc32f25-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.297050 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m7hn\" (UniqueName: \"kubernetes.io/projected/92f0ab05-073c-4f6d-a863-d36e4fc32f25-kube-api-access-5m7hn\") on node \"crc\" DevicePath \"\""
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.690641 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-l7l9f" event={"ID":"92f0ab05-073c-4f6d-a863-d36e4fc32f25","Type":"ContainerDied","Data":"1c79074ac968f265ee09c53ec6a9f8e59034a7689e4a051c6d9f444e80481eca"}
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.690701 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c79074ac968f265ee09c53ec6a9f8e59034a7689e4a051c6d9f444e80481eca"
Mar 13 16:53:36 crc kubenswrapper[4786]: I0313 16:53:36.690706 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-l7l9f"
Mar 13 16:53:37 crc kubenswrapper[4786]: I0313 16:53:37.378638 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rhmrw"
Mar 13 16:53:37 crc kubenswrapper[4786]: I0313 16:53:37.452170 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rhmrw"
Mar 13 16:53:37 crc kubenswrapper[4786]: I0313 16:53:37.646334 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhmrw"]
Mar 13 16:53:37 crc kubenswrapper[4786]: I0313 16:53:37.709990 4786 generic.go:334] "Generic (PLEG): container finished" podID="b42d91a2-2438-453c-a5d0-6aa31991a770" containerID="a816e109858888c74c445075cbf0ebf62b0d07bd30dcbb53dce9ebb50e8286b2" exitCode=0
Mar 13 16:53:37 crc kubenswrapper[4786]: I0313 16:53:37.711554 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b42d91a2-2438-453c-a5d0-6aa31991a770","Type":"ContainerDied","Data":"a816e109858888c74c445075cbf0ebf62b0d07bd30dcbb53dce9ebb50e8286b2"}
Mar 13 16:53:38 crc kubenswrapper[4786]: I0313 16:53:38.719658 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rhmrw" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="registry-server" containerID="cri-o://31adaf0ab822f60975f98e5a058ec08977f5dc642ca8797cbba38ec0960c8741" gracePeriod=2
Mar 13 16:53:38 crc kubenswrapper[4786]: I0313 16:53:38.719975 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b42d91a2-2438-453c-a5d0-6aa31991a770","Type":"ContainerStarted","Data":"b7180165692fd60323455bacaa202c6f304acaa02372468b9d597df701ee5610"}
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.742245 4786 generic.go:334] "Generic (PLEG): container finished" podID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerID="31adaf0ab822f60975f98e5a058ec08977f5dc642ca8797cbba38ec0960c8741" exitCode=0
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.742331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerDied","Data":"31adaf0ab822f60975f98e5a058ec08977f5dc642ca8797cbba38ec0960c8741"}
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.742608 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhmrw" event={"ID":"8be8e7b4-f7d5-42e8-b95c-520088563e29","Type":"ContainerDied","Data":"1a741188e06f504cc009476860e3051ab0f0948fe402608afb6fcc2671f477db"}
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.742642 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a741188e06f504cc009476860e3051ab0f0948fe402608afb6fcc2671f477db"
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.785828 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhmrw"
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.879774 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-catalog-content\") pod \"8be8e7b4-f7d5-42e8-b95c-520088563e29\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") "
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.880034 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-utilities\") pod \"8be8e7b4-f7d5-42e8-b95c-520088563e29\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") "
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.880290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwg6w\" (UniqueName: \"kubernetes.io/projected/8be8e7b4-f7d5-42e8-b95c-520088563e29-kube-api-access-cwg6w\") pod \"8be8e7b4-f7d5-42e8-b95c-520088563e29\" (UID: \"8be8e7b4-f7d5-42e8-b95c-520088563e29\") "
Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.880820 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-utilities" (OuterVolumeSpecName: "utilities") pod "8be8e7b4-f7d5-42e8-b95c-520088563e29" (UID: "8be8e7b4-f7d5-42e8-b95c-520088563e29"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.881364 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.885586 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be8e7b4-f7d5-42e8-b95c-520088563e29-kube-api-access-cwg6w" (OuterVolumeSpecName: "kube-api-access-cwg6w") pod "8be8e7b4-f7d5-42e8-b95c-520088563e29" (UID: "8be8e7b4-f7d5-42e8-b95c-520088563e29"). InnerVolumeSpecName "kube-api-access-cwg6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.915125 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8be8e7b4-f7d5-42e8-b95c-520088563e29" (UID: "8be8e7b4-f7d5-42e8-b95c-520088563e29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.983669 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8e7b4-f7d5-42e8-b95c-520088563e29-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:39 crc kubenswrapper[4786]: I0313 16:53:39.983736 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwg6w\" (UniqueName: \"kubernetes.io/projected/8be8e7b4-f7d5-42e8-b95c-520088563e29-kube-api-access-cwg6w\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.393130 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.597840 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 16:53:40 crc kubenswrapper[4786]: E0313 16:53:40.598382 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="extract-content" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.598406 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="extract-content" Mar 13 16:53:40 crc kubenswrapper[4786]: E0313 16:53:40.598426 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="extract-utilities" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.598448 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="extract-utilities" Mar 13 16:53:40 crc kubenswrapper[4786]: E0313 16:53:40.598487 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f0ab05-073c-4f6d-a863-d36e4fc32f25" containerName="aodh-db-sync" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.598496 4786 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="92f0ab05-073c-4f6d-a863-d36e4fc32f25" containerName="aodh-db-sync" Mar 13 16:53:40 crc kubenswrapper[4786]: E0313 16:53:40.598519 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="registry-server" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.598528 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="registry-server" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.598767 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f0ab05-073c-4f6d-a863-d36e4fc32f25" containerName="aodh-db-sync" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.598798 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" containerName="registry-server" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.605556 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.608423 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9gzvm" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.608766 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.613059 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.624087 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.701811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-scripts\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.702105 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsmmn\" (UniqueName: \"kubernetes.io/projected/d048ce03-aa11-47c6-8ace-17283ef21370-kube-api-access-vsmmn\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.702387 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.702557 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-config-data\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.750442 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhmrw" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.773014 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhmrw"] Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.790041 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhmrw"] Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.804702 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.804804 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-config-data\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.804841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-scripts\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.804940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsmmn\" (UniqueName: \"kubernetes.io/projected/d048ce03-aa11-47c6-8ace-17283ef21370-kube-api-access-vsmmn\") pod 
\"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.818899 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.819997 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-scripts\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.825419 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsmmn\" (UniqueName: \"kubernetes.io/projected/d048ce03-aa11-47c6-8ace-17283ef21370-kube-api-access-vsmmn\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.825592 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-config-data\") pod \"aodh-0\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") " pod="openstack/aodh-0" Mar 13 16:53:40 crc kubenswrapper[4786]: I0313 16:53:40.947753 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 16:53:41 crc kubenswrapper[4786]: W0313 16:53:41.489053 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd048ce03_aa11_47c6_8ace_17283ef21370.slice/crio-847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084 WatchSource:0}: Error finding container 847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084: Status 404 returned error can't find the container with id 847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084 Mar 13 16:53:41 crc kubenswrapper[4786]: I0313 16:53:41.499116 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 16:53:41 crc kubenswrapper[4786]: I0313 16:53:41.761070 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b42d91a2-2438-453c-a5d0-6aa31991a770","Type":"ContainerStarted","Data":"0862864d93ffca23e89bb9da291c6ab308515812061b38560ebc1889ab7a35cd"} Mar 13 16:53:41 crc kubenswrapper[4786]: I0313 16:53:41.765172 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerStarted","Data":"847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084"} Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.562300 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be8e7b4-f7d5-42e8-b95c-520088563e29" path="/var/lib/kubelet/pods/8be8e7b4-f7d5-42e8-b95c-520088563e29/volumes" Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.812452 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b42d91a2-2438-453c-a5d0-6aa31991a770","Type":"ContainerStarted","Data":"ac920418bc040bb0e2e59ee30e27bb6034579189dfb2bbdebb91c3f051bf2c68"} Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.815450 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerStarted","Data":"f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24"} Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.853935 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.853918165 podStartE2EDuration="17.853918165s" podCreationTimestamp="2026-03-13 16:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 16:53:42.851623107 +0000 UTC m=+6653.014834918" watchObservedRunningTime="2026-03-13 16:53:42.853918165 +0000 UTC m=+6653.017129966" Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.927213 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.927505 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-central-agent" containerID="cri-o://bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad" gracePeriod=30 Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.927568 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="proxy-httpd" containerID="cri-o://5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa" gracePeriod=30 Mar 13 16:53:42 crc kubenswrapper[4786]: I0313 16:53:42.927619 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="sg-core" containerID="cri-o://ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a" gracePeriod=30 Mar 13 16:53:42 crc kubenswrapper[4786]: 
I0313 16:53:42.927654 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-notification-agent" containerID="cri-o://14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132" gracePeriod=30 Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.827808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerStarted","Data":"1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f"} Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.830545 4786 generic.go:334] "Generic (PLEG): container finished" podID="11342837-8700-476a-9e91-f22aa073e82e" containerID="5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa" exitCode=0 Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.830619 4786 generic.go:334] "Generic (PLEG): container finished" podID="11342837-8700-476a-9e91-f22aa073e82e" containerID="ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a" exitCode=2 Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.830630 4786 generic.go:334] "Generic (PLEG): container finished" podID="11342837-8700-476a-9e91-f22aa073e82e" containerID="bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad" exitCode=0 Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.830634 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerDied","Data":"5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa"} Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.830691 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerDied","Data":"ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a"} Mar 13 16:53:43 crc 
kubenswrapper[4786]: I0313 16:53:43.830706 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerDied","Data":"bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad"} Mar 13 16:53:43 crc kubenswrapper[4786]: I0313 16:53:43.960581 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.637942 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689119 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-run-httpd\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689266 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-config-data\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689287 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srtbr\" (UniqueName: \"kubernetes.io/projected/11342837-8700-476a-9e91-f22aa073e82e-kube-api-access-srtbr\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689319 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-combined-ca-bundle\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: 
\"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689336 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-sg-core-conf-yaml\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689389 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-log-httpd\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689529 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-scripts\") pod \"11342837-8700-476a-9e91-f22aa073e82e\" (UID: \"11342837-8700-476a-9e91-f22aa073e82e\") " Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.689952 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.698300 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.702577 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11342837-8700-476a-9e91-f22aa073e82e-kube-api-access-srtbr" (OuterVolumeSpecName: "kube-api-access-srtbr") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). InnerVolumeSpecName "kube-api-access-srtbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.707095 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-scripts" (OuterVolumeSpecName: "scripts") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.742762 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.780684 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.792161 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srtbr\" (UniqueName: \"kubernetes.io/projected/11342837-8700-476a-9e91-f22aa073e82e-kube-api-access-srtbr\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.792189 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.792201 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.792210 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11342837-8700-476a-9e91-f22aa073e82e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.792221 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.826941 4786 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-config-data" (OuterVolumeSpecName: "config-data") pod "11342837-8700-476a-9e91-f22aa073e82e" (UID: "11342837-8700-476a-9e91-f22aa073e82e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.845210 4786 generic.go:334] "Generic (PLEG): container finished" podID="11342837-8700-476a-9e91-f22aa073e82e" containerID="14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132" exitCode=0 Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.845252 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerDied","Data":"14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132"} Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.845272 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.845287 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11342837-8700-476a-9e91-f22aa073e82e","Type":"ContainerDied","Data":"46c8b43719bc9f4d0299ca999d2475e70815b325aeee12355dcb14d4b1d0afd2"} Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.845305 4786 scope.go:117] "RemoveContainer" containerID="5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.868016 4786 scope.go:117] "RemoveContainer" containerID="ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.887770 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.893871 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11342837-8700-476a-9e91-f22aa073e82e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.897133 4786 scope.go:117] "RemoveContainer" containerID="14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.899219 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.912110 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:44 crc kubenswrapper[4786]: E0313 16:53:44.912575 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-notification-agent" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.912594 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-notification-agent" Mar 13 16:53:44 crc kubenswrapper[4786]: E0313 16:53:44.912617 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="proxy-httpd" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.912625 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="proxy-httpd" Mar 13 16:53:44 crc kubenswrapper[4786]: E0313 16:53:44.912637 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-central-agent" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.912643 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-central-agent" Mar 13 16:53:44 crc kubenswrapper[4786]: E0313 16:53:44.912653 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11342837-8700-476a-9e91-f22aa073e82e" 
containerName="sg-core" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.912658 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="sg-core" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.914144 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-notification-agent" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.914166 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="proxy-httpd" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.914184 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="ceilometer-central-agent" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.914194 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="11342837-8700-476a-9e91-f22aa073e82e" containerName="sg-core" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.916406 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.919558 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.921377 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.921545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.956032 4786 scope.go:117] "RemoveContainer" containerID="bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-log-httpd\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995674 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995697 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-config-data\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995746 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l5tpl\" (UniqueName: \"kubernetes.io/projected/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-kube-api-access-l5tpl\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995793 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-run-httpd\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995834 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-scripts\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:44 crc kubenswrapper[4786]: I0313 16:53:44.995888 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098267 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-scripts\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098373 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") 
" pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098450 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-log-httpd\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098547 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-config-data\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098592 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5tpl\" (UniqueName: \"kubernetes.io/projected/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-kube-api-access-l5tpl\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.098669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-run-httpd\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.099237 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-log-httpd\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.099828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-run-httpd\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.102205 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-scripts\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.102332 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.104664 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-config-data\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.104784 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.124759 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l5tpl\" (UniqueName: \"kubernetes.io/projected/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-kube-api-access-l5tpl\") pod \"ceilometer-0\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.246545 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.402526 4786 scope.go:117] "RemoveContainer" containerID="5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa" Mar 13 16:53:45 crc kubenswrapper[4786]: E0313 16:53:45.404281 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa\": container with ID starting with 5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa not found: ID does not exist" containerID="5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.404332 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa"} err="failed to get container status \"5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa\": rpc error: code = NotFound desc = could not find container \"5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa\": container with ID starting with 5cfbea4b9c3ea53ad19b9d4c61803c6a3435c49ae5a3f22905bbc9d3301ec8aa not found: ID does not exist" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.404361 4786 scope.go:117] "RemoveContainer" containerID="ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a" Mar 13 16:53:45 crc kubenswrapper[4786]: E0313 16:53:45.404777 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a\": container with ID starting with ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a not found: ID does not exist" containerID="ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.404796 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a"} err="failed to get container status \"ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a\": rpc error: code = NotFound desc = could not find container \"ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a\": container with ID starting with ef1693ae0c8769a7a865a7d3d13f10929351154d841426cb33833f8586464c8a not found: ID does not exist" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.404810 4786 scope.go:117] "RemoveContainer" containerID="14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132" Mar 13 16:53:45 crc kubenswrapper[4786]: E0313 16:53:45.405030 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132\": container with ID starting with 14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132 not found: ID does not exist" containerID="14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.405049 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132"} err="failed to get container status \"14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132\": rpc error: code = NotFound desc = could not find container 
\"14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132\": container with ID starting with 14028253c559340456d62422fd45ffaf1657cc0ec7b4acc0875be460c950d132 not found: ID does not exist" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.405060 4786 scope.go:117] "RemoveContainer" containerID="bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad" Mar 13 16:53:45 crc kubenswrapper[4786]: E0313 16:53:45.406660 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad\": container with ID starting with bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad not found: ID does not exist" containerID="bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.406696 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad"} err="failed to get container status \"bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad\": rpc error: code = NotFound desc = could not find container \"bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad\": container with ID starting with bcf856dc4ba7f8573a2046e0994ea20f1260cf37e842b8c84e9f6005b08ff4ad not found: ID does not exist" Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.801281 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.857520 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerStarted","Data":"d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87"} Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.859188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerStarted","Data":"396d69805a5408de8cfe1a5449bed55bbd8721110e470b9fec825b4d7b5b1977"} Mar 13 16:53:45 crc kubenswrapper[4786]: I0313 16:53:45.994244 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:46 crc kubenswrapper[4786]: I0313 16:53:46.563360 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11342837-8700-476a-9e91-f22aa073e82e" path="/var/lib/kubelet/pods/11342837-8700-476a-9e91-f22aa073e82e/volumes" Mar 13 16:53:46 crc kubenswrapper[4786]: I0313 16:53:46.885225 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerStarted","Data":"03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752"} Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.126773 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.898566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerStarted","Data":"e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe"} Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.898688 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-api" containerID="cri-o://f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24" gracePeriod=30 Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.898688 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-listener" 
containerID="cri-o://e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe" gracePeriod=30 Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.898766 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-evaluator" containerID="cri-o://1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f" gracePeriod=30 Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.898761 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-notifier" containerID="cri-o://d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87" gracePeriod=30 Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.901567 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerStarted","Data":"222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece"} Mar 13 16:53:47 crc kubenswrapper[4786]: I0313 16:53:47.922201 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.596522118 podStartE2EDuration="7.922182869s" podCreationTimestamp="2026-03-13 16:53:40 +0000 UTC" firstStartedPulling="2026-03-13 16:53:41.496580073 +0000 UTC m=+6651.659791884" lastFinishedPulling="2026-03-13 16:53:46.822240824 +0000 UTC m=+6656.985452635" observedRunningTime="2026-03-13 16:53:47.91665795 +0000 UTC m=+6658.079869761" watchObservedRunningTime="2026-03-13 16:53:47.922182869 +0000 UTC m=+6658.085394680" Mar 13 16:53:48 crc kubenswrapper[4786]: I0313 16:53:48.911874 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerStarted","Data":"7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419"} Mar 13 16:53:48 
crc kubenswrapper[4786]: I0313 16:53:48.915200 4786 generic.go:334] "Generic (PLEG): container finished" podID="d048ce03-aa11-47c6-8ace-17283ef21370" containerID="d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87" exitCode=0 Mar 13 16:53:48 crc kubenswrapper[4786]: I0313 16:53:48.915229 4786 generic.go:334] "Generic (PLEG): container finished" podID="d048ce03-aa11-47c6-8ace-17283ef21370" containerID="1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f" exitCode=0 Mar 13 16:53:48 crc kubenswrapper[4786]: I0313 16:53:48.915236 4786 generic.go:334] "Generic (PLEG): container finished" podID="d048ce03-aa11-47c6-8ace-17283ef21370" containerID="f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24" exitCode=0 Mar 13 16:53:48 crc kubenswrapper[4786]: I0313 16:53:48.915264 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerDied","Data":"d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87"} Mar 13 16:53:48 crc kubenswrapper[4786]: I0313 16:53:48.915304 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerDied","Data":"1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f"} Mar 13 16:53:48 crc kubenswrapper[4786]: I0313 16:53:48.915313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerDied","Data":"f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24"} Mar 13 16:53:50 crc kubenswrapper[4786]: I0313 16:53:50.938442 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerStarted","Data":"b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d"} Mar 13 16:53:50 crc kubenswrapper[4786]: I0313 16:53:50.938929 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-central-agent" containerID="cri-o://03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752" gracePeriod=30 Mar 13 16:53:50 crc kubenswrapper[4786]: I0313 16:53:50.939167 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 16:53:50 crc kubenswrapper[4786]: I0313 16:53:50.939304 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="proxy-httpd" containerID="cri-o://b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d" gracePeriod=30 Mar 13 16:53:50 crc kubenswrapper[4786]: I0313 16:53:50.939444 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="sg-core" containerID="cri-o://7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419" gracePeriod=30 Mar 13 16:53:50 crc kubenswrapper[4786]: I0313 16:53:50.939519 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-notification-agent" containerID="cri-o://222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece" gracePeriod=30 Mar 13 16:53:51 crc kubenswrapper[4786]: I0313 16:53:51.952429 4786 generic.go:334] "Generic (PLEG): container finished" podID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerID="b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d" exitCode=0 Mar 13 16:53:51 crc kubenswrapper[4786]: I0313 16:53:51.952685 4786 generic.go:334] "Generic (PLEG): container finished" podID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerID="7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419" 
exitCode=2 Mar 13 16:53:51 crc kubenswrapper[4786]: I0313 16:53:51.952695 4786 generic.go:334] "Generic (PLEG): container finished" podID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerID="222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece" exitCode=0 Mar 13 16:53:51 crc kubenswrapper[4786]: I0313 16:53:51.952518 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerDied","Data":"b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d"} Mar 13 16:53:51 crc kubenswrapper[4786]: I0313 16:53:51.952730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerDied","Data":"7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419"} Mar 13 16:53:51 crc kubenswrapper[4786]: I0313 16:53:51.952743 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerDied","Data":"222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece"} Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.703179 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.894699 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-scripts\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.895363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5tpl\" (UniqueName: \"kubernetes.io/projected/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-kube-api-access-l5tpl\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.895599 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-log-httpd\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.895715 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-run-httpd\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.895881 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-combined-ca-bundle\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.899724 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-sg-core-conf-yaml\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.899828 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-config-data\") pod \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\" (UID: \"91d8c4ed-7e66-42e2-ab07-a12ac82c2458\") " Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.897380 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.897660 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.901121 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.901361 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.901591 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-kube-api-access-l5tpl" (OuterVolumeSpecName: "kube-api-access-l5tpl") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "kube-api-access-l5tpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.919248 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-scripts" (OuterVolumeSpecName: "scripts") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.944431 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.977042 4786 generic.go:334] "Generic (PLEG): container finished" podID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerID="03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752" exitCode=0 Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.977086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerDied","Data":"03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752"} Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.977126 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91d8c4ed-7e66-42e2-ab07-a12ac82c2458","Type":"ContainerDied","Data":"396d69805a5408de8cfe1a5449bed55bbd8721110e470b9fec825b4d7b5b1977"} Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.977144 4786 scope.go:117] "RemoveContainer" containerID="b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.977378 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:53 crc kubenswrapper[4786]: I0313 16:53:53.989988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.006147 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.006172 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.006183 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5tpl\" (UniqueName: \"kubernetes.io/projected/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-kube-api-access-l5tpl\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.006192 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.034068 4786 scope.go:117] "RemoveContainer" containerID="7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.039647 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-config-data" (OuterVolumeSpecName: "config-data") pod "91d8c4ed-7e66-42e2-ab07-a12ac82c2458" (UID: "91d8c4ed-7e66-42e2-ab07-a12ac82c2458"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.063561 4786 scope.go:117] "RemoveContainer" containerID="222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.097502 4786 scope.go:117] "RemoveContainer" containerID="03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.108087 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d8c4ed-7e66-42e2-ab07-a12ac82c2458-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.125503 4786 scope.go:117] "RemoveContainer" containerID="b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.125832 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d\": container with ID starting with b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d not found: ID does not exist" containerID="b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.125967 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d"} err="failed to get container status \"b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d\": rpc error: code = NotFound desc = could not find container \"b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d\": container with ID starting with b546033d7b6cec10694dfd242a4c318ca4aab3254b9d5031ce413e6d870a942d not found: ID does not exist" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.125998 4786 scope.go:117] 
"RemoveContainer" containerID="7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.126334 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419\": container with ID starting with 7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419 not found: ID does not exist" containerID="7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.126360 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419"} err="failed to get container status \"7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419\": rpc error: code = NotFound desc = could not find container \"7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419\": container with ID starting with 7070cc243a1006e93ef5cafb8a441f8b806dd73da59f9733389e9d2fde19d419 not found: ID does not exist" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.126374 4786 scope.go:117] "RemoveContainer" containerID="222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.126654 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece\": container with ID starting with 222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece not found: ID does not exist" containerID="222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.126683 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece"} err="failed to get container status \"222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece\": rpc error: code = NotFound desc = could not find container \"222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece\": container with ID starting with 222b519c4ec9cc9bf70000fc214a41a66c516f94a1086336ebde6f2f9ef4dece not found: ID does not exist" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.126698 4786 scope.go:117] "RemoveContainer" containerID="03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.126986 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752\": container with ID starting with 03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752 not found: ID does not exist" containerID="03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.127006 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752"} err="failed to get container status \"03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752\": rpc error: code = NotFound desc = could not find container \"03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752\": container with ID starting with 03495fb09d5cfa6736057e0a3c84f4ef70f27eb4e6b3b9e6efb4ca1809924752 not found: ID does not exist" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.379518 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.402130 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 
13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.416810 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.417495 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="sg-core" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417519 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="sg-core" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.417557 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="proxy-httpd" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417564 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="proxy-httpd" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.417579 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-central-agent" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417588 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-central-agent" Mar 13 16:53:54 crc kubenswrapper[4786]: E0313 16:53:54.417598 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-notification-agent" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417605 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-notification-agent" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417900 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="proxy-httpd" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 
16:53:54.417939 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-central-agent" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417951 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="ceilometer-notification-agent" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.417976 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" containerName="sg-core" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.420323 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.426754 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.426880 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.426987 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.565785 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d8c4ed-7e66-42e2-ab07-a12ac82c2458" path="/var/lib/kubelet/pods/91d8c4ed-7e66-42e2-ab07-a12ac82c2458/volumes" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620024 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-scripts\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620060 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620092 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-log-httpd\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620130 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620466 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-run-httpd\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47vs\" (UniqueName: \"kubernetes.io/projected/1ac22059-3e77-45a2-a7a9-27b92d221e05-kube-api-access-h47vs\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.620783 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-config-data\") pod 
\"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722692 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-run-httpd\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722764 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h47vs\" (UniqueName: \"kubernetes.io/projected/1ac22059-3e77-45a2-a7a9-27b92d221e05-kube-api-access-h47vs\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722826 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-config-data\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722909 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-scripts\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722930 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722959 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-log-httpd\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.722992 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.723681 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-run-httpd\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.723757 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-log-httpd\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.727524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-config-data\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.737392 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 
16:53:54.741734 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-scripts\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.753888 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.754915 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47vs\" (UniqueName: \"kubernetes.io/projected/1ac22059-3e77-45a2-a7a9-27b92d221e05-kube-api-access-h47vs\") pod \"ceilometer-0\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " pod="openstack/ceilometer-0" Mar 13 16:53:54 crc kubenswrapper[4786]: I0313 16:53:54.758022 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:53:55 crc kubenswrapper[4786]: I0313 16:53:55.238047 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:53:55 crc kubenswrapper[4786]: I0313 16:53:55.995146 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:56 crc kubenswrapper[4786]: I0313 16:53:56.002385 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:56 crc kubenswrapper[4786]: I0313 16:53:56.008574 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerStarted","Data":"72dacf12db0160c865c91620bdac662538376aad0b1a1fc764b9dfdada220182"} Mar 13 16:53:56 crc kubenswrapper[4786]: I0313 16:53:56.008613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerStarted","Data":"7ac824fdb40631b22374161fa7df47dac913a7d5b4b14e7137bb8b50308c4985"} Mar 13 16:53:57 crc kubenswrapper[4786]: I0313 16:53:57.017417 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerStarted","Data":"c6904aa577457f495451957a3160349147c824d2924a3a4fd1e65dfa49e49375"} Mar 13 16:53:57 crc kubenswrapper[4786]: I0313 16:53:57.021377 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 16:53:58 crc kubenswrapper[4786]: I0313 16:53:58.030777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerStarted","Data":"55240f5a4397d0c52cfd005ae4a0f0bb659abef2062318bfb4f83667e3ab68b3"} Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 
16:54:00.058485 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerStarted","Data":"e6f3f1764e42ef13bd1b493edfd59f0b52d74583c16747aaba4bd5581cb5127a"} Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.058964 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.083034 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.125766482 podStartE2EDuration="6.083009883s" podCreationTimestamp="2026-03-13 16:53:54 +0000 UTC" firstStartedPulling="2026-03-13 16:53:55.250092653 +0000 UTC m=+6665.413304494" lastFinishedPulling="2026-03-13 16:53:59.207336084 +0000 UTC m=+6669.370547895" observedRunningTime="2026-03-13 16:54:00.077618888 +0000 UTC m=+6670.240830709" watchObservedRunningTime="2026-03-13 16:54:00.083009883 +0000 UTC m=+6670.246221694" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.173758 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557014-glwtj"] Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.175222 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.177317 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.177681 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.177710 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.186560 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557014-glwtj"] Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.376799 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnngk\" (UniqueName: \"kubernetes.io/projected/087f69b5-64ac-4e7e-a195-12e854b510e3-kube-api-access-lnngk\") pod \"auto-csr-approver-29557014-glwtj\" (UID: \"087f69b5-64ac-4e7e-a195-12e854b510e3\") " pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.479488 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnngk\" (UniqueName: \"kubernetes.io/projected/087f69b5-64ac-4e7e-a195-12e854b510e3-kube-api-access-lnngk\") pod \"auto-csr-approver-29557014-glwtj\" (UID: \"087f69b5-64ac-4e7e-a195-12e854b510e3\") " pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.499659 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnngk\" (UniqueName: \"kubernetes.io/projected/087f69b5-64ac-4e7e-a195-12e854b510e3-kube-api-access-lnngk\") pod \"auto-csr-approver-29557014-glwtj\" (UID: \"087f69b5-64ac-4e7e-a195-12e854b510e3\") " 
pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:00 crc kubenswrapper[4786]: I0313 16:54:00.544621 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:01 crc kubenswrapper[4786]: I0313 16:54:01.051844 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557014-glwtj"] Mar 13 16:54:02 crc kubenswrapper[4786]: I0313 16:54:02.086151 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557014-glwtj" event={"ID":"087f69b5-64ac-4e7e-a195-12e854b510e3","Type":"ContainerStarted","Data":"30a3731c221799ffbe7b2701ef79509c94b5842ea6e01256b6a5ec904eb95db9"} Mar 13 16:54:03 crc kubenswrapper[4786]: I0313 16:54:03.105523 4786 generic.go:334] "Generic (PLEG): container finished" podID="087f69b5-64ac-4e7e-a195-12e854b510e3" containerID="3b0e36ca162ddc5f4bd11c1f6c6848376eb336e87e73c96cbedb2310c97d4fce" exitCode=0 Mar 13 16:54:03 crc kubenswrapper[4786]: I0313 16:54:03.105633 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557014-glwtj" event={"ID":"087f69b5-64ac-4e7e-a195-12e854b510e3","Type":"ContainerDied","Data":"3b0e36ca162ddc5f4bd11c1f6c6848376eb336e87e73c96cbedb2310c97d4fce"} Mar 13 16:54:04 crc kubenswrapper[4786]: I0313 16:54:04.685444 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:04 crc kubenswrapper[4786]: I0313 16:54:04.779921 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnngk\" (UniqueName: \"kubernetes.io/projected/087f69b5-64ac-4e7e-a195-12e854b510e3-kube-api-access-lnngk\") pod \"087f69b5-64ac-4e7e-a195-12e854b510e3\" (UID: \"087f69b5-64ac-4e7e-a195-12e854b510e3\") " Mar 13 16:54:04 crc kubenswrapper[4786]: I0313 16:54:04.786247 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087f69b5-64ac-4e7e-a195-12e854b510e3-kube-api-access-lnngk" (OuterVolumeSpecName: "kube-api-access-lnngk") pod "087f69b5-64ac-4e7e-a195-12e854b510e3" (UID: "087f69b5-64ac-4e7e-a195-12e854b510e3"). InnerVolumeSpecName "kube-api-access-lnngk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:54:04 crc kubenswrapper[4786]: I0313 16:54:04.882142 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnngk\" (UniqueName: \"kubernetes.io/projected/087f69b5-64ac-4e7e-a195-12e854b510e3-kube-api-access-lnngk\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:05 crc kubenswrapper[4786]: I0313 16:54:05.135709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557014-glwtj" event={"ID":"087f69b5-64ac-4e7e-a195-12e854b510e3","Type":"ContainerDied","Data":"30a3731c221799ffbe7b2701ef79509c94b5842ea6e01256b6a5ec904eb95db9"} Mar 13 16:54:05 crc kubenswrapper[4786]: I0313 16:54:05.135778 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a3731c221799ffbe7b2701ef79509c94b5842ea6e01256b6a5ec904eb95db9" Mar 13 16:54:05 crc kubenswrapper[4786]: I0313 16:54:05.135841 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557014-glwtj" Mar 13 16:54:05 crc kubenswrapper[4786]: I0313 16:54:05.762190 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557008-qg64l"] Mar 13 16:54:05 crc kubenswrapper[4786]: I0313 16:54:05.770176 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557008-qg64l"] Mar 13 16:54:06 crc kubenswrapper[4786]: I0313 16:54:06.575489 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c96bc48-a941-4a7a-a113-904255c84492" path="/var/lib/kubelet/pods/6c96bc48-a941-4a7a-a113-904255c84492/volumes" Mar 13 16:54:11 crc kubenswrapper[4786]: I0313 16:54:11.055266 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-m99vx"] Mar 13 16:54:11 crc kubenswrapper[4786]: I0313 16:54:11.069025 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-s6ctj"] Mar 13 16:54:11 crc kubenswrapper[4786]: I0313 16:54:11.077690 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f116-account-create-update-f2fc9"] Mar 13 16:54:11 crc kubenswrapper[4786]: I0313 16:54:11.085742 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-m99vx"] Mar 13 16:54:11 crc kubenswrapper[4786]: I0313 16:54:11.092671 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-s6ctj"] Mar 13 16:54:11 crc kubenswrapper[4786]: I0313 16:54:11.099587 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f116-account-create-update-f2fc9"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.053706 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0f54-account-create-update-l8l9w"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.068360 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-0f54-account-create-update-l8l9w"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.079754 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5kd2h"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.090183 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5kd2h"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.099722 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fda7-account-create-update-fxslq"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.108271 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fda7-account-create-update-fxslq"] Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.568442 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbe78b0-93cc-4afe-839c-c0d3e26873c2" path="/var/lib/kubelet/pods/0fbe78b0-93cc-4afe-839c-c0d3e26873c2/volumes" Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.570450 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec79171-dff4-4672-bb9c-d5f71a44dd70" path="/var/lib/kubelet/pods/1ec79171-dff4-4672-bb9c-d5f71a44dd70/volumes" Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.572465 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="683c9de6-8e5a-4ea1-a17e-af0743fee223" path="/var/lib/kubelet/pods/683c9de6-8e5a-4ea1-a17e-af0743fee223/volumes" Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.573707 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dadc28e-92a8-4760-b6c1-d4bfc87535c2" path="/var/lib/kubelet/pods/6dadc28e-92a8-4760-b6c1-d4bfc87535c2/volumes" Mar 13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.577671 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ee8d0e-612f-4ef9-a135-d1bd89fc4136" path="/var/lib/kubelet/pods/81ee8d0e-612f-4ef9-a135-d1bd89fc4136/volumes" Mar 
13 16:54:12 crc kubenswrapper[4786]: I0313 16:54:12.579200 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4ba9a6-435c-4672-83a6-c46c2ce855e7" path="/var/lib/kubelet/pods/be4ba9a6-435c-4672-83a6-c46c2ce855e7/volumes" Mar 13 16:54:17 crc kubenswrapper[4786]: W0313 16:54:17.961105 4786 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd048ce03_aa11_47c6_8ace_17283ef21370.slice/crio-847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084/cpu.weight": open /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd048ce03_aa11_47c6_8ace_17283ef21370.slice/crio-847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084/cpu.weight: no such device Mar 13 16:54:17 crc kubenswrapper[4786]: W0313 16:54:17.964818 4786 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod087f69b5_64ac_4e7e_a195_12e854b510e3.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod087f69b5_64ac_4e7e_a195_12e854b510e3.slice: no such file or directory Mar 13 16:54:17 crc kubenswrapper[4786]: I0313 16:54:17.986240 4786 scope.go:117] "RemoveContainer" containerID="6002c2e0f1c7471fe1e760bc188a4df7fb96e4fee58e36ca8f36bf268890e4d6" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.131087 4786 scope.go:117] "RemoveContainer" containerID="f0cd0a5970b00748bfcdbcd54f7ae854e808959da022f38a572bffb5c54e4220" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.188442 4786 scope.go:117] "RemoveContainer" containerID="846413b5b3803373eece2f112134a568159934affa9a7d7534f7a82069b7ecf3" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.245353 4786 scope.go:117] "RemoveContainer" containerID="a3eceef4608e58d05af49d15b9b49d79b87fcbd574226b60c89beea3a3e03fd2" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 
16:54:18.273698 4786 scope.go:117] "RemoveContainer" containerID="6f5ef5d122d206dae85ba659cc44b6b768a80c67f38ce420bb81bf715a40f12a" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.309326 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.322434 4786 generic.go:334] "Generic (PLEG): container finished" podID="d048ce03-aa11-47c6-8ace-17283ef21370" containerID="e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe" exitCode=137 Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.322470 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerDied","Data":"e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe"} Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.322523 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.322877 4786 scope.go:117] "RemoveContainer" containerID="e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.322773 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d048ce03-aa11-47c6-8ace-17283ef21370","Type":"ContainerDied","Data":"847f02f54c867a470384880eb2f6817b40e927eb3bb3ee0d8c942d5b211cc084"} Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.357079 4786 scope.go:117] "RemoveContainer" containerID="7f27e8e08779d209490ffcbe0750f091e1dab517216867ffce25b02b396eb718" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.368510 4786 scope.go:117] "RemoveContainer" containerID="d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87" Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.387342 4786 scope.go:117] "RemoveContainer" 
containerID="99240a98222f5ac13b06a15c98ab00d1b2e4f7de0df1174fc7cab2346947576b"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.401829 4786 scope.go:117] "RemoveContainer" containerID="1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.414802 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-scripts\") pod \"d048ce03-aa11-47c6-8ace-17283ef21370\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") "
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.414893 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-combined-ca-bundle\") pod \"d048ce03-aa11-47c6-8ace-17283ef21370\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") "
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.415070 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-config-data\") pod \"d048ce03-aa11-47c6-8ace-17283ef21370\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") "
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.415101 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsmmn\" (UniqueName: \"kubernetes.io/projected/d048ce03-aa11-47c6-8ace-17283ef21370-kube-api-access-vsmmn\") pod \"d048ce03-aa11-47c6-8ace-17283ef21370\" (UID: \"d048ce03-aa11-47c6-8ace-17283ef21370\") "
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.421030 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d048ce03-aa11-47c6-8ace-17283ef21370-kube-api-access-vsmmn" (OuterVolumeSpecName: "kube-api-access-vsmmn") pod "d048ce03-aa11-47c6-8ace-17283ef21370" (UID: "d048ce03-aa11-47c6-8ace-17283ef21370"). InnerVolumeSpecName "kube-api-access-vsmmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.421201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-scripts" (OuterVolumeSpecName: "scripts") pod "d048ce03-aa11-47c6-8ace-17283ef21370" (UID: "d048ce03-aa11-47c6-8ace-17283ef21370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.433114 4786 scope.go:117] "RemoveContainer" containerID="f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.468387 4786 scope.go:117] "RemoveContainer" containerID="e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.469326 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe\": container with ID starting with e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe not found: ID does not exist" containerID="e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.469361 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe"} err="failed to get container status \"e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe\": rpc error: code = NotFound desc = could not find container \"e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe\": container with ID starting with e36f2d55f209fe97dbd46c585af4321d25d6826a9565648a5781e803d0cb55fe not found: ID does not exist"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.469382 4786 scope.go:117] "RemoveContainer" containerID="d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.469802 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87\": container with ID starting with d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87 not found: ID does not exist" containerID="d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.469850 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87"} err="failed to get container status \"d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87\": rpc error: code = NotFound desc = could not find container \"d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87\": container with ID starting with d896cb28b898e911fea8a192a5d12eaf02577678bec4411a685e52e5248b4b87 not found: ID does not exist"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.469904 4786 scope.go:117] "RemoveContainer" containerID="1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.470242 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f\": container with ID starting with 1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f not found: ID does not exist" containerID="1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.470276 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f"} err="failed to get container status \"1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f\": rpc error: code = NotFound desc = could not find container \"1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f\": container with ID starting with 1da0b11595c74ea9f31a5dac8c32b785ff1e0969213936151c7d14d6534c610f not found: ID does not exist"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.470296 4786 scope.go:117] "RemoveContainer" containerID="f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.470588 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24\": container with ID starting with f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24 not found: ID does not exist" containerID="f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.470683 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24"} err="failed to get container status \"f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24\": rpc error: code = NotFound desc = could not find container \"f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24\": container with ID starting with f9604ad3c6bf210d70009bd31a63deba25e7a37f860326dcade7b38ea20e4a24 not found: ID does not exist"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.517416 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsmmn\" (UniqueName: \"kubernetes.io/projected/d048ce03-aa11-47c6-8ace-17283ef21370-kube-api-access-vsmmn\") on node \"crc\" DevicePath \"\""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.517453 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.544495 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-config-data" (OuterVolumeSpecName: "config-data") pod "d048ce03-aa11-47c6-8ace-17283ef21370" (UID: "d048ce03-aa11-47c6-8ace-17283ef21370"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.556611 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d048ce03-aa11-47c6-8ace-17283ef21370" (UID: "d048ce03-aa11-47c6-8ace-17283ef21370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.619603 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.619634 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d048ce03-aa11-47c6-8ace-17283ef21370-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.650592 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.672154 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.681564 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.682087 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-listener"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682116 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-listener"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.682133 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087f69b5-64ac-4e7e-a195-12e854b510e3" containerName="oc"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682144 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="087f69b5-64ac-4e7e-a195-12e854b510e3" containerName="oc"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.682161 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-notifier"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682169 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-notifier"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.682194 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-api"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682201 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-api"
Mar 13 16:54:18 crc kubenswrapper[4786]: E0313 16:54:18.682241 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-evaluator"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682250 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-evaluator"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682486 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-listener"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682511 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-api"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682529 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="087f69b5-64ac-4e7e-a195-12e854b510e3" containerName="oc"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682546 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-evaluator"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.682558 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" containerName="aodh-notifier"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.684808 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.689660 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9gzvm"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.689840 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.689671 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.690169 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.695250 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.704312 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.829468 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-internal-tls-certs\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.829619 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-config-data\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.829687 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.829819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-public-tls-certs\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.830194 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24qk\" (UniqueName: \"kubernetes.io/projected/90376e9e-850b-475b-87f2-76efe597c3ea-kube-api-access-n24qk\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.830324 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-scripts\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.932674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n24qk\" (UniqueName: \"kubernetes.io/projected/90376e9e-850b-475b-87f2-76efe597c3ea-kube-api-access-n24qk\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.932769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-scripts\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.932845 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-internal-tls-certs\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.932897 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-config-data\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.932922 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.932949 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-public-tls-certs\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.938833 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.939418 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-internal-tls-certs\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.939540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-public-tls-certs\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.939743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-config-data\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.939887 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90376e9e-850b-475b-87f2-76efe597c3ea-scripts\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.956112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n24qk\" (UniqueName: \"kubernetes.io/projected/90376e9e-850b-475b-87f2-76efe597c3ea-kube-api-access-n24qk\") pod \"aodh-0\" (UID: \"90376e9e-850b-475b-87f2-76efe597c3ea\") " pod="openstack/aodh-0"
Mar 13 16:54:18 crc kubenswrapper[4786]: I0313 16:54:18.999570 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 16:54:19 crc kubenswrapper[4786]: I0313 16:54:19.537604 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 16:54:20 crc kubenswrapper[4786]: I0313 16:54:20.071819 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6fhlc"]
Mar 13 16:54:20 crc kubenswrapper[4786]: I0313 16:54:20.081522 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6fhlc"]
Mar 13 16:54:20 crc kubenswrapper[4786]: I0313 16:54:20.372172 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90376e9e-850b-475b-87f2-76efe597c3ea","Type":"ContainerStarted","Data":"80ee483606602ae040b8724eb9985b91a2843ec594441a9519590bd10f0c7358"}
Mar 13 16:54:20 crc kubenswrapper[4786]: I0313 16:54:20.372216 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90376e9e-850b-475b-87f2-76efe597c3ea","Type":"ContainerStarted","Data":"2676c2423ebc7201fcff508b8abb072d434bc736e97e6c70617fc4135b90f1ee"}
Mar 13 16:54:20 crc kubenswrapper[4786]: I0313 16:54:20.570128 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc792b6-1efc-495e-a21f-5a1d9c854918" path="/var/lib/kubelet/pods/1fc792b6-1efc-495e-a21f-5a1d9c854918/volumes"
Mar 13 16:54:20 crc kubenswrapper[4786]: I0313 16:54:20.571226 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d048ce03-aa11-47c6-8ace-17283ef21370" path="/var/lib/kubelet/pods/d048ce03-aa11-47c6-8ace-17283ef21370/volumes"
Mar 13 16:54:21 crc kubenswrapper[4786]: I0313 16:54:21.389131 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90376e9e-850b-475b-87f2-76efe597c3ea","Type":"ContainerStarted","Data":"95ece6bad0180e4b19c7e435fae34d88e01a690f5ecfae3737f7705e33cc9305"}
Mar 13 16:54:22 crc kubenswrapper[4786]: I0313 16:54:22.409766 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90376e9e-850b-475b-87f2-76efe597c3ea","Type":"ContainerStarted","Data":"ec526b1ae6352e5b47f5f6ba0ae1fc5617349a13fd625784b1a7d31a0abf1ba0"}
Mar 13 16:54:23 crc kubenswrapper[4786]: I0313 16:54:23.424274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"90376e9e-850b-475b-87f2-76efe597c3ea","Type":"ContainerStarted","Data":"0a03aa90b7162b5e710a4e4e4dfeca514952fff3fa0661ae00eae3c8b8b35303"}
Mar 13 16:54:23 crc kubenswrapper[4786]: I0313 16:54:23.464074 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.7121585550000002 podStartE2EDuration="5.464052203s" podCreationTimestamp="2026-03-13 16:54:18 +0000 UTC" firstStartedPulling="2026-03-13 16:54:19.548765846 +0000 UTC m=+6689.711977687" lastFinishedPulling="2026-03-13 16:54:22.300659484 +0000 UTC m=+6692.463871335" observedRunningTime="2026-03-13 16:54:23.451320693 +0000 UTC m=+6693.614532504" watchObservedRunningTime="2026-03-13 16:54:23.464052203 +0000 UTC m=+6693.627264004"
Mar 13 16:54:24 crc kubenswrapper[4786]: I0313 16:54:24.765003 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.100620 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.101319 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d1dfd277-77e2-41fb-ab90-1548be6194d9" containerName="kube-state-metrics" containerID="cri-o://4174001cd6897bd2f49364208aadfa47aab9074ae6e54e9e8c0954335a03a299" gracePeriod=30
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.489793 4786 generic.go:334] "Generic (PLEG): container finished" podID="d1dfd277-77e2-41fb-ab90-1548be6194d9" containerID="4174001cd6897bd2f49364208aadfa47aab9074ae6e54e9e8c0954335a03a299" exitCode=2
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.489893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1dfd277-77e2-41fb-ab90-1548be6194d9","Type":"ContainerDied","Data":"4174001cd6897bd2f49364208aadfa47aab9074ae6e54e9e8c0954335a03a299"}
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.657972 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.715750 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fklns\" (UniqueName: \"kubernetes.io/projected/d1dfd277-77e2-41fb-ab90-1548be6194d9-kube-api-access-fklns\") pod \"d1dfd277-77e2-41fb-ab90-1548be6194d9\" (UID: \"d1dfd277-77e2-41fb-ab90-1548be6194d9\") "
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.722151 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1dfd277-77e2-41fb-ab90-1548be6194d9-kube-api-access-fklns" (OuterVolumeSpecName: "kube-api-access-fklns") pod "d1dfd277-77e2-41fb-ab90-1548be6194d9" (UID: "d1dfd277-77e2-41fb-ab90-1548be6194d9"). InnerVolumeSpecName "kube-api-access-fklns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:54:29 crc kubenswrapper[4786]: I0313 16:54:29.818124 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fklns\" (UniqueName: \"kubernetes.io/projected/d1dfd277-77e2-41fb-ab90-1548be6194d9-kube-api-access-fklns\") on node \"crc\" DevicePath \"\""
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.505074 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1dfd277-77e2-41fb-ab90-1548be6194d9","Type":"ContainerDied","Data":"973a6ab88f7b1d302acda236ba0069d5845763aa3c520b0320a4ac64ac667f35"}
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.505170 4786 scope.go:117] "RemoveContainer" containerID="4174001cd6897bd2f49364208aadfa47aab9074ae6e54e9e8c0954335a03a299"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.505103 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.583360 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.593036 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.619777 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 16:54:30 crc kubenswrapper[4786]: E0313 16:54:30.620411 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dfd277-77e2-41fb-ab90-1548be6194d9" containerName="kube-state-metrics"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.620432 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dfd277-77e2-41fb-ab90-1548be6194d9" containerName="kube-state-metrics"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.620658 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1dfd277-77e2-41fb-ab90-1548be6194d9" containerName="kube-state-metrics"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.621517 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.624488 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.624517 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.641529 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.737197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.737267 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9l5\" (UniqueName: \"kubernetes.io/projected/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-api-access-2g9l5\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.737310 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.737389 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.839032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.840015 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.840102 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9l5\" (UniqueName: \"kubernetes.io/projected/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-api-access-2g9l5\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.840168 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.844321 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.845248 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.848430 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.868712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9l5\" (UniqueName: \"kubernetes.io/projected/47c2f005-6321-48f7-a9e4-2abca43199ff-kube-api-access-2g9l5\") pod \"kube-state-metrics-0\" (UID: \"47c2f005-6321-48f7-a9e4-2abca43199ff\") " pod="openstack/kube-state-metrics-0"
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.922397 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.922947 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-central-agent" containerID="cri-o://72dacf12db0160c865c91620bdac662538376aad0b1a1fc764b9dfdada220182" gracePeriod=30
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.922972 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="proxy-httpd" containerID="cri-o://e6f3f1764e42ef13bd1b493edfd59f0b52d74583c16747aaba4bd5581cb5127a" gracePeriod=30
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.923017 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="sg-core" containerID="cri-o://55240f5a4397d0c52cfd005ae4a0f0bb659abef2062318bfb4f83667e3ab68b3" gracePeriod=30
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.923011 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-notification-agent" containerID="cri-o://c6904aa577457f495451957a3160349147c824d2924a3a4fd1e65dfa49e49375" gracePeriod=30
Mar 13 16:54:30 crc kubenswrapper[4786]: I0313 16:54:30.950716 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.463721 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.519058 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"47c2f005-6321-48f7-a9e4-2abca43199ff","Type":"ContainerStarted","Data":"5ab1df3f13746ea92405816416124e64031dc4078a66a5d04c5fc66c3f9fa23c"}
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.524663 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerID="e6f3f1764e42ef13bd1b493edfd59f0b52d74583c16747aaba4bd5581cb5127a" exitCode=0
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.524696 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerID="55240f5a4397d0c52cfd005ae4a0f0bb659abef2062318bfb4f83667e3ab68b3" exitCode=2
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.524708 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerID="72dacf12db0160c865c91620bdac662538376aad0b1a1fc764b9dfdada220182" exitCode=0
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.524729 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerDied","Data":"e6f3f1764e42ef13bd1b493edfd59f0b52d74583c16747aaba4bd5581cb5127a"}
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.524754 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerDied","Data":"55240f5a4397d0c52cfd005ae4a0f0bb659abef2062318bfb4f83667e3ab68b3"}
Mar 13 16:54:31 crc kubenswrapper[4786]: I0313 16:54:31.524768 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerDied","Data":"72dacf12db0160c865c91620bdac662538376aad0b1a1fc764b9dfdada220182"}
Mar 13 16:54:32 crc kubenswrapper[4786]: I0313 16:54:32.536283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"47c2f005-6321-48f7-a9e4-2abca43199ff","Type":"ContainerStarted","Data":"6d270b1300f987b929bbc724fc94c2483b94b4c80d3b015c72a786eb76af6f13"}
Mar 13 16:54:32 crc kubenswrapper[4786]: I0313 16:54:32.536580 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 13 16:54:32 crc kubenswrapper[4786]: I0313 16:54:32.566345 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.200158415 podStartE2EDuration="2.566316205s" podCreationTimestamp="2026-03-13 16:54:30 +0000 UTC" firstStartedPulling="2026-03-13 16:54:31.477034748 +0000 UTC m=+6701.640246559" lastFinishedPulling="2026-03-13 16:54:31.843192498 +0000 UTC m=+6702.006404349" observedRunningTime="2026-03-13 16:54:32.565144976 +0000 UTC m=+6702.728356817" watchObservedRunningTime="2026-03-13 16:54:32.566316205 +0000 UTC m=+6702.729528056"
Mar 13 16:54:32 crc kubenswrapper[4786]: I0313 16:54:32.574145 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1dfd277-77e2-41fb-ab90-1548be6194d9" path="/var/lib/kubelet/pods/d1dfd277-77e2-41fb-ab90-1548be6194d9/volumes"
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.553282 4786 generic.go:334] "Generic (PLEG): container finished" podID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerID="c6904aa577457f495451957a3160349147c824d2924a3a4fd1e65dfa49e49375" exitCode=0
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.555845 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerDied","Data":"c6904aa577457f495451957a3160349147c824d2924a3a4fd1e65dfa49e49375"}
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.881095 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.914540 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h47vs\" (UniqueName: \"kubernetes.io/projected/1ac22059-3e77-45a2-a7a9-27b92d221e05-kube-api-access-h47vs\") pod \"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") "
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.914637 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-config-data\") pod \"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") "
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.914695 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-scripts\") pod \"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") "
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.914846 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-combined-ca-bundle\") pod \"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") "
Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.915457 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-log-httpd\") pod 
\"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.915511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-sg-core-conf-yaml\") pod \"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.915762 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-run-httpd\") pod \"1ac22059-3e77-45a2-a7a9-27b92d221e05\" (UID: \"1ac22059-3e77-45a2-a7a9-27b92d221e05\") " Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.917508 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.925250 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.943119 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac22059-3e77-45a2-a7a9-27b92d221e05-kube-api-access-h47vs" (OuterVolumeSpecName: "kube-api-access-h47vs") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). InnerVolumeSpecName "kube-api-access-h47vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.955275 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:54:33 crc kubenswrapper[4786]: I0313 16:54:33.955623 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-scripts" (OuterVolumeSpecName: "scripts") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.012614 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.018409 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.018437 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h47vs\" (UniqueName: \"kubernetes.io/projected/1ac22059-3e77-45a2-a7a9-27b92d221e05-kube-api-access-h47vs\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.018449 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.018458 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.018466 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac22059-3e77-45a2-a7a9-27b92d221e05-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.018476 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.059412 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-config-data" (OuterVolumeSpecName: "config-data") pod "1ac22059-3e77-45a2-a7a9-27b92d221e05" (UID: "1ac22059-3e77-45a2-a7a9-27b92d221e05"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.121079 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac22059-3e77-45a2-a7a9-27b92d221e05-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.584385 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac22059-3e77-45a2-a7a9-27b92d221e05","Type":"ContainerDied","Data":"7ac824fdb40631b22374161fa7df47dac913a7d5b4b14e7137bb8b50308c4985"} Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.584461 4786 scope.go:117] "RemoveContainer" containerID="e6f3f1764e42ef13bd1b493edfd59f0b52d74583c16747aaba4bd5581cb5127a" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.584628 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.610270 4786 scope.go:117] "RemoveContainer" containerID="55240f5a4397d0c52cfd005ae4a0f0bb659abef2062318bfb4f83667e3ab68b3" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.654030 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.665850 4786 scope.go:117] "RemoveContainer" containerID="c6904aa577457f495451957a3160349147c824d2924a3a4fd1e65dfa49e49375" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.669465 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.681226 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:54:34 crc kubenswrapper[4786]: E0313 16:54:34.681786 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" 
containerName="ceilometer-notification-agent" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.681813 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-notification-agent" Mar 13 16:54:34 crc kubenswrapper[4786]: E0313 16:54:34.681884 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-central-agent" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.681898 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-central-agent" Mar 13 16:54:34 crc kubenswrapper[4786]: E0313 16:54:34.681916 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="proxy-httpd" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.681939 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="proxy-httpd" Mar 13 16:54:34 crc kubenswrapper[4786]: E0313 16:54:34.682006 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="sg-core" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.682022 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="sg-core" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.682292 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-central-agent" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.682346 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="ceilometer-notification-agent" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.682367 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="sg-core" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.682383 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" containerName="proxy-httpd" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.687765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.690320 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.690976 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.691250 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.696337 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.698799 4786 scope.go:117] "RemoveContainer" containerID="72dacf12db0160c865c91620bdac662538376aad0b1a1fc764b9dfdada220182" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734094 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734160 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-scripts\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " 
pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734186 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l24h8\" (UniqueName: \"kubernetes.io/projected/573cb783-9c37-4274-a50d-e1a85d94feda-kube-api-access-l24h8\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/573cb783-9c37-4274-a50d-e1a85d94feda-log-httpd\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734654 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/573cb783-9c37-4274-a50d-e1a85d94feda-run-httpd\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734743 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-config-data\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.734994 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837643 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837704 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-scripts\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837725 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837785 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l24h8\" (UniqueName: \"kubernetes.io/projected/573cb783-9c37-4274-a50d-e1a85d94feda-kube-api-access-l24h8\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837803 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/573cb783-9c37-4274-a50d-e1a85d94feda-log-httpd\") pod \"ceilometer-0\" (UID: 
\"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837850 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/573cb783-9c37-4274-a50d-e1a85d94feda-run-httpd\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837890 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-config-data\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.837940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.838719 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/573cb783-9c37-4274-a50d-e1a85d94feda-run-httpd\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.838933 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/573cb783-9c37-4274-a50d-e1a85d94feda-log-httpd\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.845597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-config-data\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.846944 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-scripts\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.847810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.848365 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.852518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573cb783-9c37-4274-a50d-e1a85d94feda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:34 crc kubenswrapper[4786]: I0313 16:54:34.857540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l24h8\" (UniqueName: \"kubernetes.io/projected/573cb783-9c37-4274-a50d-e1a85d94feda-kube-api-access-l24h8\") pod \"ceilometer-0\" (UID: \"573cb783-9c37-4274-a50d-e1a85d94feda\") " pod="openstack/ceilometer-0" Mar 13 16:54:35 crc kubenswrapper[4786]: I0313 16:54:35.016775 
4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 16:54:35 crc kubenswrapper[4786]: I0313 16:54:35.550103 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 16:54:35 crc kubenswrapper[4786]: W0313 16:54:35.554909 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod573cb783_9c37_4274_a50d_e1a85d94feda.slice/crio-ba36c2f914e48d425ec13e46d80208720a29a2e5ad315def18933d328f4d9499 WatchSource:0}: Error finding container ba36c2f914e48d425ec13e46d80208720a29a2e5ad315def18933d328f4d9499: Status 404 returned error can't find the container with id ba36c2f914e48d425ec13e46d80208720a29a2e5ad315def18933d328f4d9499 Mar 13 16:54:35 crc kubenswrapper[4786]: I0313 16:54:35.600034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"573cb783-9c37-4274-a50d-e1a85d94feda","Type":"ContainerStarted","Data":"ba36c2f914e48d425ec13e46d80208720a29a2e5ad315def18933d328f4d9499"} Mar 13 16:54:36 crc kubenswrapper[4786]: I0313 16:54:36.569715 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac22059-3e77-45a2-a7a9-27b92d221e05" path="/var/lib/kubelet/pods/1ac22059-3e77-45a2-a7a9-27b92d221e05/volumes" Mar 13 16:54:36 crc kubenswrapper[4786]: I0313 16:54:36.616448 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"573cb783-9c37-4274-a50d-e1a85d94feda","Type":"ContainerStarted","Data":"4569c1ac566ea6fb6dc499223fe0a3d850a73074cc9fa5d880080bb298e4cdd1"} Mar 13 16:54:37 crc kubenswrapper[4786]: I0313 16:54:37.633811 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"573cb783-9c37-4274-a50d-e1a85d94feda","Type":"ContainerStarted","Data":"719e32f690331803c2d2da191fc47c8808d6656bf8f8aa8948891e4385fcc702"} Mar 13 16:54:38 crc kubenswrapper[4786]: I0313 
16:54:38.649069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"573cb783-9c37-4274-a50d-e1a85d94feda","Type":"ContainerStarted","Data":"887fa924174b784fa799282765b343987605df9858b914a2183487b8d67d1ffa"} Mar 13 16:54:39 crc kubenswrapper[4786]: I0313 16:54:39.039304 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dkm6m"] Mar 13 16:54:39 crc kubenswrapper[4786]: I0313 16:54:39.052923 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dkm6m"] Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.031914 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmn4t"] Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.040092 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mmn4t"] Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.597416 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ef6159-b90f-418a-97d3-e256d87aefb5" path="/var/lib/kubelet/pods/76ef6159-b90f-418a-97d3-e256d87aefb5/volumes" Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.607566 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59cb012-12c2-4674-850a-d9638f76670d" path="/var/lib/kubelet/pods/e59cb012-12c2-4674-850a-d9638f76670d/volumes" Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.687876 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"573cb783-9c37-4274-a50d-e1a85d94feda","Type":"ContainerStarted","Data":"37f20d33ec33f6b8c5ef01e4b4ac5194079c5334d1f56c7ddc6e3b9782b3fa35"} Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.688200 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.725613 4786 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.785068536 podStartE2EDuration="6.725589817s" podCreationTimestamp="2026-03-13 16:54:34 +0000 UTC" firstStartedPulling="2026-03-13 16:54:35.559452393 +0000 UTC m=+6705.722664244" lastFinishedPulling="2026-03-13 16:54:39.499973714 +0000 UTC m=+6709.663185525" observedRunningTime="2026-03-13 16:54:40.724167831 +0000 UTC m=+6710.887379652" watchObservedRunningTime="2026-03-13 16:54:40.725589817 +0000 UTC m=+6710.888801668" Mar 13 16:54:40 crc kubenswrapper[4786]: I0313 16:54:40.964488 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 16:54:58 crc kubenswrapper[4786]: I0313 16:54:58.083156 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xrdtt"] Mar 13 16:54:58 crc kubenswrapper[4786]: I0313 16:54:58.094983 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xrdtt"] Mar 13 16:54:58 crc kubenswrapper[4786]: I0313 16:54:58.576692 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68020e92-0f8a-4061-8308-a3a20f80b4f3" path="/var/lib/kubelet/pods/68020e92-0f8a-4061-8308-a3a20f80b4f3/volumes" Mar 13 16:55:05 crc kubenswrapper[4786]: I0313 16:55:05.034925 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 16:55:07 crc kubenswrapper[4786]: I0313 16:55:07.869226 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:55:07 crc kubenswrapper[4786]: I0313 16:55:07.869830 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:55:18 crc kubenswrapper[4786]: I0313 16:55:18.657523 4786 scope.go:117] "RemoveContainer" containerID="2764304bcc8da9df6388222c8e88cec2fb0266ab2e1596b78027f30b57b21292" Mar 13 16:55:18 crc kubenswrapper[4786]: I0313 16:55:18.732082 4786 scope.go:117] "RemoveContainer" containerID="9118afe520cae42ed9ed04b641fd4161c1f159a572113ef1f608c799e44d85e9" Mar 13 16:55:18 crc kubenswrapper[4786]: I0313 16:55:18.794703 4786 scope.go:117] "RemoveContainer" containerID="94bbeea2fbe22940c761ac71078fe115aa8dc96a579d0c317c11767385a4334a" Mar 13 16:55:18 crc kubenswrapper[4786]: I0313 16:55:18.848384 4786 scope.go:117] "RemoveContainer" containerID="b8cd6f5b561b22034a89ae3b3d6f259bb4169e8a9c5c2988ba829fe9ebde792a" Mar 13 16:55:37 crc kubenswrapper[4786]: I0313 16:55:37.868926 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:55:37 crc kubenswrapper[4786]: I0313 16:55:37.869663 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.182114 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557016-rbnld"] Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.185167 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.188210 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.188782 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.188916 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.193364 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557016-rbnld"] Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.248239 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snkww\" (UniqueName: \"kubernetes.io/projected/87dfaa69-9c28-4b64-8544-114860c0a557-kube-api-access-snkww\") pod \"auto-csr-approver-29557016-rbnld\" (UID: \"87dfaa69-9c28-4b64-8544-114860c0a557\") " pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.351406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snkww\" (UniqueName: \"kubernetes.io/projected/87dfaa69-9c28-4b64-8544-114860c0a557-kube-api-access-snkww\") pod \"auto-csr-approver-29557016-rbnld\" (UID: \"87dfaa69-9c28-4b64-8544-114860c0a557\") " pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.384400 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkww\" (UniqueName: \"kubernetes.io/projected/87dfaa69-9c28-4b64-8544-114860c0a557-kube-api-access-snkww\") pod \"auto-csr-approver-29557016-rbnld\" (UID: \"87dfaa69-9c28-4b64-8544-114860c0a557\") " 
pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.511356 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:00 crc kubenswrapper[4786]: I0313 16:56:00.900627 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557016-rbnld"] Mar 13 16:56:01 crc kubenswrapper[4786]: I0313 16:56:01.651945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557016-rbnld" event={"ID":"87dfaa69-9c28-4b64-8544-114860c0a557","Type":"ContainerStarted","Data":"1ba6050e971b140b413cc1cef14e8eac9c2489d6be9e7298022e9b72c6a82771"} Mar 13 16:56:03 crc kubenswrapper[4786]: I0313 16:56:03.678940 4786 generic.go:334] "Generic (PLEG): container finished" podID="87dfaa69-9c28-4b64-8544-114860c0a557" containerID="d6235fc18b9fe2323be60f1661fd1f7aaed1d3da0bb86fe4dd4bb31c99520b04" exitCode=0 Mar 13 16:56:03 crc kubenswrapper[4786]: I0313 16:56:03.679169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557016-rbnld" event={"ID":"87dfaa69-9c28-4b64-8544-114860c0a557","Type":"ContainerDied","Data":"d6235fc18b9fe2323be60f1661fd1f7aaed1d3da0bb86fe4dd4bb31c99520b04"} Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.109334 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.262376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snkww\" (UniqueName: \"kubernetes.io/projected/87dfaa69-9c28-4b64-8544-114860c0a557-kube-api-access-snkww\") pod \"87dfaa69-9c28-4b64-8544-114860c0a557\" (UID: \"87dfaa69-9c28-4b64-8544-114860c0a557\") " Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.271176 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87dfaa69-9c28-4b64-8544-114860c0a557-kube-api-access-snkww" (OuterVolumeSpecName: "kube-api-access-snkww") pod "87dfaa69-9c28-4b64-8544-114860c0a557" (UID: "87dfaa69-9c28-4b64-8544-114860c0a557"). InnerVolumeSpecName "kube-api-access-snkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.365041 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snkww\" (UniqueName: \"kubernetes.io/projected/87dfaa69-9c28-4b64-8544-114860c0a557-kube-api-access-snkww\") on node \"crc\" DevicePath \"\"" Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.699577 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557016-rbnld" event={"ID":"87dfaa69-9c28-4b64-8544-114860c0a557","Type":"ContainerDied","Data":"1ba6050e971b140b413cc1cef14e8eac9c2489d6be9e7298022e9b72c6a82771"} Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.699617 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba6050e971b140b413cc1cef14e8eac9c2489d6be9e7298022e9b72c6a82771" Mar 13 16:56:05 crc kubenswrapper[4786]: I0313 16:56:05.699667 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557016-rbnld" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.214184 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557010-wtt5x"] Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.230130 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557010-wtt5x"] Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.582184 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20fb1ba-8c7f-4ca9-a73f-43f515ee9112" path="/var/lib/kubelet/pods/c20fb1ba-8c7f-4ca9-a73f-43f515ee9112/volumes" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.879018 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8s2h"] Mar 13 16:56:06 crc kubenswrapper[4786]: E0313 16:56:06.879644 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dfaa69-9c28-4b64-8544-114860c0a557" containerName="oc" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.879675 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dfaa69-9c28-4b64-8544-114860c0a557" containerName="oc" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.880041 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="87dfaa69-9c28-4b64-8544-114860c0a557" containerName="oc" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.882386 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.895035 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8s2h"] Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.902985 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfv5\" (UniqueName: \"kubernetes.io/projected/4d20d1cd-ed93-4848-bd29-95a6737f2b40-kube-api-access-rnfv5\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.903524 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-utilities\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:06 crc kubenswrapper[4786]: I0313 16:56:06.903777 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-catalog-content\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.009176 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-utilities\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.009304 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-catalog-content\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.009463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfv5\" (UniqueName: \"kubernetes.io/projected/4d20d1cd-ed93-4848-bd29-95a6737f2b40-kube-api-access-rnfv5\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.010060 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-utilities\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.010093 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-catalog-content\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.031944 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfv5\" (UniqueName: \"kubernetes.io/projected/4d20d1cd-ed93-4848-bd29-95a6737f2b40-kube-api-access-rnfv5\") pod \"community-operators-r8s2h\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.202503 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.551164 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8s2h"] Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.720223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerStarted","Data":"90b6eeb0a5e5f8bce4446d0211d8f5cab7344d6af1082b5bcec2952fd0105723"} Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.868165 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.868479 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.868540 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 16:56:07 crc kubenswrapper[4786]: I0313 16:56:07.869663 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 16:56:07 crc 
kubenswrapper[4786]: I0313 16:56:07.869763 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" gracePeriod=600 Mar 13 16:56:07 crc kubenswrapper[4786]: E0313 16:56:07.996951 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:56:08 crc kubenswrapper[4786]: I0313 16:56:08.738110 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerID="d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f" exitCode=0 Mar 13 16:56:08 crc kubenswrapper[4786]: I0313 16:56:08.738217 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerDied","Data":"d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f"} Mar 13 16:56:08 crc kubenswrapper[4786]: I0313 16:56:08.748242 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" exitCode=0 Mar 13 16:56:08 crc kubenswrapper[4786]: I0313 16:56:08.748296 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" 
event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"} Mar 13 16:56:08 crc kubenswrapper[4786]: I0313 16:56:08.748341 4786 scope.go:117] "RemoveContainer" containerID="b49392c99cf5e104c67fff6a8b879c097bd7fea9986e8b283b323621dcd6d857" Mar 13 16:56:08 crc kubenswrapper[4786]: I0313 16:56:08.749447 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:56:08 crc kubenswrapper[4786]: E0313 16:56:08.750201 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:56:10 crc kubenswrapper[4786]: I0313 16:56:10.784536 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerStarted","Data":"c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062"} Mar 13 16:56:12 crc kubenswrapper[4786]: I0313 16:56:12.814505 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerID="c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062" exitCode=0 Mar 13 16:56:12 crc kubenswrapper[4786]: I0313 16:56:12.814702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerDied","Data":"c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062"} Mar 13 16:56:14 crc kubenswrapper[4786]: I0313 16:56:14.843502 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerStarted","Data":"a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb"} Mar 13 16:56:14 crc kubenswrapper[4786]: I0313 16:56:14.885778 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8s2h" podStartSLOduration=3.366998374 podStartE2EDuration="8.885751206s" podCreationTimestamp="2026-03-13 16:56:06 +0000 UTC" firstStartedPulling="2026-03-13 16:56:08.740641648 +0000 UTC m=+6798.903853499" lastFinishedPulling="2026-03-13 16:56:14.25939449 +0000 UTC m=+6804.422606331" observedRunningTime="2026-03-13 16:56:14.865409295 +0000 UTC m=+6805.028621146" watchObservedRunningTime="2026-03-13 16:56:14.885751206 +0000 UTC m=+6805.048963057" Mar 13 16:56:17 crc kubenswrapper[4786]: I0313 16:56:17.203575 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:17 crc kubenswrapper[4786]: I0313 16:56:17.204019 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:18 crc kubenswrapper[4786]: I0313 16:56:18.281509 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r8s2h" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="registry-server" probeResult="failure" output=< Mar 13 16:56:18 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Mar 13 16:56:18 crc kubenswrapper[4786]: > Mar 13 16:56:19 crc kubenswrapper[4786]: I0313 16:56:19.070746 4786 scope.go:117] "RemoveContainer" containerID="14fd9b0a6a02aa1636d59b4a03ad8882b3192a8d047388e85a0163a3f1e4a77a" Mar 13 16:56:21 crc kubenswrapper[4786]: I0313 16:56:21.551775 4786 scope.go:117] "RemoveContainer" 
containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:56:21 crc kubenswrapper[4786]: E0313 16:56:21.552749 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:56:27 crc kubenswrapper[4786]: I0313 16:56:27.281765 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:27 crc kubenswrapper[4786]: I0313 16:56:27.360320 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:27 crc kubenswrapper[4786]: I0313 16:56:27.547136 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8s2h"] Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.022206 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8s2h" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="registry-server" containerID="cri-o://a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb" gracePeriod=2 Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.569762 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.653849 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfv5\" (UniqueName: \"kubernetes.io/projected/4d20d1cd-ed93-4848-bd29-95a6737f2b40-kube-api-access-rnfv5\") pod \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.653931 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-utilities\") pod \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.653967 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-catalog-content\") pod \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\" (UID: \"4d20d1cd-ed93-4848-bd29-95a6737f2b40\") " Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.655479 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-utilities" (OuterVolumeSpecName: "utilities") pod "4d20d1cd-ed93-4848-bd29-95a6737f2b40" (UID: "4d20d1cd-ed93-4848-bd29-95a6737f2b40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.655695 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.666370 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d20d1cd-ed93-4848-bd29-95a6737f2b40-kube-api-access-rnfv5" (OuterVolumeSpecName: "kube-api-access-rnfv5") pod "4d20d1cd-ed93-4848-bd29-95a6737f2b40" (UID: "4d20d1cd-ed93-4848-bd29-95a6737f2b40"). InnerVolumeSpecName "kube-api-access-rnfv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.726608 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d20d1cd-ed93-4848-bd29-95a6737f2b40" (UID: "4d20d1cd-ed93-4848-bd29-95a6737f2b40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.757035 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d20d1cd-ed93-4848-bd29-95a6737f2b40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:56:29 crc kubenswrapper[4786]: I0313 16:56:29.757061 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfv5\" (UniqueName: \"kubernetes.io/projected/4d20d1cd-ed93-4848-bd29-95a6737f2b40-kube-api-access-rnfv5\") on node \"crc\" DevicePath \"\"" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.048282 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerID="a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb" exitCode=0 Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.048337 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerDied","Data":"a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb"} Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.048372 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8s2h" event={"ID":"4d20d1cd-ed93-4848-bd29-95a6737f2b40","Type":"ContainerDied","Data":"90b6eeb0a5e5f8bce4446d0211d8f5cab7344d6af1082b5bcec2952fd0105723"} Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.048400 4786 scope.go:117] "RemoveContainer" containerID="a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.048410 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8s2h" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.075155 4786 scope.go:117] "RemoveContainer" containerID="c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.099302 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8s2h"] Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.109087 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8s2h"] Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.123121 4786 scope.go:117] "RemoveContainer" containerID="d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.159097 4786 scope.go:117] "RemoveContainer" containerID="a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb" Mar 13 16:56:30 crc kubenswrapper[4786]: E0313 16:56:30.159527 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb\": container with ID starting with a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb not found: ID does not exist" containerID="a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.159578 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb"} err="failed to get container status \"a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb\": rpc error: code = NotFound desc = could not find container \"a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb\": container with ID starting with a1308ac993de575aae631d3e4e9a416b09ae49b8bc09cef0709d8f2bfd19b1bb not 
found: ID does not exist" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.159599 4786 scope.go:117] "RemoveContainer" containerID="c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062" Mar 13 16:56:30 crc kubenswrapper[4786]: E0313 16:56:30.159940 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062\": container with ID starting with c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062 not found: ID does not exist" containerID="c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.159973 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062"} err="failed to get container status \"c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062\": rpc error: code = NotFound desc = could not find container \"c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062\": container with ID starting with c23c39cc6a37390af3aabe58e22974cfbd9deb3bc95c3c64c855650ab519e062 not found: ID does not exist" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.159988 4786 scope.go:117] "RemoveContainer" containerID="d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f" Mar 13 16:56:30 crc kubenswrapper[4786]: E0313 16:56:30.160240 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f\": container with ID starting with d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f not found: ID does not exist" containerID="d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.160368 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f"} err="failed to get container status \"d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f\": rpc error: code = NotFound desc = could not find container \"d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f\": container with ID starting with d343d70e5edf184400b98d41322cebc643ba8c41be129d7e04e152cb23ecca1f not found: ID does not exist" Mar 13 16:56:30 crc kubenswrapper[4786]: I0313 16:56:30.577169 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" path="/var/lib/kubelet/pods/4d20d1cd-ed93-4848-bd29-95a6737f2b40/volumes" Mar 13 16:56:34 crc kubenswrapper[4786]: I0313 16:56:34.552838 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:56:34 crc kubenswrapper[4786]: E0313 16:56:34.554278 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:56:47 crc kubenswrapper[4786]: I0313 16:56:47.552905 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:56:47 crc kubenswrapper[4786]: E0313 16:56:47.554268 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:56:50 crc kubenswrapper[4786]: I0313 16:56:50.082714 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-qgn7f"] Mar 13 16:56:50 crc kubenswrapper[4786]: I0313 16:56:50.094262 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-qgn7f"] Mar 13 16:56:50 crc kubenswrapper[4786]: I0313 16:56:50.575228 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231c608d-b1ee-445f-af64-45aa3e82cca0" path="/var/lib/kubelet/pods/231c608d-b1ee-445f-af64-45aa3e82cca0/volumes" Mar 13 16:56:51 crc kubenswrapper[4786]: I0313 16:56:51.056673 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-9515-account-create-update-nqqng"] Mar 13 16:56:51 crc kubenswrapper[4786]: I0313 16:56:51.068318 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-9515-account-create-update-nqqng"] Mar 13 16:56:52 crc kubenswrapper[4786]: I0313 16:56:52.566771 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d662abaa-4326-48f5-b0ff-7ab37206b48c" path="/var/lib/kubelet/pods/d662abaa-4326-48f5-b0ff-7ab37206b48c/volumes" Mar 13 16:56:56 crc kubenswrapper[4786]: I0313 16:56:56.064399 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-gk5br"] Mar 13 16:56:56 crc kubenswrapper[4786]: I0313 16:56:56.098426 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-gk5br"] Mar 13 16:56:56 crc kubenswrapper[4786]: I0313 16:56:56.562544 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6472d1-1200-49d1-876c-b688bb2f4a14" path="/var/lib/kubelet/pods/5a6472d1-1200-49d1-876c-b688bb2f4a14/volumes" Mar 13 16:56:57 
crc kubenswrapper[4786]: I0313 16:56:57.041269 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-8518-account-create-update-52bsl"]
Mar 13 16:56:57 crc kubenswrapper[4786]: I0313 16:56:57.059997 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-8518-account-create-update-52bsl"]
Mar 13 16:56:58 crc kubenswrapper[4786]: I0313 16:56:58.575618 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e558f33f-13cc-451c-8d7c-93990aa61ad7" path="/var/lib/kubelet/pods/e558f33f-13cc-451c-8d7c-93990aa61ad7/volumes"
Mar 13 16:57:01 crc kubenswrapper[4786]: I0313 16:57:01.553169 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:57:01 crc kubenswrapper[4786]: E0313 16:57:01.554119 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.087843 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8fmsf/must-gather-smw9x"]
Mar 13 16:57:02 crc kubenswrapper[4786]: E0313 16:57:02.088634 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="extract-utilities"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.088653 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="extract-utilities"
Mar 13 16:57:02 crc kubenswrapper[4786]: E0313 16:57:02.088676 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="extract-content"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.088686 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="extract-content"
Mar 13 16:57:02 crc kubenswrapper[4786]: E0313 16:57:02.088717 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="registry-server"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.088727 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="registry-server"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.089087 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d20d1cd-ed93-4848-bd29-95a6737f2b40" containerName="registry-server"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.118335 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.146527 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8fmsf"/"kube-root-ca.crt"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.147324 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8fmsf"/"openshift-service-ca.crt"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.147531 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8fmsf"/"default-dockercfg-w4mnr"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.184923 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8fmsf/must-gather-smw9x"]
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.283620 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whgnq\" (UniqueName: \"kubernetes.io/projected/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-kube-api-access-whgnq\") pod \"must-gather-smw9x\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.283702 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-must-gather-output\") pod \"must-gather-smw9x\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.386154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whgnq\" (UniqueName: \"kubernetes.io/projected/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-kube-api-access-whgnq\") pod \"must-gather-smw9x\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.386235 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-must-gather-output\") pod \"must-gather-smw9x\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.386875 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-must-gather-output\") pod \"must-gather-smw9x\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.417003 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whgnq\" (UniqueName: \"kubernetes.io/projected/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-kube-api-access-whgnq\") pod \"must-gather-smw9x\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.453131 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/must-gather-smw9x"
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.957829 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8fmsf/must-gather-smw9x"]
Mar 13 16:57:02 crc kubenswrapper[4786]: I0313 16:57:02.959398 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 16:57:03 crc kubenswrapper[4786]: I0313 16:57:03.441025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/must-gather-smw9x" event={"ID":"0b73a37c-f416-4418-bc3d-6752b4dbf7d8","Type":"ContainerStarted","Data":"a34bfbff6ce2962a4efc35c2dd4e9ee6d3c09bf73179b3acdbd98361b4c84755"}
Mar 13 16:57:11 crc kubenswrapper[4786]: I0313 16:57:11.531007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/must-gather-smw9x" event={"ID":"0b73a37c-f416-4418-bc3d-6752b4dbf7d8","Type":"ContainerStarted","Data":"cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891"}
Mar 13 16:57:11 crc kubenswrapper[4786]: I0313 16:57:11.531830 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/must-gather-smw9x" event={"ID":"0b73a37c-f416-4418-bc3d-6752b4dbf7d8","Type":"ContainerStarted","Data":"28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94"}
Mar 13 16:57:11 crc kubenswrapper[4786]: I0313 16:57:11.550437 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8fmsf/must-gather-smw9x" podStartSLOduration=2.006633656 podStartE2EDuration="9.550419464s" podCreationTimestamp="2026-03-13 16:57:02 +0000 UTC" firstStartedPulling="2026-03-13 16:57:02.959172079 +0000 UTC m=+6853.122383880" lastFinishedPulling="2026-03-13 16:57:10.502957877 +0000 UTC m=+6860.666169688" observedRunningTime="2026-03-13 16:57:11.546742772 +0000 UTC m=+6861.709954623" watchObservedRunningTime="2026-03-13 16:57:11.550419464 +0000 UTC m=+6861.713631285"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.259235 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8fmsf/crc-debug-dsnjx"]
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.262207 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.382053 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a82989-d83f-437e-93de-994a94556815-host\") pod \"crc-debug-dsnjx\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") " pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.383545 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8drg\" (UniqueName: \"kubernetes.io/projected/25a82989-d83f-437e-93de-994a94556815-kube-api-access-q8drg\") pod \"crc-debug-dsnjx\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") " pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.486415 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a82989-d83f-437e-93de-994a94556815-host\") pod \"crc-debug-dsnjx\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") " pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.486549 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a82989-d83f-437e-93de-994a94556815-host\") pod \"crc-debug-dsnjx\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") " pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.487246 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8drg\" (UniqueName: \"kubernetes.io/projected/25a82989-d83f-437e-93de-994a94556815-kube-api-access-q8drg\") pod \"crc-debug-dsnjx\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") " pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.524030 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8drg\" (UniqueName: \"kubernetes.io/projected/25a82989-d83f-437e-93de-994a94556815-kube-api-access-q8drg\") pod \"crc-debug-dsnjx\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") " pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: I0313 16:57:15.585126 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:15 crc kubenswrapper[4786]: W0313 16:57:15.620514 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25a82989_d83f_437e_93de_994a94556815.slice/crio-3d015984229439a00ff3dd83c22d5c1970072e30b859a66ae4b5ee04342255bc WatchSource:0}: Error finding container 3d015984229439a00ff3dd83c22d5c1970072e30b859a66ae4b5ee04342255bc: Status 404 returned error can't find the container with id 3d015984229439a00ff3dd83c22d5c1970072e30b859a66ae4b5ee04342255bc
Mar 13 16:57:16 crc kubenswrapper[4786]: I0313 16:57:16.552993 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:57:16 crc kubenswrapper[4786]: E0313 16:57:16.553805 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:57:16 crc kubenswrapper[4786]: I0313 16:57:16.595097 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx" event={"ID":"25a82989-d83f-437e-93de-994a94556815","Type":"ContainerStarted","Data":"3d015984229439a00ff3dd83c22d5c1970072e30b859a66ae4b5ee04342255bc"}
Mar 13 16:57:19 crc kubenswrapper[4786]: I0313 16:57:19.169066 4786 scope.go:117] "RemoveContainer" containerID="91d398c9242cbb5a9c80557765acf9b55a66ede9d7aa1c1c239d0e6e73e44b07"
Mar 13 16:57:19 crc kubenswrapper[4786]: I0313 16:57:19.216011 4786 scope.go:117] "RemoveContainer" containerID="c4064521b891c40bb276a45439e34babfff6d9016d94b0404c3466785e0bf398"
Mar 13 16:57:19 crc kubenswrapper[4786]: I0313 16:57:19.238392 4786 scope.go:117] "RemoveContainer" containerID="5bd36d7bc8267ea28728850cbbfeacbdc8837a28a601fdb9983e9d47eae18d2a"
Mar 13 16:57:19 crc kubenswrapper[4786]: I0313 16:57:19.289754 4786 scope.go:117] "RemoveContainer" containerID="536dd126e5528f984a7ded958ac4998ac0db1c246ddbfd37f21c88773e2ad582"
Mar 13 16:57:27 crc kubenswrapper[4786]: I0313 16:57:27.552594 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:57:27 crc kubenswrapper[4786]: E0313 16:57:27.553529 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:57:27 crc kubenswrapper[4786]: I0313 16:57:27.695316 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx" event={"ID":"25a82989-d83f-437e-93de-994a94556815","Type":"ContainerStarted","Data":"9c0c5d34adc6517b7f1e42d3fb8e68d96534bdabfedb8787e860b43a5a85da44"}
Mar 13 16:57:40 crc kubenswrapper[4786]: I0313 16:57:40.559413 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:57:40 crc kubenswrapper[4786]: E0313 16:57:40.560602 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:57:43 crc kubenswrapper[4786]: I0313 16:57:43.878339 4786 generic.go:334] "Generic (PLEG): container finished" podID="25a82989-d83f-437e-93de-994a94556815" containerID="9c0c5d34adc6517b7f1e42d3fb8e68d96534bdabfedb8787e860b43a5a85da44" exitCode=0
Mar 13 16:57:43 crc kubenswrapper[4786]: I0313 16:57:43.879050 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx" event={"ID":"25a82989-d83f-437e-93de-994a94556815","Type":"ContainerDied","Data":"9c0c5d34adc6517b7f1e42d3fb8e68d96534bdabfedb8787e860b43a5a85da44"}
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.014413 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.055238 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8fmsf/crc-debug-dsnjx"]
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.065702 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8fmsf/crc-debug-dsnjx"]
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.181060 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8drg\" (UniqueName: \"kubernetes.io/projected/25a82989-d83f-437e-93de-994a94556815-kube-api-access-q8drg\") pod \"25a82989-d83f-437e-93de-994a94556815\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") "
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.181208 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a82989-d83f-437e-93de-994a94556815-host\") pod \"25a82989-d83f-437e-93de-994a94556815\" (UID: \"25a82989-d83f-437e-93de-994a94556815\") "
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.181368 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25a82989-d83f-437e-93de-994a94556815-host" (OuterVolumeSpecName: "host") pod "25a82989-d83f-437e-93de-994a94556815" (UID: "25a82989-d83f-437e-93de-994a94556815"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.181682 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25a82989-d83f-437e-93de-994a94556815-host\") on node \"crc\" DevicePath \"\""
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.187054 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a82989-d83f-437e-93de-994a94556815-kube-api-access-q8drg" (OuterVolumeSpecName: "kube-api-access-q8drg") pod "25a82989-d83f-437e-93de-994a94556815" (UID: "25a82989-d83f-437e-93de-994a94556815"). InnerVolumeSpecName "kube-api-access-q8drg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.283236 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8drg\" (UniqueName: \"kubernetes.io/projected/25a82989-d83f-437e-93de-994a94556815-kube-api-access-q8drg\") on node \"crc\" DevicePath \"\""
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.899126 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d015984229439a00ff3dd83c22d5c1970072e30b859a66ae4b5ee04342255bc"
Mar 13 16:57:45 crc kubenswrapper[4786]: I0313 16:57:45.899212 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-dsnjx"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.281909 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8fmsf/crc-debug-c7s6j"]
Mar 13 16:57:46 crc kubenswrapper[4786]: E0313 16:57:46.282335 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a82989-d83f-437e-93de-994a94556815" containerName="container-00"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.282349 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a82989-d83f-437e-93de-994a94556815" containerName="container-00"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.282622 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a82989-d83f-437e-93de-994a94556815" containerName="container-00"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.283719 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.406155 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzfhk\" (UniqueName: \"kubernetes.io/projected/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-kube-api-access-pzfhk\") pod \"crc-debug-c7s6j\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") " pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.406256 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-host\") pod \"crc-debug-c7s6j\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") " pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.508125 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzfhk\" (UniqueName: \"kubernetes.io/projected/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-kube-api-access-pzfhk\") pod \"crc-debug-c7s6j\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") " pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.508216 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-host\") pod \"crc-debug-c7s6j\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") " pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.508531 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-host\") pod \"crc-debug-c7s6j\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") " pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.527241 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzfhk\" (UniqueName: \"kubernetes.io/projected/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-kube-api-access-pzfhk\") pod \"crc-debug-c7s6j\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") " pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.571200 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a82989-d83f-437e-93de-994a94556815" path="/var/lib/kubelet/pods/25a82989-d83f-437e-93de-994a94556815/volumes"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.599149 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.909631 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/crc-debug-c7s6j" event={"ID":"61f27571-d26a-4aa7-bafd-46f8ac22f5cc","Type":"ContainerStarted","Data":"106ba1ac08465fc4a6fc2ee33cea1671630f817008722e1d4491177451fad159"}
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.910207 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/crc-debug-c7s6j" event={"ID":"61f27571-d26a-4aa7-bafd-46f8ac22f5cc","Type":"ContainerStarted","Data":"ccbb58371845d48928e375a2843b79d8ff0f1ed938fedca8402084bdd3c340ab"}
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.944488 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8fmsf/crc-debug-c7s6j"]
Mar 13 16:57:46 crc kubenswrapper[4786]: I0313 16:57:46.953984 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8fmsf/crc-debug-c7s6j"]
Mar 13 16:57:47 crc kubenswrapper[4786]: I0313 16:57:47.040710 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-ffctd"]
Mar 13 16:57:47 crc kubenswrapper[4786]: I0313 16:57:47.052043 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-ffctd"]
Mar 13 16:57:47 crc kubenswrapper[4786]: I0313 16:57:47.920502 4786 generic.go:334] "Generic (PLEG): container finished" podID="61f27571-d26a-4aa7-bafd-46f8ac22f5cc" containerID="106ba1ac08465fc4a6fc2ee33cea1671630f817008722e1d4491177451fad159" exitCode=1
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.058095 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.139571 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzfhk\" (UniqueName: \"kubernetes.io/projected/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-kube-api-access-pzfhk\") pod \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") "
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.139662 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-host\") pod \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\" (UID: \"61f27571-d26a-4aa7-bafd-46f8ac22f5cc\") "
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.140324 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-host" (OuterVolumeSpecName: "host") pod "61f27571-d26a-4aa7-bafd-46f8ac22f5cc" (UID: "61f27571-d26a-4aa7-bafd-46f8ac22f5cc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.145796 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-kube-api-access-pzfhk" (OuterVolumeSpecName: "kube-api-access-pzfhk") pod "61f27571-d26a-4aa7-bafd-46f8ac22f5cc" (UID: "61f27571-d26a-4aa7-bafd-46f8ac22f5cc"). InnerVolumeSpecName "kube-api-access-pzfhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.241839 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzfhk\" (UniqueName: \"kubernetes.io/projected/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-kube-api-access-pzfhk\") on node \"crc\" DevicePath \"\""
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.241901 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/61f27571-d26a-4aa7-bafd-46f8ac22f5cc-host\") on node \"crc\" DevicePath \"\""
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.569599 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f27571-d26a-4aa7-bafd-46f8ac22f5cc" path="/var/lib/kubelet/pods/61f27571-d26a-4aa7-bafd-46f8ac22f5cc/volumes"
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.570988 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbd4809-109d-4fa3-83f5-64a4f78d24a9" path="/var/lib/kubelet/pods/afbd4809-109d-4fa3-83f5-64a4f78d24a9/volumes"
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.928562 4786 scope.go:117] "RemoveContainer" containerID="106ba1ac08465fc4a6fc2ee33cea1671630f817008722e1d4491177451fad159"
Mar 13 16:57:48 crc kubenswrapper[4786]: I0313 16:57:48.928614 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/crc-debug-c7s6j"
Mar 13 16:57:53 crc kubenswrapper[4786]: I0313 16:57:53.552604 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:57:53 crc kubenswrapper[4786]: E0313 16:57:53.553452 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.152634 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557018-mqmh9"]
Mar 13 16:58:00 crc kubenswrapper[4786]: E0313 16:58:00.153640 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f27571-d26a-4aa7-bafd-46f8ac22f5cc" containerName="container-00"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.153654 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f27571-d26a-4aa7-bafd-46f8ac22f5cc" containerName="container-00"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.153886 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f27571-d26a-4aa7-bafd-46f8ac22f5cc" containerName="container-00"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.154706 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.157812 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.158225 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.158488 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.161808 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557018-mqmh9"]
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.306969 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phzt\" (UniqueName: \"kubernetes.io/projected/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf-kube-api-access-5phzt\") pod \"auto-csr-approver-29557018-mqmh9\" (UID: \"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf\") " pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.409604 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phzt\" (UniqueName: \"kubernetes.io/projected/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf-kube-api-access-5phzt\") pod \"auto-csr-approver-29557018-mqmh9\" (UID: \"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf\") " pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.428656 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phzt\" (UniqueName: \"kubernetes.io/projected/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf-kube-api-access-5phzt\") pod \"auto-csr-approver-29557018-mqmh9\" (UID: \"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf\") " pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.487851 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:00 crc kubenswrapper[4786]: I0313 16:58:00.979964 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557018-mqmh9"]
Mar 13 16:58:01 crc kubenswrapper[4786]: I0313 16:58:01.060459 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557018-mqmh9" event={"ID":"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf","Type":"ContainerStarted","Data":"dbd0d0e96c38dc0ee693c03cfcb1fbd742a2ba6e47ca334cc2e944e8f5ef04f1"}
Mar 13 16:58:03 crc kubenswrapper[4786]: I0313 16:58:03.078695 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557018-mqmh9" event={"ID":"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf","Type":"ContainerStarted","Data":"23cbbb09890b74a306b516666d1336a6dc02c87024a2b448354040ef162893de"}
Mar 13 16:58:03 crc kubenswrapper[4786]: I0313 16:58:03.096717 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557018-mqmh9" podStartSLOduration=1.841921922 podStartE2EDuration="3.096700968s" podCreationTimestamp="2026-03-13 16:58:00 +0000 UTC" firstStartedPulling="2026-03-13 16:58:00.995327434 +0000 UTC m=+6911.158539245" lastFinishedPulling="2026-03-13 16:58:02.25010647 +0000 UTC m=+6912.413318291" observedRunningTime="2026-03-13 16:58:03.091150249 +0000 UTC m=+6913.254362060" watchObservedRunningTime="2026-03-13 16:58:03.096700968 +0000 UTC m=+6913.259912769"
Mar 13 16:58:04 crc kubenswrapper[4786]: I0313 16:58:04.092799 4786 generic.go:334] "Generic (PLEG): container finished" podID="1850f9bb-bcf0-48ad-84b5-b138b0e06ccf" containerID="23cbbb09890b74a306b516666d1336a6dc02c87024a2b448354040ef162893de" exitCode=0
Mar 13 16:58:04 crc kubenswrapper[4786]: I0313 16:58:04.093050 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557018-mqmh9" event={"ID":"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf","Type":"ContainerDied","Data":"23cbbb09890b74a306b516666d1336a6dc02c87024a2b448354040ef162893de"}
Mar 13 16:58:05 crc kubenswrapper[4786]: I0313 16:58:05.470176 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:05 crc kubenswrapper[4786]: I0313 16:58:05.641705 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5phzt\" (UniqueName: \"kubernetes.io/projected/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf-kube-api-access-5phzt\") pod \"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf\" (UID: \"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf\") "
Mar 13 16:58:05 crc kubenswrapper[4786]: I0313 16:58:05.651161 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf-kube-api-access-5phzt" (OuterVolumeSpecName: "kube-api-access-5phzt") pod "1850f9bb-bcf0-48ad-84b5-b138b0e06ccf" (UID: "1850f9bb-bcf0-48ad-84b5-b138b0e06ccf"). InnerVolumeSpecName "kube-api-access-5phzt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 16:58:05 crc kubenswrapper[4786]: I0313 16:58:05.746093 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5phzt\" (UniqueName: \"kubernetes.io/projected/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf-kube-api-access-5phzt\") on node \"crc\" DevicePath \"\""
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.112759 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557018-mqmh9" event={"ID":"1850f9bb-bcf0-48ad-84b5-b138b0e06ccf","Type":"ContainerDied","Data":"dbd0d0e96c38dc0ee693c03cfcb1fbd742a2ba6e47ca334cc2e944e8f5ef04f1"}
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.112809 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbd0d0e96c38dc0ee693c03cfcb1fbd742a2ba6e47ca334cc2e944e8f5ef04f1"
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.112875 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557018-mqmh9"
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.153472 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557012-j6lqx"]
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.161678 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557012-j6lqx"]
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.552154 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:58:06 crc kubenswrapper[4786]: E0313 16:58:06.552644 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:58:06 crc kubenswrapper[4786]: I0313 16:58:06.568188 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd" path="/var/lib/kubelet/pods/5a3fa989-c5a3-4e3e-abb1-6fdb8c5a60dd/volumes"
Mar 13 16:58:19 crc kubenswrapper[4786]: I0313 16:58:19.429835 4786 scope.go:117] "RemoveContainer" containerID="84b74aa0ae04474301efb2fe7fdce8900d9b2cb70e891b950ee3d372511b996f"
Mar 13 16:58:19 crc kubenswrapper[4786]: I0313 16:58:19.474756 4786 scope.go:117] "RemoveContainer" containerID="4ed1be7a06b205e2feca3f39bc4950d406cf286ad1e42266de30b5a964c7a111"
Mar 13 16:58:19 crc kubenswrapper[4786]: I0313 16:58:19.509616 4786 scope.go:117] "RemoveContainer" containerID="404495674c7cb2940e2d3bc73ec98e48c84052dce98db11e233c192fbe35d01c"
Mar 13 16:58:20 crc kubenswrapper[4786]: I0313 16:58:20.559914 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:58:20 crc kubenswrapper[4786]: E0313 16:58:20.561355 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:58:32 crc kubenswrapper[4786]: I0313 16:58:32.554072 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:58:32 crc kubenswrapper[4786]: E0313 16:58:32.554804 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:58:44 crc kubenswrapper[4786]: I0313 16:58:44.552226 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 16:58:44 crc kubenswrapper[4786]: E0313 16:58:44.553112 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 16:58:44 crc kubenswrapper[4786]: I0313 16:58:44.961957 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_98f1c455-5023-4f1f-989c-4c63b48a696b/init-config-reloader/0.log"
Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.142343 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_98f1c455-5023-4f1f-989c-4c63b48a696b/init-config-reloader/0.log"
Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.217325 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_98f1c455-5023-4f1f-989c-4c63b48a696b/alertmanager/0.log"
Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.293679 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_98f1c455-5023-4f1f-989c-4c63b48a696b/config-reloader/0.log"
Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.455746 4786 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_aodh-0_90376e9e-850b-475b-87f2-76efe597c3ea/aodh-api/0.log" Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.503654 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_90376e9e-850b-475b-87f2-76efe597c3ea/aodh-evaluator/0.log" Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.587722 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_90376e9e-850b-475b-87f2-76efe597c3ea/aodh-listener/0.log" Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.640280 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_90376e9e-850b-475b-87f2-76efe597c3ea/aodh-notifier/0.log" Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.956747 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-create-8k747_50f1da9f-ee38-45cb-bc5d-21db3eda07f4/mariadb-database-create/0.log" Mar 13 16:58:45 crc kubenswrapper[4786]: I0313 16:58:45.996842 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-db-sync-l7l9f_92f0ab05-073c-4f6d-a863-d36e4fc32f25/aodh-db-sync/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.160983 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-ff67-account-create-update-nrhwr_7f276c87-a633-4e23-b3c8-61353255c1e0/mariadb-account-create-update/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.373948 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_573cb783-9c37-4274-a50d-e1a85d94feda/ceilometer-central-agent/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.383279 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_573cb783-9c37-4274-a50d-e1a85d94feda/ceilometer-notification-agent/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.409525 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_573cb783-9c37-4274-a50d-e1a85d94feda/proxy-httpd/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.445889 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_573cb783-9c37-4274-a50d-e1a85d94feda/sg-core/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.620243 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc/cinder-api-log/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.629516 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9fe4ebf2-fe58-4ab4-b646-0bcbcb6156bc/cinder-api/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.806038 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6c0dfd41-bd16-4216-b44c-41ebbd25af63/cinder-scheduler/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.853882 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6c0dfd41-bd16-4216-b44c-41ebbd25af63/probe/0.log" Mar 13 16:58:46 crc kubenswrapper[4786]: I0313 16:58:46.964439 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-559978d967-l5m4z_100899d2-4eab-4a4b-a33d-435515ece751/init/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.116956 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-559978d967-l5m4z_100899d2-4eab-4a4b-a33d-435515ece751/init/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.134603 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-559978d967-l5m4z_100899d2-4eab-4a4b-a33d-435515ece751/dnsmasq-dns/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.204324 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_dc05c795-f8c5-42ef-a000-1932740ca77a/glance-httpd/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.340807 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dc05c795-f8c5-42ef-a000-1932740ca77a/glance-log/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.399238 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4/glance-httpd/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.493386 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f1e03eb4-2eeb-40a3-8783-c75c5e55c4b4/glance-log/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.640200 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5b44c95f89-pdjc6_c8a16c52-18c2-4d08-a87e-c32303cd89e9/heat-api/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.687644 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-c03e-account-create-update-qjtm7_7af1308e-8fe9-42d7-b748-c9bf713499d3/mariadb-account-create-update/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.878838 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-fd85f647c-9ndph_767cfaf3-e5cc-4fc0-9be0-26e357051ec3/heat-cfnapi/0.log" Mar 13 16:58:47 crc kubenswrapper[4786]: I0313 16:58:47.918955 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-create-qxr2j_3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37/mariadb-database-create/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.096869 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-sync-b952z_0a523430-946d-4f79-b557-d784192b2e95/heat-db-sync/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.218378 4786 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_heat-engine-6f955857ff-qd498_1d5cf355-7214-417e-85dc-cc9a451d1649/heat-engine/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.385945 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6b979877c4-zsjfz_78f5805d-cef6-45e1-bbac-e4edb49b8273/horizon/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.517741 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6b979877c4-zsjfz_78f5805d-cef6-45e1-bbac-e4edb49b8273/horizon-log/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.640579 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d9c4d76b9-54mn5_1549fff2-8323-404e-a727-80d3bb71f7c8/keystone-api/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.690147 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_47c2f005-6321-48f7-a9e4-2abca43199ff/kube-state-metrics/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.808651 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mpwlc"] Mar 13 16:58:48 crc kubenswrapper[4786]: E0313 16:58:48.809060 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1850f9bb-bcf0-48ad-84b5-b138b0e06ccf" containerName="oc" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.809076 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1850f9bb-bcf0-48ad-84b5-b138b0e06ccf" containerName="oc" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.809283 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1850f9bb-bcf0-48ad-84b5-b138b0e06ccf" containerName="oc" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.814832 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.823298 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mpwlc"] Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.900671 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_d723d777-35e7-4884-b57d-bffbb87a4228/adoption/0.log" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.964240 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-utilities\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.964307 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654x8\" (UniqueName: \"kubernetes.io/projected/f0d61d79-c0a7-4679-a098-3a61585c8caa-kube-api-access-654x8\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:48 crc kubenswrapper[4786]: I0313 16:58:48.964407 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-catalog-content\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.066006 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-utilities\") pod \"redhat-operators-mpwlc\" (UID: 
\"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.066074 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-654x8\" (UniqueName: \"kubernetes.io/projected/f0d61d79-c0a7-4679-a098-3a61585c8caa-kube-api-access-654x8\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.066149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-catalog-content\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.066501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-utilities\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.066558 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-catalog-content\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.086439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-654x8\" (UniqueName: \"kubernetes.io/projected/f0d61d79-c0a7-4679-a098-3a61585c8caa-kube-api-access-654x8\") pod \"redhat-operators-mpwlc\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " 
pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.136511 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.239988 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f5c6db85-hdk8q_d57b6476-92e9-4c2c-8577-6627254ae198/neutron-api/0.log" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.320589 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f5c6db85-hdk8q_d57b6476-92e9-4c2c-8577-6627254ae198/neutron-httpd/0.log" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.632286 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mpwlc"] Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.684169 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_40766174-3a3b-4797-b4f8-1a3bf0e9eb7c/nova-api-log/0.log" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.908664 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_40766174-3a3b-4797-b4f8-1a3bf0e9eb7c/nova-api-api/0.log" Mar 13 16:58:49 crc kubenswrapper[4786]: I0313 16:58:49.929522 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b658310c-9b6a-4a6b-babd-f5c50cce01d2/nova-cell0-conductor-conductor/0.log" Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.055308 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6d725d27-695e-4244-8ce7-440d167598ce/nova-cell1-conductor-conductor/0.log" Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.237716 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e137d965-4671-483c-80be-206f9c897681/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 16:58:50 crc 
kubenswrapper[4786]: I0313 16:58:50.536262 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_183127c3-ab56-4da7-8459-34b1d6060c41/nova-metadata-log/0.log" Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.573181 4786 generic.go:334] "Generic (PLEG): container finished" podID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerID="9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc" exitCode=0 Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.573220 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerDied","Data":"9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc"} Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.573245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerStarted","Data":"ccfdc94e1f1d736bbf4276d35a1dd8cdbf783bb12f22b6b654401ab1ef26039d"} Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.787334 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_183127c3-ab56-4da7-8459-34b1d6060c41/nova-metadata-metadata/0.log" Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.807328 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_17b94d7f-1927-4369-a6df-460f0c896ced/nova-scheduler-scheduler/0.log" Mar 13 16:58:50 crc kubenswrapper[4786]: I0313 16:58:50.955620 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7d68854bb9-vv8ck_7b597ad4-6e16-4499-b61e-a1d01d253164/init/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.156043 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7d68854bb9-vv8ck_7b597ad4-6e16-4499-b61e-a1d01d253164/init/0.log" Mar 13 16:58:51 crc 
kubenswrapper[4786]: I0313 16:58:51.182262 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7d68854bb9-vv8ck_7b597ad4-6e16-4499-b61e-a1d01d253164/octavia-api-provider-agent/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.316915 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7d68854bb9-vv8ck_7b597ad4-6e16-4499-b61e-a1d01d253164/octavia-api/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.330309 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-fpprt_1a64beb5-4de2-40c1-9fc9-86d2f5b36048/init/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.551797 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-fpprt_1a64beb5-4de2-40c1-9fc9-86d2f5b36048/init/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.655155 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-vsftn_e7bcea0d-ef91-4510-8698-dea274f82f83/init/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.691155 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-fpprt_1a64beb5-4de2-40c1-9fc9-86d2f5b36048/octavia-healthmanager/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.878883 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-vsftn_e7bcea0d-ef91-4510-8698-dea274f82f83/init/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.879328 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-vsftn_e7bcea0d-ef91-4510-8698-dea274f82f83/octavia-housekeeping/0.log" Mar 13 16:58:51 crc kubenswrapper[4786]: I0313 16:58:51.998226 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-q4769_7b17f538-d321-4719-a3e1-651226075939/init/0.log" Mar 13 16:58:52 crc 
kubenswrapper[4786]: I0313 16:58:52.154316 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-q4769_7b17f538-d321-4719-a3e1-651226075939/octavia-rsyslog/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.245394 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-q4769_7b17f538-d321-4719-a3e1-651226075939/init/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.263224 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-flqbr_9de4aa0b-2f11-404c-9ae7-913912454f89/init/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.445184 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-flqbr_9de4aa0b-2f11-404c-9ae7-913912454f89/init/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.590867 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerStarted","Data":"67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30"} Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.604846 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ce37391d-6853-4058-81bd-53d5009f12fc/mysql-bootstrap/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.666833 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-flqbr_9de4aa0b-2f11-404c-9ae7-913912454f89/octavia-worker/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.730312 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ce37391d-6853-4058-81bd-53d5009f12fc/mysql-bootstrap/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.852131 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_ce37391d-6853-4058-81bd-53d5009f12fc/galera/0.log" Mar 13 16:58:52 crc kubenswrapper[4786]: I0313 16:58:52.915979 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4fe673d3-c989-4a19-9e4b-c06e0cc21ebc/mysql-bootstrap/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.113807 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4fe673d3-c989-4a19-9e4b-c06e0cc21ebc/mysql-bootstrap/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.146045 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_46843f0a-6e35-49f9-b304-31a452d756ed/openstackclient/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.146335 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4fe673d3-c989-4a19-9e4b-c06e0cc21ebc/galera/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.345951 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jksdp_ac0b2bbd-5bcb-4537-8273-09f3fdc43d61/ovn-controller/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.356439 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6gkng_937fe94a-0226-43da-963b-2c4d605b71de/openstack-network-exporter/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.603617 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7trr6_2ead52d0-3013-43d3-976f-6db98a50ce3b/ovsdb-server-init/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.803937 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7trr6_2ead52d0-3013-43d3-976f-6db98a50ce3b/ovsdb-server-init/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.828376 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-7trr6_2ead52d0-3013-43d3-976f-6db98a50ce3b/ovs-vswitchd/0.log" Mar 13 16:58:53 crc kubenswrapper[4786]: I0313 16:58:53.887956 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7trr6_2ead52d0-3013-43d3-976f-6db98a50ce3b/ovsdb-server/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.068570 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87781344-d04e-483c-a281-8dfb63ec64b9/openstack-network-exporter/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.112470 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_afa86413-0653-43ff-9f79-327d16b5ea3c/adoption/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.151455 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87781344-d04e-483c-a281-8dfb63ec64b9/ovn-northd/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.417381 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13ea7c72-49d9-4e99-a890-cb51a6ea441e/openstack-network-exporter/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.430549 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13ea7c72-49d9-4e99-a890-cb51a6ea441e/ovsdbserver-nb/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.530738 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7a919b26-f2f2-411c-b7ec-afe290a48417/openstack-network-exporter/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.653479 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_9830d010-b8d6-43a4-b49e-5300740afa03/openstack-network-exporter/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.714314 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_7a919b26-f2f2-411c-b7ec-afe290a48417/ovsdbserver-nb/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.786347 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_9830d010-b8d6-43a4-b49e-5300740afa03/ovsdbserver-nb/0.log" Mar 13 16:58:54 crc kubenswrapper[4786]: I0313 16:58:54.962643 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c778abf8-64be-4c6e-993a-df2bf80128c0/openstack-network-exporter/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.102776 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c778abf8-64be-4c6e-993a-df2bf80128c0/ovsdbserver-sb/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.115146 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_986264a5-c61a-430c-818f-b0c66693eb87/openstack-network-exporter/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.293186 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_986264a5-c61a-430c-818f-b0c66693eb87/ovsdbserver-sb/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.329757 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_44ab28cc-d71c-43f8-8448-a7175567fd08/openstack-network-exporter/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.369934 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_44ab28cc-d71c-43f8-8448-a7175567fd08/ovsdbserver-sb/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.558820 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58f4f77c48-brshm_8538ba8c-4af3-4757-8a9a-ebcf54a6c253/placement-api/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.670834 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-58f4f77c48-brshm_8538ba8c-4af3-4757-8a9a-ebcf54a6c253/placement-log/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.750424 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b42d91a2-2438-453c-a5d0-6aa31991a770/init-config-reloader/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.960823 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b42d91a2-2438-453c-a5d0-6aa31991a770/init-config-reloader/0.log" Mar 13 16:58:55 crc kubenswrapper[4786]: I0313 16:58:55.991030 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b42d91a2-2438-453c-a5d0-6aa31991a770/prometheus/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.006149 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b42d91a2-2438-453c-a5d0-6aa31991a770/config-reloader/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.034627 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b42d91a2-2438-453c-a5d0-6aa31991a770/thanos-sidecar/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.202442 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_868904a0-2393-4ad2-93c1-ca840552abd8/setup-container/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.420122 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_868904a0-2393-4ad2-93c1-ca840552abd8/setup-container/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.469770 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_868904a0-2393-4ad2-93c1-ca840552abd8/rabbitmq/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.517505 4786 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_e73a6e9a-15e8-47a1-9818-086aa1a8e60e/setup-container/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.553836 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:58:56 crc kubenswrapper[4786]: E0313 16:58:56.554185 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.756165 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e73a6e9a-15e8-47a1-9818-086aa1a8e60e/setup-container/0.log" Mar 13 16:58:56 crc kubenswrapper[4786]: I0313 16:58:56.883085 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85f65c99b4-74hxg_b0cf0344-26b6-4d4f-919a-a478a51ffa7f/proxy-httpd/0.log" Mar 13 16:58:57 crc kubenswrapper[4786]: I0313 16:58:57.027258 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85f65c99b4-74hxg_b0cf0344-26b6-4d4f-919a-a478a51ffa7f/proxy-server/0.log" Mar 13 16:58:57 crc kubenswrapper[4786]: I0313 16:58:57.160352 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nh76t_e53156aa-ccc2-4554-8ac4-718db21b1ca8/swift-ring-rebalance/0.log" Mar 13 16:58:57 crc kubenswrapper[4786]: I0313 16:58:57.783894 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e73a6e9a-15e8-47a1-9818-086aa1a8e60e/rabbitmq/0.log" Mar 13 16:58:58 crc kubenswrapper[4786]: I0313 16:58:58.663228 4786 generic.go:334] "Generic (PLEG): container 
finished" podID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerID="67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30" exitCode=0 Mar 13 16:58:58 crc kubenswrapper[4786]: I0313 16:58:58.663270 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerDied","Data":"67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30"} Mar 13 16:58:59 crc kubenswrapper[4786]: I0313 16:58:59.684607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerStarted","Data":"a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab"} Mar 13 16:58:59 crc kubenswrapper[4786]: I0313 16:58:59.710530 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mpwlc" podStartSLOduration=3.180418391 podStartE2EDuration="11.71051125s" podCreationTimestamp="2026-03-13 16:58:48 +0000 UTC" firstStartedPulling="2026-03-13 16:58:50.575075213 +0000 UTC m=+6960.738287024" lastFinishedPulling="2026-03-13 16:58:59.105168072 +0000 UTC m=+6969.268379883" observedRunningTime="2026-03-13 16:58:59.703072253 +0000 UTC m=+6969.866284064" watchObservedRunningTime="2026-03-13 16:58:59.71051125 +0000 UTC m=+6969.873723061" Mar 13 16:59:09 crc kubenswrapper[4786]: I0313 16:59:09.137767 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:59:09 crc kubenswrapper[4786]: I0313 16:59:09.138269 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:59:09 crc kubenswrapper[4786]: I0313 16:59:09.200005 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:59:09 crc 
kubenswrapper[4786]: I0313 16:59:09.835151 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:59:09 crc kubenswrapper[4786]: I0313 16:59:09.882133 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mpwlc"] Mar 13 16:59:10 crc kubenswrapper[4786]: I0313 16:59:10.561577 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:59:10 crc kubenswrapper[4786]: E0313 16:59:10.562107 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:59:11 crc kubenswrapper[4786]: I0313 16:59:11.801059 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mpwlc" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="registry-server" containerID="cri-o://a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab" gracePeriod=2 Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.285887 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.366127 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-catalog-content\") pod \"f0d61d79-c0a7-4679-a098-3a61585c8caa\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.366420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-654x8\" (UniqueName: \"kubernetes.io/projected/f0d61d79-c0a7-4679-a098-3a61585c8caa-kube-api-access-654x8\") pod \"f0d61d79-c0a7-4679-a098-3a61585c8caa\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.366544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-utilities\") pod \"f0d61d79-c0a7-4679-a098-3a61585c8caa\" (UID: \"f0d61d79-c0a7-4679-a098-3a61585c8caa\") " Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.367568 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-utilities" (OuterVolumeSpecName: "utilities") pod "f0d61d79-c0a7-4679-a098-3a61585c8caa" (UID: "f0d61d79-c0a7-4679-a098-3a61585c8caa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.401813 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d61d79-c0a7-4679-a098-3a61585c8caa-kube-api-access-654x8" (OuterVolumeSpecName: "kube-api-access-654x8") pod "f0d61d79-c0a7-4679-a098-3a61585c8caa" (UID: "f0d61d79-c0a7-4679-a098-3a61585c8caa"). InnerVolumeSpecName "kube-api-access-654x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.474767 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-654x8\" (UniqueName: \"kubernetes.io/projected/f0d61d79-c0a7-4679-a098-3a61585c8caa-kube-api-access-654x8\") on node \"crc\" DevicePath \"\"" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.474832 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.498499 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0d61d79-c0a7-4679-a098-3a61585c8caa" (UID: "f0d61d79-c0a7-4679-a098-3a61585c8caa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.577421 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d61d79-c0a7-4679-a098-3a61585c8caa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.822935 4786 generic.go:334] "Generic (PLEG): container finished" podID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerID="a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab" exitCode=0 Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.823011 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mpwlc" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.823052 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerDied","Data":"a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab"} Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.823508 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mpwlc" event={"ID":"f0d61d79-c0a7-4679-a098-3a61585c8caa","Type":"ContainerDied","Data":"ccfdc94e1f1d736bbf4276d35a1dd8cdbf783bb12f22b6b654401ab1ef26039d"} Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.823529 4786 scope.go:117] "RemoveContainer" containerID="a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.846325 4786 scope.go:117] "RemoveContainer" containerID="67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.852707 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mpwlc"] Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.861676 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mpwlc"] Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.875172 4786 scope.go:117] "RemoveContainer" containerID="9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.910552 4786 scope.go:117] "RemoveContainer" containerID="a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab" Mar 13 16:59:12 crc kubenswrapper[4786]: E0313 16:59:12.912352 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab\": container with ID starting with a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab not found: ID does not exist" containerID="a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.913063 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab"} err="failed to get container status \"a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab\": rpc error: code = NotFound desc = could not find container \"a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab\": container with ID starting with a2b30daea2f854fabafbe3a17e50cddec7eae5a2f2636442b5ca433b9564bdab not found: ID does not exist" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.913094 4786 scope.go:117] "RemoveContainer" containerID="67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30" Mar 13 16:59:12 crc kubenswrapper[4786]: E0313 16:59:12.913288 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30\": container with ID starting with 67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30 not found: ID does not exist" containerID="67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.913309 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30"} err="failed to get container status \"67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30\": rpc error: code = NotFound desc = could not find container \"67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30\": container with ID 
starting with 67f9e7111e1ef19d08bc7ef13fa7eb264bf61a59b5971a812094befdd6556a30 not found: ID does not exist" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.913322 4786 scope.go:117] "RemoveContainer" containerID="9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc" Mar 13 16:59:12 crc kubenswrapper[4786]: E0313 16:59:12.913552 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc\": container with ID starting with 9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc not found: ID does not exist" containerID="9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc" Mar 13 16:59:12 crc kubenswrapper[4786]: I0313 16:59:12.913574 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc"} err="failed to get container status \"9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc\": rpc error: code = NotFound desc = could not find container \"9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc\": container with ID starting with 9d5d7af8224a32879059de9422cd41b9723f4532a2926299997979d071945fdc not found: ID does not exist" Mar 13 16:59:13 crc kubenswrapper[4786]: I0313 16:59:13.284424 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_947a28af-b2fb-41bb-8be0-8b6723bb630e/memcached/0.log" Mar 13 16:59:14 crc kubenswrapper[4786]: I0313 16:59:14.570821 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" path="/var/lib/kubelet/pods/f0d61d79-c0a7-4679-a098-3a61585c8caa/volumes" Mar 13 16:59:19 crc kubenswrapper[4786]: I0313 16:59:19.644284 4786 scope.go:117] "RemoveContainer" containerID="29aa0c357ea86d002c8f169140e0a563d63dbb18f130a345f8e85a84afb0435e" Mar 13 16:59:19 crc 
kubenswrapper[4786]: I0313 16:59:19.670441 4786 scope.go:117] "RemoveContainer" containerID="89a7f1c60a027ac982ad540e6ffeed10c36253307923c59d022cddd7d85a5300" Mar 13 16:59:23 crc kubenswrapper[4786]: I0313 16:59:23.552600 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:59:23 crc kubenswrapper[4786]: E0313 16:59:23.553649 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:59:26 crc kubenswrapper[4786]: I0313 16:59:26.730872 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/util/0.log" Mar 13 16:59:26 crc kubenswrapper[4786]: I0313 16:59:26.937508 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/pull/0.log" Mar 13 16:59:26 crc kubenswrapper[4786]: I0313 16:59:26.947112 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/util/0.log" Mar 13 16:59:26 crc kubenswrapper[4786]: I0313 16:59:26.947676 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/pull/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.153131 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/pull/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.174872 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/util/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.177199 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a25da44700f2da9e511716e9ea5c2d7cc3bdaf6532fb20e8f09ee07298sql96_0948a0b8-4f2e-41a3-a1f1-84403bb3e2e6/extract/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.467690 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-wgpvr_e60c275e-371c-48d7-8816-56ae26f8e911/manager/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.542328 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-zq9jr_bad187db-13d7-4bf9-9b5f-9ce08a17b9c7/manager/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.854561 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-gtwlk_721d9249-da86-4bba-93bc-4f037cd3344d/manager/0.log" Mar 13 16:59:27 crc kubenswrapper[4786]: I0313 16:59:27.900706 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-gr9w7_d9425c10-8f83-4b9f-81a8-0502889571a0/manager/0.log" Mar 13 16:59:28 crc kubenswrapper[4786]: I0313 16:59:28.105886 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-mpfq5_c4687472-a411-424b-bcc2-d39f84de6a17/manager/0.log" Mar 13 16:59:28 crc kubenswrapper[4786]: I0313 16:59:28.425268 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-8bd8f_1a700a4d-b7ed-4ea7-9382-b3994ba3646e/manager/0.log" Mar 13 16:59:28 crc kubenswrapper[4786]: I0313 16:59:28.804097 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-wxjdd_b93a2ad9-e58e-4b33-8d34-8101b1fa2d38/manager/0.log" Mar 13 16:59:28 crc kubenswrapper[4786]: I0313 16:59:28.901139 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-hxcnt_8e63300a-b2d6-438f-9b17-989c103b9975/manager/0.log" Mar 13 16:59:28 crc kubenswrapper[4786]: I0313 16:59:28.945397 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-98tsp_7826406e-4038-4851-a54e-bf72ff94287f/manager/0.log" Mar 13 16:59:29 crc kubenswrapper[4786]: I0313 16:59:29.178374 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-65hkw_e2450a85-9b9c-49c6-8191-df5a87807e4f/manager/0.log" Mar 13 16:59:29 crc kubenswrapper[4786]: I0313 16:59:29.343958 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-skfmd_37c474a3-0434-4911-adf5-02d915b23d57/manager/0.log" Mar 13 16:59:29 crc kubenswrapper[4786]: I0313 16:59:29.603529 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-dzj6t_e32e285a-970c-49ac-8531-d0d87b217b08/manager/0.log" Mar 13 16:59:29 crc kubenswrapper[4786]: I0313 16:59:29.733780 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-wc9h2_d5cc42a1-e4ee-4f94-8edf-4c8a46a632db/manager/0.log" Mar 13 16:59:29 crc kubenswrapper[4786]: I0313 
16:59:29.807539 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f7958d7742jq6p_365f42f0-aed6-4131-b8d7-c01b9a8418d1/manager/0.log" Mar 13 16:59:29 crc kubenswrapper[4786]: I0313 16:59:29.875611 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-xpqf7_a176571a-293b-4ff6-8928-1cc5f3b28c44/manager/0.log" Mar 13 16:59:30 crc kubenswrapper[4786]: I0313 16:59:30.086123 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6dc56d8cd6-n4mm9_54a9d39e-7ebd-4924-8cfb-2704bd61e22e/operator/0.log" Mar 13 16:59:30 crc kubenswrapper[4786]: I0313 16:59:30.381404 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qqq28_3a032c51-0082-416e-ae5e-b4c5eb59ff33/registry-server/0.log" Mar 13 16:59:30 crc kubenswrapper[4786]: I0313 16:59:30.524012 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-rx48d_9e1f799a-4f74-4d76-9bfc-67a3bc88ce1f/manager/0.log" Mar 13 16:59:30 crc kubenswrapper[4786]: I0313 16:59:30.603050 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-spsz5_82e7030d-fdee-4336-84f0-8a89605e1424/manager/0.log" Mar 13 16:59:30 crc kubenswrapper[4786]: I0313 16:59:30.801160 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-sfxhr_5dcee199-4a59-4394-a565-2ce8e15e787c/operator/0.log" Mar 13 16:59:30 crc kubenswrapper[4786]: I0313 16:59:30.894388 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-s5wjg_a8bee420-59e0-4eb5-a83a-1a518345ca42/manager/0.log" Mar 13 16:59:31 crc kubenswrapper[4786]: 
I0313 16:59:31.133791 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-zg7jh_9e012f2b-6b59-473d-8273-cb64d4957ad7/manager/0.log" Mar 13 16:59:31 crc kubenswrapper[4786]: I0313 16:59:31.254453 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6854b8b9d9-pnzg8_f4ff29e2-062c-4bbe-885c-4cd1a9e9eb53/manager/0.log" Mar 13 16:59:31 crc kubenswrapper[4786]: I0313 16:59:31.338642 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-nfvdd_b7d9e1c8-827e-447a-ba6d-b54fdb7d8f9f/manager/0.log" Mar 13 16:59:31 crc kubenswrapper[4786]: I0313 16:59:31.951234 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6484b7b757-tsssq_66bf8109-666c-469d-b33c-ba5152cde7d9/manager/0.log" Mar 13 16:59:38 crc kubenswrapper[4786]: I0313 16:59:38.553025 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:59:38 crc kubenswrapper[4786]: E0313 16:59:38.553667 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 16:59:50 crc kubenswrapper[4786]: I0313 16:59:50.927906 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dtwcs_3c6e0341-e5cb-4912-b3fb-8caedc0d4e10/control-plane-machine-set-operator/0.log" Mar 13 16:59:51 crc kubenswrapper[4786]: I0313 16:59:51.136752 4786 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2c944_b960566c-bcc9-41ff-9fbc-c132f0e4d6e5/kube-rbac-proxy/0.log" Mar 13 16:59:51 crc kubenswrapper[4786]: I0313 16:59:51.171203 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2c944_b960566c-bcc9-41ff-9fbc-c132f0e4d6e5/machine-api-operator/0.log" Mar 13 16:59:52 crc kubenswrapper[4786]: I0313 16:59:52.553581 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 16:59:52 crc kubenswrapper[4786]: E0313 16:59:52.554042 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.158054 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557020-nwjjl"] Mar 13 17:00:00 crc kubenswrapper[4786]: E0313 17:00:00.159035 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="registry-server" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.159051 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="registry-server" Mar 13 17:00:00 crc kubenswrapper[4786]: E0313 17:00:00.159072 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="extract-utilities" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.159078 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="extract-utilities" Mar 13 17:00:00 crc kubenswrapper[4786]: E0313 17:00:00.159099 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="extract-content" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.159106 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="extract-content" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.159297 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d61d79-c0a7-4679-a098-3a61585c8caa" containerName="registry-server" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.160000 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.165595 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.167082 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.167336 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.169916 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl"] Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.171642 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.174508 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.174934 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.182535 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557020-nwjjl"] Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.197467 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl"] Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.302295 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25a908ef-87a0-43aa-84dc-3323c0729a07-secret-volume\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.302599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25a908ef-87a0-43aa-84dc-3323c0729a07-config-volume\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.302836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gwkg\" (UniqueName: 
\"kubernetes.io/projected/25a908ef-87a0-43aa-84dc-3323c0729a07-kube-api-access-4gwkg\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.303054 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzs2\" (UniqueName: \"kubernetes.io/projected/910385c6-c8c4-48cb-942c-cd7f5a521aef-kube-api-access-4xzs2\") pod \"auto-csr-approver-29557020-nwjjl\" (UID: \"910385c6-c8c4-48cb-942c-cd7f5a521aef\") " pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.406599 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gwkg\" (UniqueName: \"kubernetes.io/projected/25a908ef-87a0-43aa-84dc-3323c0729a07-kube-api-access-4gwkg\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.406804 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzs2\" (UniqueName: \"kubernetes.io/projected/910385c6-c8c4-48cb-942c-cd7f5a521aef-kube-api-access-4xzs2\") pod \"auto-csr-approver-29557020-nwjjl\" (UID: \"910385c6-c8c4-48cb-942c-cd7f5a521aef\") " pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.406910 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25a908ef-87a0-43aa-84dc-3323c0729a07-secret-volume\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: 
I0313 17:00:00.407057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25a908ef-87a0-43aa-84dc-3323c0729a07-config-volume\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.408544 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25a908ef-87a0-43aa-84dc-3323c0729a07-config-volume\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.417711 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25a908ef-87a0-43aa-84dc-3323c0729a07-secret-volume\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.424972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gwkg\" (UniqueName: \"kubernetes.io/projected/25a908ef-87a0-43aa-84dc-3323c0729a07-kube-api-access-4gwkg\") pod \"collect-profiles-29557020-7t6dl\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.438077 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzs2\" (UniqueName: \"kubernetes.io/projected/910385c6-c8c4-48cb-942c-cd7f5a521aef-kube-api-access-4xzs2\") pod \"auto-csr-approver-29557020-nwjjl\" (UID: \"910385c6-c8c4-48cb-942c-cd7f5a521aef\") " 
pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.490308 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.496488 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:00 crc kubenswrapper[4786]: I0313 17:00:00.964311 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557020-nwjjl"] Mar 13 17:00:01 crc kubenswrapper[4786]: I0313 17:00:01.039507 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl"] Mar 13 17:00:01 crc kubenswrapper[4786]: W0313 17:00:01.043176 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a908ef_87a0_43aa_84dc_3323c0729a07.slice/crio-5099dc30006528a2dbb094be27b6066420fbff426b9db8ae5e8b51d6214fef4d WatchSource:0}: Error finding container 5099dc30006528a2dbb094be27b6066420fbff426b9db8ae5e8b51d6214fef4d: Status 404 returned error can't find the container with id 5099dc30006528a2dbb094be27b6066420fbff426b9db8ae5e8b51d6214fef4d Mar 13 17:00:01 crc kubenswrapper[4786]: I0313 17:00:01.254810 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" event={"ID":"910385c6-c8c4-48cb-942c-cd7f5a521aef","Type":"ContainerStarted","Data":"d7370dd1a4517b5e61f9c4a26a823872ba1bc20aaf364c9964d8f60c29ae3d1b"} Mar 13 17:00:01 crc kubenswrapper[4786]: I0313 17:00:01.256823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" 
event={"ID":"25a908ef-87a0-43aa-84dc-3323c0729a07","Type":"ContainerStarted","Data":"5b3857fbc7830166f5bbcd9147767374e868cdefe696e2c7b362608c4c56a625"} Mar 13 17:00:01 crc kubenswrapper[4786]: I0313 17:00:01.256871 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" event={"ID":"25a908ef-87a0-43aa-84dc-3323c0729a07","Type":"ContainerStarted","Data":"5099dc30006528a2dbb094be27b6066420fbff426b9db8ae5e8b51d6214fef4d"} Mar 13 17:00:01 crc kubenswrapper[4786]: I0313 17:00:01.282147 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" podStartSLOduration=1.282123977 podStartE2EDuration="1.282123977s" podCreationTimestamp="2026-03-13 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 17:00:01.27429691 +0000 UTC m=+7031.437508731" watchObservedRunningTime="2026-03-13 17:00:01.282123977 +0000 UTC m=+7031.445335798" Mar 13 17:00:02 crc kubenswrapper[4786]: I0313 17:00:02.270762 4786 generic.go:334] "Generic (PLEG): container finished" podID="25a908ef-87a0-43aa-84dc-3323c0729a07" containerID="5b3857fbc7830166f5bbcd9147767374e868cdefe696e2c7b362608c4c56a625" exitCode=0 Mar 13 17:00:02 crc kubenswrapper[4786]: I0313 17:00:02.270821 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" event={"ID":"25a908ef-87a0-43aa-84dc-3323c0729a07","Type":"ContainerDied","Data":"5b3857fbc7830166f5bbcd9147767374e868cdefe696e2c7b362608c4c56a625"} Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.769947 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.879458 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25a908ef-87a0-43aa-84dc-3323c0729a07-secret-volume\") pod \"25a908ef-87a0-43aa-84dc-3323c0729a07\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.879714 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25a908ef-87a0-43aa-84dc-3323c0729a07-config-volume\") pod \"25a908ef-87a0-43aa-84dc-3323c0729a07\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.879752 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gwkg\" (UniqueName: \"kubernetes.io/projected/25a908ef-87a0-43aa-84dc-3323c0729a07-kube-api-access-4gwkg\") pod \"25a908ef-87a0-43aa-84dc-3323c0729a07\" (UID: \"25a908ef-87a0-43aa-84dc-3323c0729a07\") " Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.880358 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a908ef-87a0-43aa-84dc-3323c0729a07-config-volume" (OuterVolumeSpecName: "config-volume") pod "25a908ef-87a0-43aa-84dc-3323c0729a07" (UID: "25a908ef-87a0-43aa-84dc-3323c0729a07"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.884994 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a908ef-87a0-43aa-84dc-3323c0729a07-kube-api-access-4gwkg" (OuterVolumeSpecName: "kube-api-access-4gwkg") pod "25a908ef-87a0-43aa-84dc-3323c0729a07" (UID: "25a908ef-87a0-43aa-84dc-3323c0729a07"). 
InnerVolumeSpecName "kube-api-access-4gwkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.888128 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a908ef-87a0-43aa-84dc-3323c0729a07-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25a908ef-87a0-43aa-84dc-3323c0729a07" (UID: "25a908ef-87a0-43aa-84dc-3323c0729a07"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.982107 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25a908ef-87a0-43aa-84dc-3323c0729a07-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.982138 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gwkg\" (UniqueName: \"kubernetes.io/projected/25a908ef-87a0-43aa-84dc-3323c0729a07-kube-api-access-4gwkg\") on node \"crc\" DevicePath \"\"" Mar 13 17:00:03 crc kubenswrapper[4786]: I0313 17:00:03.982156 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25a908ef-87a0-43aa-84dc-3323c0729a07-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.305950 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" event={"ID":"25a908ef-87a0-43aa-84dc-3323c0729a07","Type":"ContainerDied","Data":"5099dc30006528a2dbb094be27b6066420fbff426b9db8ae5e8b51d6214fef4d"} Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.305989 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5099dc30006528a2dbb094be27b6066420fbff426b9db8ae5e8b51d6214fef4d" Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.306049 4786 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557020-7t6dl" Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.372439 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"] Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.384496 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556975-kp2mf"] Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.562793 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac97daa-9ad6-411b-9e46-a0e08cb55866" path="/var/lib/kubelet/pods/6ac97daa-9ad6-411b-9e46-a0e08cb55866/volumes" Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.807719 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-sspj7_7d3cd12d-97ae-473f-ae6b-40837bbb9a5e/cert-manager-controller/0.log" Mar 13 17:00:04 crc kubenswrapper[4786]: I0313 17:00:04.918576 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-pqlz5_9257f04b-d34e-4a81-b897-e563a66b6b59/cert-manager-cainjector/0.log" Mar 13 17:00:05 crc kubenswrapper[4786]: I0313 17:00:05.004576 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-wqs9w_2134099c-6c07-4d86-9aa5-b7360e8f3ea1/cert-manager-webhook/0.log" Mar 13 17:00:07 crc kubenswrapper[4786]: I0313 17:00:07.551666 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 17:00:07 crc kubenswrapper[4786]: E0313 17:00:07.552288 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 17:00:18 crc kubenswrapper[4786]: I0313 17:00:18.552535 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 17:00:18 crc kubenswrapper[4786]: E0313 17:00:18.553210 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 17:00:18 crc kubenswrapper[4786]: I0313 17:00:18.923342 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-g5bjq_e993546a-4062-41d9-870c-36f2a467be39/nmstate-console-plugin/0.log" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.113212 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5hfgg_0bd50cd9-6517-4b70-8bf3-8e3469793cd3/nmstate-handler/0.log" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.182525 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zxmxw_9d81cf1e-8913-4975-b2f6-ce71220f51ad/kube-rbac-proxy/0.log" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.182893 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-zxmxw_9d81cf1e-8913-4975-b2f6-ce71220f51ad/nmstate-metrics/0.log" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.340573 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-p4cvd_7a77aa5e-e467-4edf-8db1-6961b154d940/nmstate-operator/0.log" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.402255 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mq9sp_33d6a569-116c-4200-b6c4-24c75bfbda77/nmstate-webhook/0.log" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.783705 4786 scope.go:117] "RemoveContainer" containerID="31adaf0ab822f60975f98e5a058ec08977f5dc642ca8797cbba38ec0960c8741" Mar 13 17:00:19 crc kubenswrapper[4786]: I0313 17:00:19.825075 4786 scope.go:117] "RemoveContainer" containerID="88b349537522e7778a2c85e8a0c5a34a5408e9933a9d4b7aa0b9375cb2f1c15b" Mar 13 17:00:23 crc kubenswrapper[4786]: I0313 17:00:23.504985 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" event={"ID":"910385c6-c8c4-48cb-942c-cd7f5a521aef","Type":"ContainerStarted","Data":"9afc8262d85694dd50c517a2687b49deb0a5491e8d2159baf3d1d8a854f57db1"} Mar 13 17:00:23 crc kubenswrapper[4786]: I0313 17:00:23.524494 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" podStartSLOduration=1.369561026 podStartE2EDuration="23.524475157s" podCreationTimestamp="2026-03-13 17:00:00 +0000 UTC" firstStartedPulling="2026-03-13 17:00:00.962402934 +0000 UTC m=+7031.125614755" lastFinishedPulling="2026-03-13 17:00:23.117317075 +0000 UTC m=+7053.280528886" observedRunningTime="2026-03-13 17:00:23.521624076 +0000 UTC m=+7053.684835927" watchObservedRunningTime="2026-03-13 17:00:23.524475157 +0000 UTC m=+7053.687686978" Mar 13 17:00:24 crc kubenswrapper[4786]: I0313 17:00:24.518893 4786 generic.go:334] "Generic (PLEG): container finished" podID="910385c6-c8c4-48cb-942c-cd7f5a521aef" containerID="9afc8262d85694dd50c517a2687b49deb0a5491e8d2159baf3d1d8a854f57db1" exitCode=0 Mar 13 17:00:24 crc kubenswrapper[4786]: I0313 
17:00:24.518953 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" event={"ID":"910385c6-c8c4-48cb-942c-cd7f5a521aef","Type":"ContainerDied","Data":"9afc8262d85694dd50c517a2687b49deb0a5491e8d2159baf3d1d8a854f57db1"} Mar 13 17:00:25 crc kubenswrapper[4786]: I0313 17:00:25.060155 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-c03e-account-create-update-qjtm7"] Mar 13 17:00:25 crc kubenswrapper[4786]: I0313 17:00:25.073428 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-qxr2j"] Mar 13 17:00:25 crc kubenswrapper[4786]: I0313 17:00:25.091148 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-c03e-account-create-update-qjtm7"] Mar 13 17:00:25 crc kubenswrapper[4786]: I0313 17:00:25.103076 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-qxr2j"] Mar 13 17:00:25 crc kubenswrapper[4786]: I0313 17:00:25.933754 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.081016 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzs2\" (UniqueName: \"kubernetes.io/projected/910385c6-c8c4-48cb-942c-cd7f5a521aef-kube-api-access-4xzs2\") pod \"910385c6-c8c4-48cb-942c-cd7f5a521aef\" (UID: \"910385c6-c8c4-48cb-942c-cd7f5a521aef\") " Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.087844 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910385c6-c8c4-48cb-942c-cd7f5a521aef-kube-api-access-4xzs2" (OuterVolumeSpecName: "kube-api-access-4xzs2") pod "910385c6-c8c4-48cb-942c-cd7f5a521aef" (UID: "910385c6-c8c4-48cb-942c-cd7f5a521aef"). InnerVolumeSpecName "kube-api-access-4xzs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.183046 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzs2\" (UniqueName: \"kubernetes.io/projected/910385c6-c8c4-48cb-942c-cd7f5a521aef-kube-api-access-4xzs2\") on node \"crc\" DevicePath \"\"" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.540034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" event={"ID":"910385c6-c8c4-48cb-942c-cd7f5a521aef","Type":"ContainerDied","Data":"d7370dd1a4517b5e61f9c4a26a823872ba1bc20aaf364c9964d8f60c29ae3d1b"} Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.540552 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7370dd1a4517b5e61f9c4a26a823872ba1bc20aaf364c9964d8f60c29ae3d1b" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.540639 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557020-nwjjl" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.576819 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37" path="/var/lib/kubelet/pods/3fbf1fd0-6bcf-47bf-9567-f3abe55a4c37/volumes" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.578599 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af1308e-8fe9-42d7-b748-c9bf713499d3" path="/var/lib/kubelet/pods/7af1308e-8fe9-42d7-b748-c9bf713499d3/volumes" Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.611539 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557014-glwtj"] Mar 13 17:00:26 crc kubenswrapper[4786]: I0313 17:00:26.621287 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557014-glwtj"] Mar 13 17:00:28 crc kubenswrapper[4786]: I0313 17:00:28.571107 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087f69b5-64ac-4e7e-a195-12e854b510e3" path="/var/lib/kubelet/pods/087f69b5-64ac-4e7e-a195-12e854b510e3/volumes" Mar 13 17:00:33 crc kubenswrapper[4786]: I0313 17:00:33.551791 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 17:00:33 crc kubenswrapper[4786]: E0313 17:00:33.552628 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 17:00:34 crc kubenswrapper[4786]: I0313 17:00:34.167938 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2rkbn_d2886207-16dc-47f5-bc4d-dba0a8a55ed1/prometheus-operator/0.log" Mar 13 17:00:34 crc kubenswrapper[4786]: I0313 17:00:34.226013 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd_cdb71ef5-96d3-42b6-86d9-05be37f6961a/prometheus-operator-admission-webhook/0.log" Mar 13 17:00:34 crc kubenswrapper[4786]: I0313 17:00:34.383657 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d866698b6-c7skp_23fe82eb-830c-4fc5-b855-421e358620d5/prometheus-operator-admission-webhook/0.log" Mar 13 17:00:34 crc kubenswrapper[4786]: I0313 17:00:34.396435 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n4977_e11733a0-03f3-4862-b5f8-92b7cc51ae99/operator/0.log" Mar 13 17:00:34 crc kubenswrapper[4786]: I0313 17:00:34.551014 4786 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jx4bl_bdcbef08-46bc-46ef-8342-d8128a2d4de1/perses-operator/0.log" Mar 13 17:00:41 crc kubenswrapper[4786]: I0313 17:00:41.055959 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-b952z"] Mar 13 17:00:41 crc kubenswrapper[4786]: I0313 17:00:41.070472 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-b952z"] Mar 13 17:00:42 crc kubenswrapper[4786]: I0313 17:00:42.572462 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a523430-946d-4f79-b557-d784192b2e95" path="/var/lib/kubelet/pods/0a523430-946d-4f79-b557-d784192b2e95/volumes" Mar 13 17:00:48 crc kubenswrapper[4786]: I0313 17:00:48.557307 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 17:00:48 crc kubenswrapper[4786]: E0313 17:00:48.558137 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.215461 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-c4kjj_a2108fd5-0d1c-4e6c-8b30-adea0a1545ac/kube-rbac-proxy/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.472221 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-frr-files/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.553940 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-c4kjj_a2108fd5-0d1c-4e6c-8b30-adea0a1545ac/controller/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.617406 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-frr-files/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.676488 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-reloader/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.681245 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-metrics/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.749111 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-reloader/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.897869 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-reloader/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.945948 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-frr-files/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.964986 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-metrics/0.log" Mar 13 17:00:49 crc kubenswrapper[4786]: I0313 17:00:49.974182 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-metrics/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.156944 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-frr-files/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.177027 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-metrics/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.204430 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/cp-reloader/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.237067 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/controller/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.385005 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/frr-metrics/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.414762 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/kube-rbac-proxy/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.496080 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/kube-rbac-proxy-frr/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.626488 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/reloader/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.716020 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-gmz8b_9ebdbb3b-8b33-42de-8150-f13a385b6bb6/frr-k8s-webhook-server/0.log" Mar 13 17:00:50 crc kubenswrapper[4786]: I0313 17:00:50.861845 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6db9d94745-cm6rw_76502e53-99f8-40ce-a930-f34b84f5718f/manager/0.log" Mar 13 17:00:51 crc kubenswrapper[4786]: I0313 17:00:51.069538 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-55488488b6-jbcct_180f6fad-fff1-4831-8332-027c32b35d7d/webhook-server/0.log" Mar 13 17:00:51 crc kubenswrapper[4786]: I0313 17:00:51.138524 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4lzrc_d6114a7a-189a-4212-9669-4addfc43c839/kube-rbac-proxy/0.log" Mar 13 17:00:52 crc kubenswrapper[4786]: I0313 17:00:52.017388 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4lzrc_d6114a7a-189a-4212-9669-4addfc43c839/speaker/0.log" Mar 13 17:00:53 crc kubenswrapper[4786]: I0313 17:00:53.256829 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8tcjh_bdc20c1e-eff7-4478-a06a-05dacc2f169a/frr/0.log" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.181756 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557021-8pxcz"] Mar 13 17:01:00 crc kubenswrapper[4786]: E0313 17:01:00.183052 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a908ef-87a0-43aa-84dc-3323c0729a07" containerName="collect-profiles" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.183073 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a908ef-87a0-43aa-84dc-3323c0729a07" containerName="collect-profiles" Mar 13 17:01:00 crc kubenswrapper[4786]: E0313 17:01:00.183103 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910385c6-c8c4-48cb-942c-cd7f5a521aef" containerName="oc" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.183114 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="910385c6-c8c4-48cb-942c-cd7f5a521aef" containerName="oc" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 
17:01:00.183501 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a908ef-87a0-43aa-84dc-3323c0729a07" containerName="collect-profiles" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.183523 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="910385c6-c8c4-48cb-942c-cd7f5a521aef" containerName="oc" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.184588 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557021-8pxcz" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.199385 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557021-8pxcz"] Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.263218 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x62bp\" (UniqueName: \"kubernetes.io/projected/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-kube-api-access-x62bp\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.263387 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-combined-ca-bundle\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.263419 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-config-data\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz" Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.263451 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-fernet-keys\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.365309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-fernet-keys\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.365445 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x62bp\" (UniqueName: \"kubernetes.io/projected/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-kube-api-access-x62bp\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.365629 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-combined-ca-bundle\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.365674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-config-data\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.385405 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-config-data\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.388795 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-fernet-keys\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.389573 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-combined-ca-bundle\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.400367 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x62bp\" (UniqueName: \"kubernetes.io/projected/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-kube-api-access-x62bp\") pod \"keystone-cron-29557021-8pxcz\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") " pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:00 crc kubenswrapper[4786]: I0313 17:01:00.514426 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:01 crc kubenswrapper[4786]: I0313 17:01:01.046197 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557021-8pxcz"]
Mar 13 17:01:01 crc kubenswrapper[4786]: I0313 17:01:01.552834 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 17:01:01 crc kubenswrapper[4786]: E0313 17:01:01.553757 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqb49_openshift-machine-config-operator(6b929603-1f9d-4b41-9bf8-528d7fd4ad56)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56"
Mar 13 17:01:01 crc kubenswrapper[4786]: I0313 17:01:01.896627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557021-8pxcz" event={"ID":"0ac109e2-6471-4d1a-b455-cb6a96a98a7e","Type":"ContainerStarted","Data":"8113344d5e4e82c46b874d4d072390de7d69d106f3cf1ea7ca590d97288a4dbe"}
Mar 13 17:01:01 crc kubenswrapper[4786]: I0313 17:01:01.896842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557021-8pxcz" event={"ID":"0ac109e2-6471-4d1a-b455-cb6a96a98a7e","Type":"ContainerStarted","Data":"5203720dfa92b61c9e9cbba24df4f8162da412161b7d8d63a667d57a4d9486bc"}
Mar 13 17:01:01 crc kubenswrapper[4786]: I0313 17:01:01.921980 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557021-8pxcz" podStartSLOduration=1.921958449 podStartE2EDuration="1.921958449s" podCreationTimestamp="2026-03-13 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 17:01:01.91148559 +0000 UTC m=+7092.074697441" watchObservedRunningTime="2026-03-13 17:01:01.921958449 +0000 UTC m=+7092.085170260"
Mar 13 17:01:04 crc kubenswrapper[4786]: I0313 17:01:04.930965 4786 generic.go:334] "Generic (PLEG): container finished" podID="0ac109e2-6471-4d1a-b455-cb6a96a98a7e" containerID="8113344d5e4e82c46b874d4d072390de7d69d106f3cf1ea7ca590d97288a4dbe" exitCode=0
Mar 13 17:01:04 crc kubenswrapper[4786]: I0313 17:01:04.931071 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557021-8pxcz" event={"ID":"0ac109e2-6471-4d1a-b455-cb6a96a98a7e","Type":"ContainerDied","Data":"8113344d5e4e82c46b874d4d072390de7d69d106f3cf1ea7ca590d97288a4dbe"}
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.347787 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.414842 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-combined-ca-bundle\") pod \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") "
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.414909 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-config-data\") pod \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") "
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.415064 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x62bp\" (UniqueName: \"kubernetes.io/projected/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-kube-api-access-x62bp\") pod \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") "
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.415128 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-fernet-keys\") pod \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\" (UID: \"0ac109e2-6471-4d1a-b455-cb6a96a98a7e\") "
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.434670 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0ac109e2-6471-4d1a-b455-cb6a96a98a7e" (UID: "0ac109e2-6471-4d1a-b455-cb6a96a98a7e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.440357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-kube-api-access-x62bp" (OuterVolumeSpecName: "kube-api-access-x62bp") pod "0ac109e2-6471-4d1a-b455-cb6a96a98a7e" (UID: "0ac109e2-6471-4d1a-b455-cb6a96a98a7e"). InnerVolumeSpecName "kube-api-access-x62bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.466210 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ac109e2-6471-4d1a-b455-cb6a96a98a7e" (UID: "0ac109e2-6471-4d1a-b455-cb6a96a98a7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.496008 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-config-data" (OuterVolumeSpecName: "config-data") pod "0ac109e2-6471-4d1a-b455-cb6a96a98a7e" (UID: "0ac109e2-6471-4d1a-b455-cb6a96a98a7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.505732 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/util/0.log"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.517257 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.517289 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.517301 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.517310 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x62bp\" (UniqueName: \"kubernetes.io/projected/0ac109e2-6471-4d1a-b455-cb6a96a98a7e-kube-api-access-x62bp\") on node \"crc\" DevicePath \"\""
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.724214 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/pull/0.log"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.753950 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/util/0.log"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.788722 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/pull/0.log"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.922497 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/util/0.log"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.941073 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/pull/0.log"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.950840 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557021-8pxcz" event={"ID":"0ac109e2-6471-4d1a-b455-cb6a96a98a7e","Type":"ContainerDied","Data":"5203720dfa92b61c9e9cbba24df4f8162da412161b7d8d63a667d57a4d9486bc"}
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.950896 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5203720dfa92b61c9e9cbba24df4f8162da412161b7d8d63a667d57a4d9486bc"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.950922 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557021-8pxcz"
Mar 13 17:01:06 crc kubenswrapper[4786]: I0313 17:01:06.952976 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874txf7m_e1b88d4e-85fd-4163-987c-d60b5653637b/extract/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.099533 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/util/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.265530 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/util/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.293075 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/pull/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.312304 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/pull/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.428827 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/pull/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.442749 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/util/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.482233 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1mcjhh_cdd29b42-059a-45b5-9ae0-9d1b25879f87/extract/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.610089 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/util/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.786983 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/util/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.807166 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/pull/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.846953 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/pull/0.log"
Mar 13 17:01:07 crc kubenswrapper[4786]: I0313 17:01:07.985560 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/util/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.035086 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/extract/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.062167 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5s5q2q_f8c00607-24ac-4811-9a56-92304be2396e/pull/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.176207 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/util/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.339872 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/util/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.343110 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/pull/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.357726 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/pull/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.542265 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/pull/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.550367 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/util/0.log"
Mar 13 17:01:08 crc kubenswrapper[4786]: I0313 17:01:08.765810 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/extract-utilities/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.025687 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b89mr_7eef90e3-9167-4e4c-941e-943965d480a3/extract/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.265360 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/extract-content/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.273622 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/extract-content/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.277342 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/extract-utilities/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.458117 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/extract-content/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.482276 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/extract-utilities/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.651081 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/extract-utilities/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.890655 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/extract-utilities/0.log"
Mar 13 17:01:09 crc kubenswrapper[4786]: I0313 17:01:09.979793 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/extract-content/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.013971 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/extract-content/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.223660 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/extract-utilities/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.260141 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/extract-content/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.356929 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vg8fj_65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/registry-server/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.549025 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8tbs9_37854105-dd2c-4a53-9a0e-45813f321114/marketplace-operator/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.694704 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/extract-utilities/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.937846 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/extract-utilities/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.948467 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/extract-content/0.log"
Mar 13 17:01:10 crc kubenswrapper[4786]: I0313 17:01:10.978352 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/extract-content/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.144491 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/extract-content/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.155010 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/extract-utilities/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.344584 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/extract-utilities/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.416735 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b45zm_c67bb9c5-b78b-4824-b2e3-c95e51c6c8c6/registry-server/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.444226 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-qwfx9_04a298d0-6d11-47c4-9438-488c016e3d49/registry-server/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.536094 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/extract-utilities/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.570446 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/extract-content/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.584324 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/extract-content/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.731918 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/extract-utilities/0.log"
Mar 13 17:01:11 crc kubenswrapper[4786]: I0313 17:01:11.759825 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/extract-content/0.log"
Mar 13 17:01:12 crc kubenswrapper[4786]: I0313 17:01:12.536795 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dstmd_5c65dc28-4b5f-4159-af2e-83b4bffec120/registry-server/0.log"
Mar 13 17:01:16 crc kubenswrapper[4786]: I0313 17:01:16.552815 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d"
Mar 13 17:01:17 crc kubenswrapper[4786]: I0313 17:01:17.072395 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"08dbf6d74035d8d2ccb4586df0c13c9e09d0f88484b3aef05d4459bc8e347cda"}
Mar 13 17:01:19 crc kubenswrapper[4786]: I0313 17:01:19.904531 4786 scope.go:117] "RemoveContainer" containerID="9f80eb357d84177c0768363e0503ccf4857f9188d1bc18b525c4429d4dd4fb90"
Mar 13 17:01:19 crc kubenswrapper[4786]: I0313 17:01:19.932226 4786 scope.go:117] "RemoveContainer" containerID="3b0e36ca162ddc5f4bd11c1f6c6848376eb336e87e73c96cbedb2310c97d4fce"
Mar 13 17:01:20 crc kubenswrapper[4786]: I0313 17:01:20.010186 4786 scope.go:117] "RemoveContainer" containerID="de0519d3ae4697eff464b53e6e5539a2e5efb2a11285388fa46cca8212c2fb93"
Mar 13 17:01:20 crc kubenswrapper[4786]: I0313 17:01:20.084778 4786 scope.go:117] "RemoveContainer" containerID="b5315a5c7ed9c91e293fe0a82d1fb1abb2b6be4f2eec20e61782fb82880c11fe"
Mar 13 17:01:26 crc kubenswrapper[4786]: I0313 17:01:26.090251 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2rkbn_d2886207-16dc-47f5-bc4d-dba0a8a55ed1/prometheus-operator/0.log"
Mar 13 17:01:26 crc kubenswrapper[4786]: I0313 17:01:26.149016 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d866698b6-8z6rd_cdb71ef5-96d3-42b6-86d9-05be37f6961a/prometheus-operator-admission-webhook/0.log"
Mar 13 17:01:26 crc kubenswrapper[4786]: I0313 17:01:26.187433 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6d866698b6-c7skp_23fe82eb-830c-4fc5-b855-421e358620d5/prometheus-operator-admission-webhook/0.log"
Mar 13 17:01:26 crc kubenswrapper[4786]: I0313 17:01:26.318878 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n4977_e11733a0-03f3-4862-b5f8-92b7cc51ae99/operator/0.log"
Mar 13 17:01:26 crc kubenswrapper[4786]: I0313 17:01:26.345597 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jx4bl_bdcbef08-46bc-46ef-8342-d8128a2d4de1/perses-operator/0.log"
Mar 13 17:01:30 crc kubenswrapper[4786]: E0313 17:01:30.130799 4786 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.12:52594->38.102.83.12:40601: write tcp 38.102.83.12:52594->38.102.83.12:40601: write: broken pipe
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.174206 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557022-crpjs"]
Mar 13 17:02:00 crc kubenswrapper[4786]: E0313 17:02:00.175384 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac109e2-6471-4d1a-b455-cb6a96a98a7e" containerName="keystone-cron"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.175403 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac109e2-6471-4d1a-b455-cb6a96a98a7e" containerName="keystone-cron"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.175692 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac109e2-6471-4d1a-b455-cb6a96a98a7e" containerName="keystone-cron"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.176581 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.179367 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.179563 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.179840 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.190569 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557022-crpjs"]
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.342053 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbhh\" (UniqueName: \"kubernetes.io/projected/d9a60ed6-a68f-43f8-8787-fc9fd37b9a88-kube-api-access-xfbhh\") pod \"auto-csr-approver-29557022-crpjs\" (UID: \"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88\") " pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.446257 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbhh\" (UniqueName: \"kubernetes.io/projected/d9a60ed6-a68f-43f8-8787-fc9fd37b9a88-kube-api-access-xfbhh\") pod \"auto-csr-approver-29557022-crpjs\" (UID: \"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88\") " pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.479940 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbhh\" (UniqueName: \"kubernetes.io/projected/d9a60ed6-a68f-43f8-8787-fc9fd37b9a88-kube-api-access-xfbhh\") pod \"auto-csr-approver-29557022-crpjs\" (UID: \"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88\") " pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:00 crc kubenswrapper[4786]: I0313 17:02:00.498953 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:01 crc kubenswrapper[4786]: I0313 17:02:01.066972 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557022-crpjs"]
Mar 13 17:02:01 crc kubenswrapper[4786]: I0313 17:02:01.599660 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557022-crpjs" event={"ID":"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88","Type":"ContainerStarted","Data":"d45cc8a70c73e66c22f199905aa9525b2b78fd7fc91bb302d4d360a05ecf8db0"}
Mar 13 17:02:03 crc kubenswrapper[4786]: I0313 17:02:03.627786 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557022-crpjs" event={"ID":"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88","Type":"ContainerDied","Data":"d1131b93391c3c70048c5e48a9168f3d218971ff62f9332578cf5cc2e52d4e9b"}
Mar 13 17:02:03 crc kubenswrapper[4786]: I0313 17:02:03.627515 4786 generic.go:334] "Generic (PLEG): container finished" podID="d9a60ed6-a68f-43f8-8787-fc9fd37b9a88" containerID="d1131b93391c3c70048c5e48a9168f3d218971ff62f9332578cf5cc2e52d4e9b" exitCode=0
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.054626 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.154385 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbhh\" (UniqueName: \"kubernetes.io/projected/d9a60ed6-a68f-43f8-8787-fc9fd37b9a88-kube-api-access-xfbhh\") pod \"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88\" (UID: \"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88\") "
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.162542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a60ed6-a68f-43f8-8787-fc9fd37b9a88-kube-api-access-xfbhh" (OuterVolumeSpecName: "kube-api-access-xfbhh") pod "d9a60ed6-a68f-43f8-8787-fc9fd37b9a88" (UID: "d9a60ed6-a68f-43f8-8787-fc9fd37b9a88"). InnerVolumeSpecName "kube-api-access-xfbhh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.257279 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbhh\" (UniqueName: \"kubernetes.io/projected/d9a60ed6-a68f-43f8-8787-fc9fd37b9a88-kube-api-access-xfbhh\") on node \"crc\" DevicePath \"\""
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.660620 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557022-crpjs" event={"ID":"d9a60ed6-a68f-43f8-8787-fc9fd37b9a88","Type":"ContainerDied","Data":"d45cc8a70c73e66c22f199905aa9525b2b78fd7fc91bb302d4d360a05ecf8db0"}
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.660679 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45cc8a70c73e66c22f199905aa9525b2b78fd7fc91bb302d4d360a05ecf8db0"
Mar 13 17:02:05 crc kubenswrapper[4786]: I0313 17:02:05.660915 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557022-crpjs"
Mar 13 17:02:06 crc kubenswrapper[4786]: I0313 17:02:06.134831 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557016-rbnld"]
Mar 13 17:02:06 crc kubenswrapper[4786]: I0313 17:02:06.143236 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557016-rbnld"]
Mar 13 17:02:06 crc kubenswrapper[4786]: I0313 17:02:06.572997 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87dfaa69-9c28-4b64-8544-114860c0a557" path="/var/lib/kubelet/pods/87dfaa69-9c28-4b64-8544-114860c0a557/volumes"
Mar 13 17:02:20 crc kubenswrapper[4786]: I0313 17:02:20.205370 4786 scope.go:117] "RemoveContainer" containerID="d6235fc18b9fe2323be60f1661fd1f7aaed1d3da0bb86fe4dd4bb31c99520b04"
Mar 13 17:03:01 crc kubenswrapper[4786]: I0313 17:03:01.320613 4786 generic.go:334] "Generic (PLEG): container finished" podID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerID="28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94" exitCode=0
Mar 13 17:03:01 crc kubenswrapper[4786]: I0313 17:03:01.321062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8fmsf/must-gather-smw9x" event={"ID":"0b73a37c-f416-4418-bc3d-6752b4dbf7d8","Type":"ContainerDied","Data":"28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94"}
Mar 13 17:03:01 crc kubenswrapper[4786]: I0313 17:03:01.321494 4786 scope.go:117] "RemoveContainer" containerID="28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.192616 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8fmsf_must-gather-smw9x_0b73a37c-f416-4418-bc3d-6752b4dbf7d8/gather/0.log"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.703984 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wmlrw"]
Mar 13 17:03:02 crc kubenswrapper[4786]: E0313 17:03:02.704579 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a60ed6-a68f-43f8-8787-fc9fd37b9a88" containerName="oc"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.704618 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a60ed6-a68f-43f8-8787-fc9fd37b9a88" containerName="oc"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.704896 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a60ed6-a68f-43f8-8787-fc9fd37b9a88" containerName="oc"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.736223 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmlrw"]
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.736321 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.842002 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-catalog-content\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.842286 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bp8j\" (UniqueName: \"kubernetes.io/projected/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-kube-api-access-5bp8j\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.842344 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-utilities\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.943892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-catalog-content\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.943955 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bp8j\" (UniqueName: \"kubernetes.io/projected/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-kube-api-access-5bp8j\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.944028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-utilities\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.944409 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-catalog-content\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw"
Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.944554 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName:
\"kubernetes.io/empty-dir/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-utilities\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:02 crc kubenswrapper[4786]: I0313 17:03:02.980492 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bp8j\" (UniqueName: \"kubernetes.io/projected/c7dc61c3-8b55-4aa4-bf96-c064cabe6925-kube-api-access-5bp8j\") pod \"certified-operators-wmlrw\" (UID: \"c7dc61c3-8b55-4aa4-bf96-c064cabe6925\") " pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:03 crc kubenswrapper[4786]: I0313 17:03:03.063297 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:03 crc kubenswrapper[4786]: I0313 17:03:03.568446 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmlrw"] Mar 13 17:03:04 crc kubenswrapper[4786]: I0313 17:03:04.359815 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7dc61c3-8b55-4aa4-bf96-c064cabe6925" containerID="5eb089404beec9bc49fbe0e47b4f15e111ca6d750142bb56134574185334bf59" exitCode=0 Mar 13 17:03:04 crc kubenswrapper[4786]: I0313 17:03:04.359933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmlrw" event={"ID":"c7dc61c3-8b55-4aa4-bf96-c064cabe6925","Type":"ContainerDied","Data":"5eb089404beec9bc49fbe0e47b4f15e111ca6d750142bb56134574185334bf59"} Mar 13 17:03:04 crc kubenswrapper[4786]: I0313 17:03:04.360124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmlrw" event={"ID":"c7dc61c3-8b55-4aa4-bf96-c064cabe6925","Type":"ContainerStarted","Data":"949669e35eb46d3059326dbf0149d8ec7b6bdd1d57e590ee93405f320c2f181e"} Mar 13 17:03:04 crc kubenswrapper[4786]: I0313 17:03:04.362729 4786 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 13 17:03:09 crc kubenswrapper[4786]: I0313 17:03:09.429078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmlrw" event={"ID":"c7dc61c3-8b55-4aa4-bf96-c064cabe6925","Type":"ContainerStarted","Data":"06bcfeefb9ba78500fec8d84ed49678a0614b60a6b8e4a74ff8f02218e912ea7"} Mar 13 17:03:09 crc kubenswrapper[4786]: I0313 17:03:09.635395 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8fmsf/must-gather-smw9x"] Mar 13 17:03:09 crc kubenswrapper[4786]: I0313 17:03:09.635777 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8fmsf/must-gather-smw9x" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="copy" containerID="cri-o://cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891" gracePeriod=2 Mar 13 17:03:09 crc kubenswrapper[4786]: I0313 17:03:09.650435 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8fmsf/must-gather-smw9x"] Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.143117 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8fmsf_must-gather-smw9x_0b73a37c-f416-4418-bc3d-6752b4dbf7d8/copy/0.log" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.143697 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8fmsf/must-gather-smw9x" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.149092 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-must-gather-output\") pod \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.149333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whgnq\" (UniqueName: \"kubernetes.io/projected/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-kube-api-access-whgnq\") pod \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\" (UID: \"0b73a37c-f416-4418-bc3d-6752b4dbf7d8\") " Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.157635 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-kube-api-access-whgnq" (OuterVolumeSpecName: "kube-api-access-whgnq") pod "0b73a37c-f416-4418-bc3d-6752b4dbf7d8" (UID: "0b73a37c-f416-4418-bc3d-6752b4dbf7d8"). InnerVolumeSpecName "kube-api-access-whgnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.252399 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whgnq\" (UniqueName: \"kubernetes.io/projected/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-kube-api-access-whgnq\") on node \"crc\" DevicePath \"\"" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.315375 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0b73a37c-f416-4418-bc3d-6752b4dbf7d8" (UID: "0b73a37c-f416-4418-bc3d-6752b4dbf7d8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.354704 4786 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b73a37c-f416-4418-bc3d-6752b4dbf7d8-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.439108 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8fmsf_must-gather-smw9x_0b73a37c-f416-4418-bc3d-6752b4dbf7d8/copy/0.log" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.439544 4786 generic.go:334] "Generic (PLEG): container finished" podID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerID="cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891" exitCode=143 Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.439628 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8fmsf/must-gather-smw9x" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.439654 4786 scope.go:117] "RemoveContainer" containerID="cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.441399 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7dc61c3-8b55-4aa4-bf96-c064cabe6925" containerID="06bcfeefb9ba78500fec8d84ed49678a0614b60a6b8e4a74ff8f02218e912ea7" exitCode=0 Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.441438 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmlrw" event={"ID":"c7dc61c3-8b55-4aa4-bf96-c064cabe6925","Type":"ContainerDied","Data":"06bcfeefb9ba78500fec8d84ed49678a0614b60a6b8e4a74ff8f02218e912ea7"} Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.472138 4786 scope.go:117] "RemoveContainer" containerID="28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 
17:03:10.557441 4786 scope.go:117] "RemoveContainer" containerID="cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891" Mar 13 17:03:10 crc kubenswrapper[4786]: E0313 17:03:10.557735 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891\": container with ID starting with cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891 not found: ID does not exist" containerID="cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.557765 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891"} err="failed to get container status \"cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891\": rpc error: code = NotFound desc = could not find container \"cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891\": container with ID starting with cf04aa6facccf6e7833f7df228fc0d1c621dc4bb3eda8beb8a5a34c4ffc8d891 not found: ID does not exist" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.557786 4786 scope.go:117] "RemoveContainer" containerID="28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94" Mar 13 17:03:10 crc kubenswrapper[4786]: E0313 17:03:10.558035 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94\": container with ID starting with 28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94 not found: ID does not exist" containerID="28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.558063 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94"} err="failed to get container status \"28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94\": rpc error: code = NotFound desc = could not find container \"28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94\": container with ID starting with 28dccdf32e2f722e9cc43140a3721f31cf09abe83267e71925872eb93356ef94 not found: ID does not exist" Mar 13 17:03:10 crc kubenswrapper[4786]: I0313 17:03:10.563967 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" path="/var/lib/kubelet/pods/0b73a37c-f416-4418-bc3d-6752b4dbf7d8/volumes" Mar 13 17:03:11 crc kubenswrapper[4786]: I0313 17:03:11.459096 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmlrw" event={"ID":"c7dc61c3-8b55-4aa4-bf96-c064cabe6925","Type":"ContainerStarted","Data":"20efd1a43714b14d87024ecab3855e2b4dc4b18a2c05a105933550c0eb2fbb39"} Mar 13 17:03:13 crc kubenswrapper[4786]: I0313 17:03:13.063723 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:13 crc kubenswrapper[4786]: I0313 17:03:13.064069 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:13 crc kubenswrapper[4786]: I0313 17:03:13.143368 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:13 crc kubenswrapper[4786]: I0313 17:03:13.173432 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wmlrw" podStartSLOduration=4.671167829 podStartE2EDuration="11.173413255s" podCreationTimestamp="2026-03-13 17:03:02 +0000 UTC" firstStartedPulling="2026-03-13 17:03:04.36238435 +0000 UTC m=+7214.525596171" 
lastFinishedPulling="2026-03-13 17:03:10.864629786 +0000 UTC m=+7221.027841597" observedRunningTime="2026-03-13 17:03:11.481727837 +0000 UTC m=+7221.644939648" watchObservedRunningTime="2026-03-13 17:03:13.173413255 +0000 UTC m=+7223.336625076" Mar 13 17:03:23 crc kubenswrapper[4786]: I0313 17:03:23.132504 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wmlrw" Mar 13 17:03:23 crc kubenswrapper[4786]: I0313 17:03:23.219081 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmlrw"] Mar 13 17:03:23 crc kubenswrapper[4786]: I0313 17:03:23.272554 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vg8fj"] Mar 13 17:03:23 crc kubenswrapper[4786]: I0313 17:03:23.274101 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vg8fj" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="registry-server" containerID="cri-o://5d0ece2f96c0f5c52230c9499b4a43daae11a80e1afef5aa38276c8e95800c3b" gracePeriod=2 Mar 13 17:03:23 crc kubenswrapper[4786]: I0313 17:03:23.619361 4786 generic.go:334] "Generic (PLEG): container finished" podID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerID="5d0ece2f96c0f5c52230c9499b4a43daae11a80e1afef5aa38276c8e95800c3b" exitCode=0 Mar 13 17:03:23 crc kubenswrapper[4786]: I0313 17:03:23.619467 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerDied","Data":"5d0ece2f96c0f5c52230c9499b4a43daae11a80e1afef5aa38276c8e95800c3b"} Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.587228 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.648031 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vg8fj" event={"ID":"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43","Type":"ContainerDied","Data":"ca9538c0c2b1d376dda1e0441fd882e36aea882df868be102f125b0d1032d34b"} Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.648333 4786 scope.go:117] "RemoveContainer" containerID="5d0ece2f96c0f5c52230c9499b4a43daae11a80e1afef5aa38276c8e95800c3b" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.648455 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vg8fj" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.686689 4786 scope.go:117] "RemoveContainer" containerID="9c17993f35b58bab8b9b7c7389c2313629098c47d67dc5dc95d8703cd47e62ad" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.697951 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-catalog-content\") pod \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.698150 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhkp\" (UniqueName: \"kubernetes.io/projected/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-kube-api-access-tkhkp\") pod \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\" (UID: \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.698185 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-utilities\") pod \"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\" (UID: 
\"65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43\") " Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.699492 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-utilities" (OuterVolumeSpecName: "utilities") pod "65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" (UID: "65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.712107 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-kube-api-access-tkhkp" (OuterVolumeSpecName: "kube-api-access-tkhkp") pod "65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" (UID: "65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43"). InnerVolumeSpecName "kube-api-access-tkhkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.721044 4786 scope.go:117] "RemoveContainer" containerID="34fdc541a7a79c90770cba93b9801dcb29ff8adfd13e3b299247e1a7d5649146" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.774771 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" (UID: "65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.804380 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.804411 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhkp\" (UniqueName: \"kubernetes.io/projected/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-kube-api-access-tkhkp\") on node \"crc\" DevicePath \"\"" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.804422 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 17:03:24 crc kubenswrapper[4786]: I0313 17:03:24.996055 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vg8fj"] Mar 13 17:03:25 crc kubenswrapper[4786]: I0313 17:03:25.008553 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vg8fj"] Mar 13 17:03:25 crc kubenswrapper[4786]: I0313 17:03:25.046234 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-8k747"] Mar 13 17:03:25 crc kubenswrapper[4786]: I0313 17:03:25.054204 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-ff67-account-create-update-nrhwr"] Mar 13 17:03:25 crc kubenswrapper[4786]: I0313 17:03:25.062527 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-8k747"] Mar 13 17:03:25 crc kubenswrapper[4786]: I0313 17:03:25.069944 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-ff67-account-create-update-nrhwr"] Mar 13 17:03:26 crc kubenswrapper[4786]: I0313 17:03:26.564243 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="50f1da9f-ee38-45cb-bc5d-21db3eda07f4" path="/var/lib/kubelet/pods/50f1da9f-ee38-45cb-bc5d-21db3eda07f4/volumes" Mar 13 17:03:26 crc kubenswrapper[4786]: I0313 17:03:26.567011 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" path="/var/lib/kubelet/pods/65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43/volumes" Mar 13 17:03:26 crc kubenswrapper[4786]: I0313 17:03:26.568542 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f276c87-a633-4e23-b3c8-61353255c1e0" path="/var/lib/kubelet/pods/7f276c87-a633-4e23-b3c8-61353255c1e0/volumes" Mar 13 17:03:36 crc kubenswrapper[4786]: I0313 17:03:36.044026 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-l7l9f"] Mar 13 17:03:36 crc kubenswrapper[4786]: I0313 17:03:36.049880 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-l7l9f"] Mar 13 17:03:36 crc kubenswrapper[4786]: I0313 17:03:36.564965 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f0ab05-073c-4f6d-a863-d36e4fc32f25" path="/var/lib/kubelet/pods/92f0ab05-073c-4f6d-a863-d36e4fc32f25/volumes" Mar 13 17:03:37 crc kubenswrapper[4786]: I0313 17:03:37.868869 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 17:03:37 crc kubenswrapper[4786]: I0313 17:03:37.869145 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.353122 4786 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557024-7c4qv"] Mar 13 17:04:00 crc kubenswrapper[4786]: E0313 17:04:00.354812 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="gather" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.354913 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="gather" Mar 13 17:04:00 crc kubenswrapper[4786]: E0313 17:04:00.354994 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="copy" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355066 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="copy" Mar 13 17:04:00 crc kubenswrapper[4786]: E0313 17:04:00.355130 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="registry-server" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355200 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="registry-server" Mar 13 17:04:00 crc kubenswrapper[4786]: E0313 17:04:00.355314 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="extract-content" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355406 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="extract-content" Mar 13 17:04:00 crc kubenswrapper[4786]: E0313 17:04:00.355478 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="extract-utilities" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355534 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="extract-utilities" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355810 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="gather" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355897 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fe5c1c-407e-4a94-b6d0-a4ac3c22bc43" containerName="registry-server" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.355977 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b73a37c-f416-4418-bc3d-6752b4dbf7d8" containerName="copy" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.356772 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.359570 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.359794 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.359989 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.373457 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557024-7c4qv"] Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.535901 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7n6\" (UniqueName: \"kubernetes.io/projected/45837271-0d6b-4e64-bb78-e54174141d3d-kube-api-access-6z7n6\") pod \"auto-csr-approver-29557024-7c4qv\" (UID: \"45837271-0d6b-4e64-bb78-e54174141d3d\") " 
pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.638620 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7n6\" (UniqueName: \"kubernetes.io/projected/45837271-0d6b-4e64-bb78-e54174141d3d-kube-api-access-6z7n6\") pod \"auto-csr-approver-29557024-7c4qv\" (UID: \"45837271-0d6b-4e64-bb78-e54174141d3d\") " pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.660105 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7n6\" (UniqueName: \"kubernetes.io/projected/45837271-0d6b-4e64-bb78-e54174141d3d-kube-api-access-6z7n6\") pod \"auto-csr-approver-29557024-7c4qv\" (UID: \"45837271-0d6b-4e64-bb78-e54174141d3d\") " pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:00 crc kubenswrapper[4786]: I0313 17:04:00.680574 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:01 crc kubenswrapper[4786]: I0313 17:04:01.201381 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557024-7c4qv"] Mar 13 17:04:01 crc kubenswrapper[4786]: I0313 17:04:01.356456 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557024-7c4qv" event={"ID":"45837271-0d6b-4e64-bb78-e54174141d3d","Type":"ContainerStarted","Data":"27c21def48f9693f260215081ad5fea961885880cacd93fe58d0da6377a7c5ec"} Mar 13 17:04:04 crc kubenswrapper[4786]: I0313 17:04:04.393131 4786 generic.go:334] "Generic (PLEG): container finished" podID="45837271-0d6b-4e64-bb78-e54174141d3d" containerID="e89c1b5ccf15910b5ac31cf8ee56e7245c2b75bc701fbf1422fe8d1655e646b9" exitCode=0 Mar 13 17:04:04 crc kubenswrapper[4786]: I0313 17:04:04.393212 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557024-7c4qv" event={"ID":"45837271-0d6b-4e64-bb78-e54174141d3d","Type":"ContainerDied","Data":"e89c1b5ccf15910b5ac31cf8ee56e7245c2b75bc701fbf1422fe8d1655e646b9"} Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.058157 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.170061 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z7n6\" (UniqueName: \"kubernetes.io/projected/45837271-0d6b-4e64-bb78-e54174141d3d-kube-api-access-6z7n6\") pod \"45837271-0d6b-4e64-bb78-e54174141d3d\" (UID: \"45837271-0d6b-4e64-bb78-e54174141d3d\") " Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.176093 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45837271-0d6b-4e64-bb78-e54174141d3d-kube-api-access-6z7n6" (OuterVolumeSpecName: "kube-api-access-6z7n6") pod "45837271-0d6b-4e64-bb78-e54174141d3d" (UID: "45837271-0d6b-4e64-bb78-e54174141d3d"). InnerVolumeSpecName "kube-api-access-6z7n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.273657 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z7n6\" (UniqueName: \"kubernetes.io/projected/45837271-0d6b-4e64-bb78-e54174141d3d-kube-api-access-6z7n6\") on node \"crc\" DevicePath \"\"" Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.448661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557024-7c4qv" event={"ID":"45837271-0d6b-4e64-bb78-e54174141d3d","Type":"ContainerDied","Data":"27c21def48f9693f260215081ad5fea961885880cacd93fe58d0da6377a7c5ec"} Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.448719 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557024-7c4qv" Mar 13 17:04:06 crc kubenswrapper[4786]: I0313 17:04:06.448740 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c21def48f9693f260215081ad5fea961885880cacd93fe58d0da6377a7c5ec" Mar 13 17:04:07 crc kubenswrapper[4786]: I0313 17:04:07.159542 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557018-mqmh9"] Mar 13 17:04:07 crc kubenswrapper[4786]: I0313 17:04:07.175074 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557018-mqmh9"] Mar 13 17:04:07 crc kubenswrapper[4786]: I0313 17:04:07.868639 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 17:04:07 crc kubenswrapper[4786]: I0313 17:04:07.869126 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 17:04:08 crc kubenswrapper[4786]: I0313 17:04:08.569599 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1850f9bb-bcf0-48ad-84b5-b138b0e06ccf" path="/var/lib/kubelet/pods/1850f9bb-bcf0-48ad-84b5-b138b0e06ccf/volumes" Mar 13 17:04:20 crc kubenswrapper[4786]: I0313 17:04:20.468653 4786 scope.go:117] "RemoveContainer" containerID="23cbbb09890b74a306b516666d1336a6dc02c87024a2b448354040ef162893de" Mar 13 17:04:20 crc kubenswrapper[4786]: I0313 17:04:20.543235 4786 scope.go:117] "RemoveContainer" 
containerID="599af0a9e83a294ea9969f43c35e96d2c84c96e308a51de72d0249014762e4fb" Mar 13 17:04:20 crc kubenswrapper[4786]: I0313 17:04:20.573350 4786 scope.go:117] "RemoveContainer" containerID="b7d3219d584ba0552452654e9739dfc29ee21cfa32dd7c0128c4e70dae6f9f63" Mar 13 17:04:20 crc kubenswrapper[4786]: I0313 17:04:20.608980 4786 scope.go:117] "RemoveContainer" containerID="9c0c5d34adc6517b7f1e42d3fb8e68d96534bdabfedb8787e860b43a5a85da44" Mar 13 17:04:20 crc kubenswrapper[4786]: I0313 17:04:20.658542 4786 scope.go:117] "RemoveContainer" containerID="6bbf0945c8e6d155591de1ebe333f97f53e822af88059fedd5b9deb943e5ef70" Mar 13 17:04:37 crc kubenswrapper[4786]: I0313 17:04:37.869491 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 17:04:37 crc kubenswrapper[4786]: I0313 17:04:37.870203 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 17:04:37 crc kubenswrapper[4786]: I0313 17:04:37.870269 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" Mar 13 17:04:37 crc kubenswrapper[4786]: I0313 17:04:37.871470 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08dbf6d74035d8d2ccb4586df0c13c9e09d0f88484b3aef05d4459bc8e347cda"} pod="openshift-machine-config-operator/machine-config-daemon-zqb49" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 13 17:04:37 crc kubenswrapper[4786]: I0313 17:04:37.871568 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" containerID="cri-o://08dbf6d74035d8d2ccb4586df0c13c9e09d0f88484b3aef05d4459bc8e347cda" gracePeriod=600 Mar 13 17:04:38 crc kubenswrapper[4786]: I0313 17:04:38.694494 4786 generic.go:334] "Generic (PLEG): container finished" podID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerID="08dbf6d74035d8d2ccb4586df0c13c9e09d0f88484b3aef05d4459bc8e347cda" exitCode=0 Mar 13 17:04:38 crc kubenswrapper[4786]: I0313 17:04:38.694577 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerDied","Data":"08dbf6d74035d8d2ccb4586df0c13c9e09d0f88484b3aef05d4459bc8e347cda"} Mar 13 17:04:38 crc kubenswrapper[4786]: I0313 17:04:38.694964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" event={"ID":"6b929603-1f9d-4b41-9bf8-528d7fd4ad56","Type":"ContainerStarted","Data":"4da627e7ab2b81da0fa97b48559c193785e5fc2302232e2a3f980876906bb845"} Mar 13 17:04:38 crc kubenswrapper[4786]: I0313 17:04:38.695000 4786 scope.go:117] "RemoveContainer" containerID="389537d8bc114b10d01a6d9e80d713ec773cc80dd76175dd13ac5a83cb02ef2d" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.176563 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557026-986f9"] Mar 13 17:06:00 crc kubenswrapper[4786]: E0313 17:06:00.177841 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45837271-0d6b-4e64-bb78-e54174141d3d" containerName="oc" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.177881 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45837271-0d6b-4e64-bb78-e54174141d3d" containerName="oc" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.178219 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="45837271-0d6b-4e64-bb78-e54174141d3d" containerName="oc" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.179283 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.182302 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.182330 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-8s84k" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.182760 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.189177 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557026-986f9"] Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.238552 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4l4\" (UniqueName: \"kubernetes.io/projected/48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf-kube-api-access-5g4l4\") pod \"auto-csr-approver-29557026-986f9\" (UID: \"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf\") " pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.340459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4l4\" (UniqueName: \"kubernetes.io/projected/48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf-kube-api-access-5g4l4\") pod \"auto-csr-approver-29557026-986f9\" (UID: \"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf\") " 
pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.375238 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4l4\" (UniqueName: \"kubernetes.io/projected/48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf-kube-api-access-5g4l4\") pod \"auto-csr-approver-29557026-986f9\" (UID: \"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf\") " pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:00 crc kubenswrapper[4786]: I0313 17:06:00.505304 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:01 crc kubenswrapper[4786]: I0313 17:06:01.004268 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557026-986f9"] Mar 13 17:06:01 crc kubenswrapper[4786]: I0313 17:06:01.805389 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557026-986f9" event={"ID":"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf","Type":"ContainerStarted","Data":"e56be5bb88242d1b6e3d7fbdef95aedeec47430bd6c5b7b03e248490e0b03983"} Mar 13 17:06:03 crc kubenswrapper[4786]: I0313 17:06:03.828287 4786 generic.go:334] "Generic (PLEG): container finished" podID="48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf" containerID="159f194a0fc840d31ce41ccbb439650986a3af9c06db14835dff0cc931e66af1" exitCode=0 Mar 13 17:06:03 crc kubenswrapper[4786]: I0313 17:06:03.828365 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557026-986f9" event={"ID":"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf","Type":"ContainerDied","Data":"159f194a0fc840d31ce41ccbb439650986a3af9c06db14835dff0cc931e66af1"} Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.213744 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.257761 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g4l4\" (UniqueName: \"kubernetes.io/projected/48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf-kube-api-access-5g4l4\") pod \"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf\" (UID: \"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf\") " Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.265497 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf-kube-api-access-5g4l4" (OuterVolumeSpecName: "kube-api-access-5g4l4") pod "48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf" (UID: "48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf"). InnerVolumeSpecName "kube-api-access-5g4l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.360050 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g4l4\" (UniqueName: \"kubernetes.io/projected/48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf-kube-api-access-5g4l4\") on node \"crc\" DevicePath \"\"" Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.856098 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557026-986f9" event={"ID":"48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf","Type":"ContainerDied","Data":"e56be5bb88242d1b6e3d7fbdef95aedeec47430bd6c5b7b03e248490e0b03983"} Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.856170 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56be5bb88242d1b6e3d7fbdef95aedeec47430bd6c5b7b03e248490e0b03983" Mar 13 17:06:05 crc kubenswrapper[4786]: I0313 17:06:05.856179 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557026-986f9" Mar 13 17:06:06 crc kubenswrapper[4786]: I0313 17:06:06.325907 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557020-nwjjl"] Mar 13 17:06:06 crc kubenswrapper[4786]: I0313 17:06:06.343447 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557020-nwjjl"] Mar 13 17:06:06 crc kubenswrapper[4786]: I0313 17:06:06.572210 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910385c6-c8c4-48cb-942c-cd7f5a521aef" path="/var/lib/kubelet/pods/910385c6-c8c4-48cb-942c-cd7f5a521aef/volumes" Mar 13 17:07:07 crc kubenswrapper[4786]: I0313 17:07:07.869161 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 17:07:07 crc kubenswrapper[4786]: I0313 17:07:07.869785 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.896128 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbjtv"] Mar 13 17:07:17 crc kubenswrapper[4786]: E0313 17:07:17.897484 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf" containerName="oc" Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.897510 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf" containerName="oc" Mar 13 17:07:17 crc 
kubenswrapper[4786]: I0313 17:07:17.897945 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c7b2f6-e261-4b12-8bc6-1f0c1239dbaf" containerName="oc" Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.900709 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.910042 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbjtv"] Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.994347 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-utilities\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.994497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-catalog-content\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:17 crc kubenswrapper[4786]: I0313 17:07:17.994573 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx7qw\" (UniqueName: \"kubernetes.io/projected/bdef66a3-eeff-4e13-9e6f-7d3c50532447-kube-api-access-mx7qw\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.096314 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-utilities\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.096422 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-catalog-content\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.096475 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx7qw\" (UniqueName: \"kubernetes.io/projected/bdef66a3-eeff-4e13-9e6f-7d3c50532447-kube-api-access-mx7qw\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.097030 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-utilities\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.097320 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-catalog-content\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.123439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx7qw\" (UniqueName: 
\"kubernetes.io/projected/bdef66a3-eeff-4e13-9e6f-7d3c50532447-kube-api-access-mx7qw\") pod \"community-operators-mbjtv\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.266761 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:18 crc kubenswrapper[4786]: I0313 17:07:18.823509 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbjtv"] Mar 13 17:07:19 crc kubenswrapper[4786]: I0313 17:07:19.047625 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerStarted","Data":"7e25363f045b2363fca8fcec6d33698312546f78b7316f0fc144ad9d4f058c1e"} Mar 13 17:07:20 crc kubenswrapper[4786]: I0313 17:07:20.063666 4786 generic.go:334] "Generic (PLEG): container finished" podID="bdef66a3-eeff-4e13-9e6f-7d3c50532447" containerID="f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302" exitCode=0 Mar 13 17:07:20 crc kubenswrapper[4786]: I0313 17:07:20.063907 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerDied","Data":"f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302"} Mar 13 17:07:20 crc kubenswrapper[4786]: I0313 17:07:20.861176 4786 scope.go:117] "RemoveContainer" containerID="9afc8262d85694dd50c517a2687b49deb0a5491e8d2159baf3d1d8a854f57db1" Mar 13 17:07:22 crc kubenswrapper[4786]: I0313 17:07:22.086387 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerStarted","Data":"a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560"} Mar 
13 17:07:23 crc kubenswrapper[4786]: I0313 17:07:23.108923 4786 generic.go:334] "Generic (PLEG): container finished" podID="bdef66a3-eeff-4e13-9e6f-7d3c50532447" containerID="a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560" exitCode=0 Mar 13 17:07:23 crc kubenswrapper[4786]: I0313 17:07:23.109073 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerDied","Data":"a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560"} Mar 13 17:07:24 crc kubenswrapper[4786]: I0313 17:07:24.130689 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerStarted","Data":"de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6"} Mar 13 17:07:24 crc kubenswrapper[4786]: I0313 17:07:24.179947 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbjtv" podStartSLOduration=3.4539387919999998 podStartE2EDuration="7.179926733s" podCreationTimestamp="2026-03-13 17:07:17 +0000 UTC" firstStartedPulling="2026-03-13 17:07:20.070030815 +0000 UTC m=+7470.233242666" lastFinishedPulling="2026-03-13 17:07:23.796018786 +0000 UTC m=+7473.959230607" observedRunningTime="2026-03-13 17:07:24.160489021 +0000 UTC m=+7474.323700842" watchObservedRunningTime="2026-03-13 17:07:24.179926733 +0000 UTC m=+7474.343138554" Mar 13 17:07:28 crc kubenswrapper[4786]: I0313 17:07:28.267515 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:28 crc kubenswrapper[4786]: I0313 17:07:28.268134 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:28 crc kubenswrapper[4786]: I0313 17:07:28.343505 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:29 crc kubenswrapper[4786]: I0313 17:07:29.265881 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:29 crc kubenswrapper[4786]: I0313 17:07:29.334977 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbjtv"] Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.207250 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbjtv" podUID="bdef66a3-eeff-4e13-9e6f-7d3c50532447" containerName="registry-server" containerID="cri-o://de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6" gracePeriod=2 Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.757596 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.949962 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-utilities\") pod \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.950258 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx7qw\" (UniqueName: \"kubernetes.io/projected/bdef66a3-eeff-4e13-9e6f-7d3c50532447-kube-api-access-mx7qw\") pod \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.950300 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-catalog-content\") pod \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\" (UID: \"bdef66a3-eeff-4e13-9e6f-7d3c50532447\") " Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.950968 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-utilities" (OuterVolumeSpecName: "utilities") pod "bdef66a3-eeff-4e13-9e6f-7d3c50532447" (UID: "bdef66a3-eeff-4e13-9e6f-7d3c50532447"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 17:07:31 crc kubenswrapper[4786]: I0313 17:07:31.956251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdef66a3-eeff-4e13-9e6f-7d3c50532447-kube-api-access-mx7qw" (OuterVolumeSpecName: "kube-api-access-mx7qw") pod "bdef66a3-eeff-4e13-9e6f-7d3c50532447" (UID: "bdef66a3-eeff-4e13-9e6f-7d3c50532447"). InnerVolumeSpecName "kube-api-access-mx7qw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.052990 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.053036 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx7qw\" (UniqueName: \"kubernetes.io/projected/bdef66a3-eeff-4e13-9e6f-7d3c50532447-kube-api-access-mx7qw\") on node \"crc\" DevicePath \"\"" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.225114 4786 generic.go:334] "Generic (PLEG): container finished" podID="bdef66a3-eeff-4e13-9e6f-7d3c50532447" containerID="de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6" exitCode=0 Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.225162 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerDied","Data":"de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6"} Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.225195 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbjtv" event={"ID":"bdef66a3-eeff-4e13-9e6f-7d3c50532447","Type":"ContainerDied","Data":"7e25363f045b2363fca8fcec6d33698312546f78b7316f0fc144ad9d4f058c1e"} Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.225218 4786 scope.go:117] "RemoveContainer" containerID="de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.225242 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbjtv" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.263947 4786 scope.go:117] "RemoveContainer" containerID="a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.310406 4786 scope.go:117] "RemoveContainer" containerID="f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.353361 4786 scope.go:117] "RemoveContainer" containerID="de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6" Mar 13 17:07:32 crc kubenswrapper[4786]: E0313 17:07:32.353890 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6\": container with ID starting with de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6 not found: ID does not exist" containerID="de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.353942 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6"} err="failed to get container status \"de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6\": rpc error: code = NotFound desc = could not find container \"de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6\": container with ID starting with de89a0f70977146d069c644f074d817cae1204e5ea4a721a7721c7dc3155aac6 not found: ID does not exist" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.353976 4786 scope.go:117] "RemoveContainer" containerID="a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560" Mar 13 17:07:32 crc kubenswrapper[4786]: E0313 17:07:32.354456 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560\": container with ID starting with a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560 not found: ID does not exist" containerID="a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.354495 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560"} err="failed to get container status \"a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560\": rpc error: code = NotFound desc = could not find container \"a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560\": container with ID starting with a9e57afdc33a6d6ea5b7b81bd06bc610860fdf6d36d03234976fe880084ba560 not found: ID does not exist" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.354523 4786 scope.go:117] "RemoveContainer" containerID="f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302" Mar 13 17:07:32 crc kubenswrapper[4786]: E0313 17:07:32.354780 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302\": container with ID starting with f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302 not found: ID does not exist" containerID="f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.354816 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302"} err="failed to get container status \"f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302\": rpc error: code = NotFound desc = could not find container 
\"f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302\": container with ID starting with f1f5e47e5d2d85929945a9ba7cfa219167a320873c5bf0d70237e899bbf95302 not found: ID does not exist" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.594559 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdef66a3-eeff-4e13-9e6f-7d3c50532447" (UID: "bdef66a3-eeff-4e13-9e6f-7d3c50532447"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.666331 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdef66a3-eeff-4e13-9e6f-7d3c50532447-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.858224 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbjtv"] Mar 13 17:07:32 crc kubenswrapper[4786]: I0313 17:07:32.871595 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbjtv"] Mar 13 17:07:34 crc kubenswrapper[4786]: I0313 17:07:34.573684 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdef66a3-eeff-4e13-9e6f-7d3c50532447" path="/var/lib/kubelet/pods/bdef66a3-eeff-4e13-9e6f-7d3c50532447/volumes" Mar 13 17:07:37 crc kubenswrapper[4786]: I0313 17:07:37.869084 4786 patch_prober.go:28] interesting pod/machine-config-daemon-zqb49 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 17:07:37 crc kubenswrapper[4786]: I0313 17:07:37.869441 4786 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zqb49" podUID="6b929603-1f9d-4b41-9bf8-528d7fd4ad56" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
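A minimal sketch of how log captures like the one above can be mined programmatically. The recurring `prober.go:107] "Probe failed"` entries use klog structured logging (`key="value"` pairs), so a small regex is enough to tally failures per pod. The sample lines below are shortened, representative copies of entries from this capture, not additional log data; the function and regex names are illustrative.

```python
import re
from collections import Counter

# Shortened, representative copies of "Probe failed" entries from the capture.
SAMPLE = '''
Mar 13 17:04:07 crc kubenswrapper[4786]: I0313 17:04:07.869126 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 17:04:37 crc kubenswrapper[4786]: I0313 17:04:37.870203 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
Mar 13 17:07:07 crc kubenswrapper[4786]: I0313 17:07:07.869785 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqb49"
'''

# Matches the klog structured fields emitted by kubelet's prober.
FAIL_RE = re.compile(r'"Probe failed" probeType="(?P<type>\w+)" pod="(?P<pod>[^"]+)"')

def probe_failures(log_text):
    """Count probe failures, keyed by (probeType, pod)."""
    counts = Counter()
    for m in FAIL_RE.finditer(log_text):
        counts[(m.group("type"), m.group("pod"))] += 1
    return counts

if __name__ == "__main__":
    for (ptype, pod), n in probe_failures(SAMPLE).items():
        print(f"{ptype} probe for {pod} failed {n} time(s)")
```

The same pattern extends to the other structured events in this capture (e.g. `"SyncLoop (PLEG): event for pod"` or `"Killing container with a grace period"`) by swapping the regex.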